Jabba the Hutt inspires human-style eyes for robots
In humanoid robots, artificial eyes are often referred to as doll eyes because they are made of glass or acrylic. This is a problem because their pupils do not respond the way living human eyes do. Pupils are important because they give off visual cues that we subconsciously interpret as emotion and understanding.
For many scientists working in robotics, reproducing human features is an important part of their work. To that end, my research is the first to create a robotic eye that responds to both light and emotion using artificial muscle. This should help robots interact with humans, who tend to be more comfortable with robotic features that mirror their own. More lifelike robots allow humans to interact with technology more naturally.
My work was inspired by meeting John Coppinger, one of the engineers behind Jabba the Hutt in the 1983 film Star Wars: Return of the Jedi. Coppinger designed Jabba's dilating eyes, and we talked about the difficulty of making something similar that responds like a human eye, given the complexity of the mechanisms involved.
How it works
It took more than four years of laboratory experiments to find the right materials to make robotic eyes function like human eyes. You can see the results in the lead image. To reproduce the soft tissue of the human iris, we 3D printed a coloured gelatine iris using a digital map of the human eye. Unlike glass and acrylic, gelatine is natural, highly flexible and holds an image well. In the middle of each iris is a hole: the pupil. One connects to a camera so the robot can see the world, and the other to a photosensor to measure light.
To make the pupils dilate and contract, as they do when people are happy or scared, we made an artificial muscle from a stretched silicone membrane coated on both sides with graphene. When a voltage is applied, the graphene electrodes squeeze the silicone membrane together, producing a contracting effect. Graphene is so thin and conductive that a single coating still allows light to pass through, just like a human iris.
The artificial muscle is activated by generating positive and negative static charge on either side of the membrane, which tenses and relaxes the muscle at high and low voltage. Imagine pressing something between your hands and then releasing it: the surface area increases and decreases with the amount of pressure you apply. To help create the human-like eyes, we used flexible 3D-printed material to hold the artificial muscles and sensors in place.
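The squeezing effect described above is the electrostatic ("Maxwell") pressure that drives dielectric elastomer actuators in general. As a rough sketch of the physics, the pressure grows with the square of the applied voltage and falls with the square of the membrane thickness. The material values below are illustrative assumptions, not the actual parameters of the robotic eye:

```python
# Sketch of the electrostatic (Maxwell) pressure on a dielectric
# elastomer membrane squeezed between two electrodes:
#   p = eps0 * eps_r * (V / d)^2
# eps_r here is an assumed value for silicone, for illustration only.

EPS_0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 2.8         # assumed relative permittivity of silicone

def maxwell_pressure(voltage_v: float, thickness_m: float) -> float:
    """Compressive pressure (Pa) squeezing the membrane between
    its two graphene electrodes."""
    return EPS_0 * EPS_R * (voltage_v / thickness_m) ** 2

# A higher voltage across a thinner membrane squeezes it harder,
# which makes it expand in area -- the "muscle" action.
print(f"{maxwell_pressure(3000, 50e-6):.0f} Pa")  # 3 kV across 50 micrometres
```

Doubling the voltage quadruples the pressure, which is why these actuators are driven at high voltage and why switching between high and low voltage tenses and relaxes the membrane.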
Another feature of the robotic eye is that it can respond to both light and emotion at the same time. This is essential for accurately imitating the abilities of a human eye, but it had not been possible in robotic eyes before.
A microprocessor switches the robot's eyes between emotional and light modes so that the eyes can respond just like human eyes: in humans, our pupils dilate in darker places and in response to pleasure, and contract in brighter conditions and when we are unhappy.
When the robot is interacting with someone, a camera uses machine-learning software to estimate their emotional state from their face. The system assigns the robot a corresponding emotion, such as joy or sadness, and sends a signal to the pupils to dilate or contract accordingly. Similarly, in light mode, the robot's pupils dilate in darkness and shrink in brighter conditions.
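The two-mode behaviour above can be sketched as a small controller. The mode names, the millimetre ranges and the emotion-to-diameter mapping are assumptions for illustration; the project's actual control code is on GitHub:

```python
# Minimal sketch of a two-mode pupil controller, assuming a
# human-like pupil range of roughly 2-8 mm. The emotion labels and
# target diameters are hypothetical, chosen only to show the idea.

def pupil_diameter(mode: str, *, emotion: str = "neutral",
                   light_level: float = 0.5) -> float:
    """Return a target pupil diameter in millimetres."""
    if mode == "emotion":
        # Pleasant emotions dilate the pupil; unpleasant ones contract it.
        targets = {"joy": 7.0, "neutral": 4.5, "sadness": 3.0}
        return targets.get(emotion, 4.5)
    if mode == "light":
        # Dilate in darkness (light_level = 0.0),
        # contract in bright light (light_level = 1.0).
        clamped = min(max(light_level, 0.0), 1.0)
        return 8.0 - 6.0 * clamped
    raise ValueError(f"unknown mode: {mode}")

print(pupil_diameter("emotion", emotion="joy"))   # 7.0
print(pupil_diameter("light", light_level=1.0))   # 2.0
```

In practice the microprocessor would drive the artificial muscle's voltage toward whichever target diameter the active mode requests.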
Why robots need human eyes
The advantage of creating more lifelike robots is that they allow humans to interact with technology more naturally. This is important because, for some people, a human-like interface is more comfortable and can improve how they engage with robots.
This should make it easier for robots to interact socially with humans, which could be useful for people living alone. Over time, robots will hopefully provide them with additional support and companionship. To learn more about how robotic eyes need to respond, I ran experiments in which people watched different videos and then looked at artificial light at different levels of brightness. An eye-tracking headset recorded their pupil dilation against the light levels, and this data was used to calibrate the robot's eyes.
In the final experiment, the robotic eyes were fitted inside a realistic humanoid robot and compared with a robot with conventional acrylic eyes. Participants were then surveyed to measure perceived emotion and attentiveness. Those who saw the robot's pupils dilating rated it as showing more emotion and attention.
These results show the benefit of these robotic eyes in helping humans engage with robots more naturally. This is important because, otherwise, humanoid robots can appear uncanny. By replicating subtle movements and gestures, we improve how well people understand and relate to robots. This includes things like lip synchronisation, speech tonality and facial expressions.
One day we may have robots that look so much like humans that they are almost indistinguishable from us, even when we look into their eyes.
The code, schematics and video footage of the robotic eyes are now available on GitHub for any engineer to build on the design and use it in their own projects.
Article by Carl Strathearn, Researcher, Computing, Edinburgh Napier University
This article is republished from The Conversation under a Creative Commons license. Read the original article.