Eyes Are the Window to a Robot’s Soul


How a robot’s eyes are developed could provide a shortcut around the uncanny valley.

Marty works at a grocery store and has a very specialized job scanning the aisles for spills and hazards. He’s one of 500 Martys at Stop & Shop and Giant Food stores across the U.S., a fleet of robot assistants that prompt questions from curious customers: Does each robot really cost $35,000? Why can’t it clean the hazards itself? And why does it have to have horrifying googly eyes on it?

In retail and the care sector and as electronic pets, robots are becoming part of everyday life, and because designers want us to feel comfortable and connected to them, the eyes are often a critical feature. But where is the line between eyes that are too creepily superficial (sorry, Marty) and eyes that are too eerily realistic? Should robots in care homes have different eyes than in-store robots? And why do robots — which are able to sense their environment using any number of methods — need eyes anyway?

“A lot of research and effort has gone into the design of eyes for robots,” says Michal Luria, a human-robot interaction design researcher at Carnegie Mellon University. “The eyes are the ‘point of entry,’ the thing users first look at when interacting with a robot, just as when we interact with other people. Eyes naturally draw our gaze.”
