Eyes Are the Window to a Robot’s Soul
September 2019, OneZero by Medium
How a robot’s eyes are developed could provide a shortcut around the uncanny valley.
Marty works at a grocery store, where he has a very specialized job: scanning the aisles for spills and hazards. He’s one of 500 Martys at Stop & Shop and Giant Food stores across the U.S., a fleet of robot assistants that prompt questions from curious customers: Does each robot really cost $35,000? Why can’t it clean up the hazards itself? And why does it have to have horrifying googly eyes?
In retail, in the care sector, and as electronic pets, robots are becoming part of everyday life, and because designers want us to feel comfortable with and connected to them, the eyes are often a critical feature. But where is the line between eyes that are too creepily superficial (sorry, Marty) and eyes that are too eerily realistic? Should robots in care homes have different eyes than robots in stores? And why do robots — which can sense their environment in any number of ways — need eyes at all?
“A lot of research and effort has gone into the design of eyes for robots,” says Michal Luria, a human-robot interaction design researcher at Carnegie Mellon University. “The eyes are the ‘point of entry,’ the thing users first look at when interacting with a robot, just as when we interact with other people. Eyes naturally draw our gaze.”