Artificial emotion: is giving robots feelings a good idea?


Robots are being built for all kinds of reasons, from serving as stand-in astronauts on the International Space Station to becoming the friendliest retail assistants around. Because many will work alongside us and communicate with us, they need to understand, and perhaps even possess, human-like qualities.

But for robots to truly understand us, interact with us, and get us to buy things from them, do they need to have emotions too?

Robots that can understand emotions, and even feel for themselves, have become a popular trope within science fiction – Data exploring his inability to feel emotions in Star Trek: The Next Generation, Ava manipulating human emotions in Ex Machina, and Samantha the AI software in Her breaking a man’s heart after she loves and leaves him.

We may still be a long way from creating robots with a nuanced grasp of human emotion, but it’s not that difficult to imagine how it could be possible. 

After all, more and more of us are forming an emotional connection to Alexa and Siri, even though the capabilities of such AIs are currently limited to simple interactions, like telling us what the weather's like and switching our lights on and off.

Which raises the question: how much deeper could that emotional connection go with a robot that not only looks like a human but feels like a human too?
