{"id":17722,"date":"2015-09-23T06:14:59","date_gmt":"2015-09-23T13:14:59","guid":{"rendered":"http:\/\/lifeboat.com\/blog\/?p=17722"},"modified":"2015-09-23T06:14:59","modified_gmt":"2015-09-23T13:14:59","slug":"the-emotional-era-of-artificial-intelligence","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2015\/09\/the-emotional-era-of-artificial-intelligence","title":{"rendered":"The Emotional Era of Artificial Intelligence"},"content":{"rendered":"<p>Have you hugged or told someone that you love them today? Maybe it wasn\u2019t someone \u2014 maybe it was your smartphone that you gave an extra squeeze or pat as you slipped it into your pocket. Humans have become increasingly invested in their devices, and a new era of emotional attachment to our devices and other AI seems to be upon us. But how does this work itself out on the other end \u2014 will or could AI ever respond to humans in an emotional fashion?<\/p>\n<h2>Communication Sparks Emotional Response<\/h2>\n<p>AI is broad, and clearly not all AI are meant to give and receive in an emotional capacity. Humans seem prone to respond to features similar to those of their own species, or to entities they can relate to in some communicative way. Most \u201cemotional\u201d or responsive algorithm-based capabilities have been programmed into robots that take a humanoid \u2013 or at least a mammal-like \u2013 form.<\/p>\n<p>Think androids in <a href=\"http:\/\/blogs.wsj.com\/japanrealtime\/2015\/04\/16\/toshiba-humanoid-robot-to-debut-in-tokyo-department-store\/\">customer-service<\/a>, entertainment, or companion-type roles. 
There are also robots like <a href=\"http:\/\/www.parorobots.com\/\">PARO<\/a>, the baby harbor seal used for therapeutic interaction with those in assisted living and hospital environments.<\/p>\n<p>In a <a href=\"http:\/\/robotic.media.mit.edu\/wp-content\/uploads\/sites\/14\/2015\/01\/Breazeal-IJHCS-03.pdf\">2003 paper<\/a> published in the International Journal of Human-Computer Studies, Cynthia Breazeal cites research by Reeves and Nass (1996) showing that humans (whether computer experts, lay people, or computer critics) generally treat computers as they might treat other people.<\/p>\n<p>Breazeal goes on to state that humanoid robots (and animated software agents) are particularly relevant, as a similar morphology promotes an intuitive bond based on shared communication modes, such as facial expression, body posture, gesture, gaze direction, and voice.<\/p>\n<h2>An Emotional Model for AI<\/h2>\n<p>This in and of itself may not be a complete revelation, but how you get a robot to produce such emotional responses is far more complicated. When the <a href=\"http:\/\/www.hansonrobotics.com\/\">Hanson Robotics\u2019<\/a> team programs responses, a key objective is to build robots that are expressive and lifelike, so that people can interact with them and feel comfortable with the emotional responses they receive.<\/p>\n<p>In the realm of emotions, there is a difference between robot \u2018responses\u2019 and robot \u2018propensities\u2019. Stephan Vladimir Bugaj, Creative Director at Hanson Robotics, separated the two during <a href=\"http:\/\/techemergence.com\/how-humans-do-and-will-relate-to-robots-with-stephan-vladimir-bugaj\/\">an interview with TechEmergence<\/a>. 
\u201cPropensities are much more interesting and are definitely more of the direction we\u2019re going in the immediate long-term\u201d, he says.<\/p>\n<p>\u201cAn emotional model for a robot would be more along the lines of weighted sets of possible response spaces that the robot can go into based on a stimulus and choose a means of expression within that emotional space based on a bunch of factors.\u201d In other words, a robot with propensities would consider a set of questions, such as \u201cWhat do I think of the person? How did it act in the last minute? How am I feeling today?\u201d This is how most humans reason, though it happens so habitually and quickly in the subconscious that we are hardly aware of the process.<\/p>\n<p>The context of an immediate stimulus would provide an emotional frame, allowing a robot to have a more complex response to each stimulus. The use of short-term memory would help the robot build a longer-term emotional model. \u201cYou think of it as layers, you can think of it as interconnected networks of weighted responses\u2026as collections of neurons, there\u2019s a lot of different ways of looking at it, but it basically comes down to stages of filtering and considering stimuli, starting with the input filter at the perceptual level.\u201d<\/p>\n<p>Similar to a human being, robots could have more than one response to a stimulus. An initial reaction or reflex might quickly give way to a more \u201cconsidered response\u201d, caused by stored and shared information in a neural-like network. Stephan describes a hypothetical scene in which a friend enters a room and begins taking swings at his or her friend. 
At first, the friend who is on the defense might react by immediately assuming a fighting stance; however, it might only take a few seconds for him or her to realize that the other person is actually just \u201chorsing around\u201d and being a bit of an antagonist for sport.<\/p>\n<p>This string of events provides a simple way to visualize emotional stages of reaction. Perception, context, and analysis all play a part in the responses of a complex entity, including advanced robots. Robots with such potentially complex emotional models seem different from AI entities programmed simply to respond to human emotions.<\/p>\n<h2>The Beginnings of Responsive Robots<\/h2>\n<p>These AI don\u2019t necessarily need to take a human-like form (I\u2019m thinking of the movie <em>Her<\/em>), as long as they can communicate in a language that humans understand. In the past few years, innovators have started to hit the Indiegogo market with domestic social robots such as <a href=\"https:\/\/www.jibo.com\/\">Jibo<\/a> and <a href=\"http:\/\/emospark.com\/\">EmoSPARK<\/a>, meant to enhance human wellbeing through intelligent response capabilities.<\/p>\n<p>Patrick Levy Rosenthal, creator of EmoSPARK, envisioned a device that connects to the various electronic objects in our homes, able to adjust their function to positively affect our emotional state. \u201cFor the last 20 years, I believe that robotics and artificial intelligence failed humans\u2026we still see them as a bunch of silicon\u2026 we know that they don\u2019t understand what we feel.\u201d<\/p>\n<p>Rosenthal set out to change this perception with EmoSPARK, a cube-like AI that calibrates with other objects in the user\u2019s home, such as an MP3 player. The device, according to Rosenthal, tracks over 180 points on a person\u2019s face, as well as the relations between those points \u2013 if you\u2019re smiling, your lips will be stretched and your eyes narrower. 
The device also detects movement and voice tonality for reading emotional cues. It can then respond to those cues with spoken prompts and suggestions for improving mood \u2013 for example, asking if its human user needs to hear a joke or a favorite song; it can also respond to and process spoken commands.<\/p>\n<p>While robots that respond to humans\u2019 emotional states and requests may soon be available to the masses, robots that have their own emotional models \u2013 that can \u201claugh and cry\u201d autonomously, so to speak \u2013 are still out of reach for the time being.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Have you hugged or told someone that you love them today? Maybe it wasn\u2019t someone \u2014 maybe it was your smartphone that you gave an extra squeeze or gave an extra pat as you slipped it into your pocket. Humans have become increasingly invested in their devices, and a new era of emotional attachment to [\u2026]<\/p>\n","protected":false},"author":274,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1522,6],"tags":[],"class_list":["post-17722","post","type-post","status-publish","format-standard","hentry","category-innovation","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/17722","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/274"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=17722"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/17722\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent
=17722"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=17722"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=17722"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}