No one is safe.
Have you hugged someone or told them you love them today? Maybe it wasn’t a person at all; maybe it was your smartphone, which got an extra squeeze or pat as you slipped it into your pocket. Humans have become increasingly invested in their devices, and a new era of emotional attachment to our devices and other AI seems to be upon us. But how does this work itself out on the other end: will, or could, AI ever respond to humans in an emotional fashion?
Communication Sparks Emotional Response
AI is broad, and clearly not all AI are meant to give and receive in an emotional capacity. Humans seem prone to respond to features similar to those of their own species, or to entities they can relate to in some communicative way. Most “emotional” or responsive algorithm-based capabilities have been programmed into robots that take a humanoid, or at least mammal-like, form.
Think androids in customer-service, entertainment, or companion-type roles. There are also robots like PARO, the robotic baby harbor seal used for therapeutic interaction in assisted-living and hospital environments.
In a 2003 paper published in the International Journal of Human-Computer Studies, Cynthia Breazeal cites a study by Reeves and Nass (1996) showing that humans (whether computer experts, lay people, or computer critics) generally treat computers as they might treat other people.
Breazeal goes on to state that humanoid robots (and animated software agents) are particularly relevant, as a similar morphology promotes an intuitive bond based on similar communication modes, such as facial expression, body posture, gesture, gaze direction, and voice.
An Emotional Model for AI
This in and of itself may not be a complete revelation, but getting a robot to produce such emotional responses is far more complicated. When the Hanson Robotics team programs responses, a key objective is to build robots that are expressive and lifelike, so that people can interact with them and feel comfortable with the emotional responses they receive from a robot.
In the realm of emotions, there is a difference between robot ‘responses’ and robot ‘propensities’. Stephan Vladimir Bugaj, Creative Director at Hanson Robotics, separated the two during an interview with TechEmergence. “Propensities are much more interesting and are definitely more of the direction we’re going in the immediate long-term”, he says.
“An emotional model for a robot would be more along the lines of weighted sets of possible response spaces that the robot can go into based on a stimulus and choose a means of expression within that emotional space based on a bunch of factors.” In other words, a robot with propensities would consider a set of questions, such as “What do I think of this person? How did they act in the last minute? How am I feeling today?” This is how most humans function through reason, though it happens so habitually and quickly in the subconscious that we are hardly aware of the process.
The context of an immediate stimulus would provide an emotional frame, allowing a robot to have a more complex response to each stimulus. Short-term memory would then help the robot build a longer-term emotional model. “You think of it as layers, you can think of it as interconnected networks of weighted responses…as collections of neurons, there’s a lot of different ways of looking at it, but it basically comes down to stages of filtering and considering stimuli, starting with the input filter at the perceptual level.”
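To make the idea a little more concrete, here is a minimal Python sketch of a “weighted response space” in the spirit Bugaj describes. The class name, emotion labels, weights, and stimuli below are illustrative assumptions invented for the example, not Hanson Robotics’ actual architecture: a stimulus is filtered through mood, rapport, and short-term memory, a weighted emotional space is chosen, and an expression is picked within it.

```python
import random

class EmotionalModel:
    """Hypothetical sketch: weighted emotional spaces selected per stimulus."""

    def __init__(self):
        # Longer-term mood: baseline weights over possible emotional spaces.
        self.mood = {"friendly": 0.6, "wary": 0.2, "playful": 0.2}
        # Short-term memory of recent stimuli, used to build context.
        self.recent_stimuli = []

    def perceive(self, stimulus, person_rapport):
        """First stage: a perceptual filter adjusts the weight of each
        emotional space using the stimulus, recent context, and mood."""
        self.recent_stimuli.append(stimulus)
        weights = dict(self.mood)
        if stimulus == "sudden_movement":
            weights["wary"] += 0.5
        if person_rapport > 0.7:  # "What do I think of this person?"
            weights["playful"] += 0.3
        if self.recent_stimuli[-3:].count("smile") >= 2:  # "How did they act in the last minute?"
            weights["friendly"] += 0.4
        return weights

    def respond(self, stimulus, person_rapport):
        """Second stage: pick an emotional space by weight, then choose
        a means of expression *within* that space."""
        weights = self.perceive(stimulus, person_rapport)
        spaces, w = zip(*weights.items())
        space = random.choices(spaces, weights=w)[0]
        expressions = {
            "friendly": ["smile", "nod"],
            "wary": ["lean_back", "raise_brows"],
            "playful": ["grin", "tilt_head"],
        }
        return space, random.choice(expressions[space])


model = EmotionalModel()
print(model.respond("sudden_movement", person_rapport=0.9))
```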
Similar to a human being, robots could have more than one response to a stimulus. An initial reaction or reflex might quickly give way to a more “considered response”, informed by stored and shared information in a neural-like network. Stephan describes a hypothetical scene in which a friend enters a room and begins taking swings at his or her friend. At first, the friend on the defense might react by immediately assuming a fighting stance; however, it might only take a few seconds for him or her to realize that the other person is actually just “horsing around” and being a bit of an antagonist for sport.
This string of events provides a simple way to visualize emotional stages of reaction. Perception, context, and analysis all play a part in the responses of a complex entity, including advanced robots. Robots with such potentially complex emotional models seem different from AI entities programmed simply to respond to human emotions.
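As a rough illustration of that two-stage reaction (the function names, stimuli, and memory keys below are assumptions made up for the sketch, not any real robot’s API), a fast reflex path can fire on the raw percept while a slower path reinterprets the same stimulus using stored context:

```python
def reflex(stimulus):
    """Fast path: react to the raw percept before any context is applied."""
    return "defensive_stance" if stimulus == "incoming_swing" else "idle"

def considered_response(stimulus, memory):
    """Slower path: reinterpret the same stimulus using stored context."""
    if stimulus == "incoming_swing" and memory.get("relationship") == "friend" \
            and memory.get("mood_of_other") == "playful":
        return "play_along"  # the swing was just horsing around
    return reflex(stimulus)

memory = {"relationship": "friend", "mood_of_other": "playful"}
print(reflex("incoming_swing"))                        # immediate: defensive_stance
print(considered_response("incoming_swing", memory))   # moments later: play_along
```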
The Beginnings of Responsive Robots
These AI don’t necessarily need to take a human-like form (I’m thinking of the movie Her), as long as they can communicate in a language that humans understand. In the past few years, innovators have started bringing domestic social robots such as Jibo and EmoSPARK to crowdfunding platforms like Indiegogo, meant to enhance human wellbeing through intelligent response capabilities.
Patrick Levy Rosenthal, the creator of EmoSPARK, envisioned a device that connects to the various electronic objects in our homes and can adjust their function to positively affect our emotional state. “For the last 20 years, I believe that robotics and artificial intelligence failed humans…we still see them as a bunch of silicon… we know that they don’t understand what we feel.”
Rosenthal set out to change this perception with EmoSPARK, a cube-like AI that calibrates with other objects in the user’s home, such as an MP3 player. The device, according to Rosenthal, tracks over 180 points on a person’s face, as well as the relations between those points: if you’re smiling, your lips will be stretched and your eyes narrower. The device also detects movement and voice tonality for reading emotional cues. It can then respond to those cues with spoken prompts and suggestions for improving mood, such as asking if its human user wants to hear a joke or a favorite song; it can also process and respond to spoken commands.
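For illustration only, here is a minimal sketch of the kind of landmark-based cue reading described above. The landmark names, thresholds, and two-point “relations” are assumptions made for the example; the article does not detail EmoSPARK’s actual 180-point pipeline.

```python
def mouth_width(landmarks):
    left, right = landmarks["mouth_left"], landmarks["mouth_right"]
    return right[0] - left[0]

def eye_openness(landmarks):
    top, bottom = landmarks["eye_top"], landmarks["eye_bottom"]
    return bottom[1] - top[1]

def looks_like_smile(landmarks, neutral):
    """Compare current relations between points against a neutral baseline:
    stretched lips plus slightly narrowed eyes reads as a smile."""
    wider_mouth = mouth_width(landmarks) > 1.15 * mouth_width(neutral)
    narrower_eyes = eye_openness(landmarks) < 0.9 * eye_openness(neutral)
    return wider_mouth and narrower_eyes

def suggest_action(is_smiling):
    # Respond to the cue with a spoken prompt, as the article describes.
    return "keep chatting" if is_smiling else "offer a joke or a favorite song"

# Toy landmark coordinates (x, y) standing in for tracked facial points.
neutral = {"mouth_left": (40, 60), "mouth_right": (60, 60),
           "eye_top": (45, 40), "eye_bottom": (45, 44)}
current = {"mouth_left": (37, 60), "mouth_right": (63, 59),
           "eye_top": (45, 40), "eye_bottom": (45, 43)}
print(suggest_action(looks_like_smile(current, neutral)))
```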
While robots that respond to humans’ emotionally-based states and requests may soon be available to the masses, robots that have their own emotional models – that can “laugh and cry” autonomously, so to speak – are still out of reach, for the time being.
Future is here!
The latest version of a walking, quadruped battlefield robot from Boston Dynamics, the military robotics maker owned by Google X, was tested by U.S. Marines last week.
Spot weighs about 70 kg, is electrically powered, and walks on four hydraulically actuated legs. It is controlled wirelessly by an operator who can be up to 500 meters away.
It underwent trials and testing at Marine Corps Base Quantico in Virginia as part of evaluations by the Marines on future military uses of robotic technology. In a series of missions, it was evaluated in different terrains including hills, woodlands and urban areas.
The WEpod will be the first self-driving electric shuttle to run in regular traffic, and take bookings via a dedicated app.
Making spaceships and electric supercars isn’t enough for Elon Musk. Meghan Daum meets the entrepreneur who wants to save the world.
The name sounds like a men’s cologne. Or a type of ox. It sounds possibly made up. But then, so much about Elon Musk seems the creation of a fiction writer—and not necessarily one committed to realism. At 44, Musk is both superstar entrepreneur and mad scientist. Sixteen years after cofounding a company called X.com that would, following a merger, go on to become PayPal, he’s launched the electric carmaker Tesla Motors and the aerospace manufacturer SpaceX, which are among the most closely watched—some would say obsessed-over—companies in the world. He has been compared to the Christian Grey character in the Fifty Shades of Grey movie, though not as often as he’s been called “the real Tony Stark,” referring to the playboy tech entrepreneur whose alter ego, Iron Man, rescues the universe from various manifestations of evil.
The Iron Man comparison is, strangely, as apt as it is hyperbolic, since Musk has the boyish air of a nascent superhero and says his ultimate aim is to save humanity from what he sees as its eventual and unavoidable demise—from any number of causes, carbon consumption high among them. (As it happens, he met with Robert Downey, Jr., to discuss the Tony Stark role, and his factory doubled as the villain’s hideaway in Iron Man 2.) To this end he’s building his own rockets, envisioning a future in which we colonize Mars, funding research aimed at keeping artificial intelligence beneficial to humanity, and making lithium-ion electric batteries that might, one day, put the internal-combustion engine out to pasture.
The battlefield can be one of the most useful places for robots. And now, the US Marines are testing out Spot, a robo dog built by Boston Dynamics, to see how helpful the ‘bot could be in combat.
Remember Big Dog, also from Google-owned robotics company Boston Dynamics? Well, Spot is a tinier, more agile iteration: At 160 pounds, it’s hydraulically actuated with a sensor on its noggin that aids in navigation. It’s controlled by a laptop-connected game controller, which a hidden operator can use up to 1,600 feet away. The four-legged all-terrain robo pup was revealed in February. Robots in combat aren’t new, but Spot signals a quieter, leaner alternative that hints at the strides made in this arena.
The U.S. military doesn’t just build big, scary tanks and giant warplanes; it’s also interested in teeny, tiny stuff. The Pentagon’s latest research project aims to improve today’s technologies by shrinking them down to microscopic size.
The recently launched Atoms to Product (A2P) program aims to develop atom-size materials to build state-of-the-art military and consumer products. These tiny manufacturing methods would work at scales 100,000 times smaller than those currently being used to build new technologies, according to the Defense Advanced Research Projects Agency, or DARPA.
The tiny, high-tech materials of the future could be used to build things like hummingbird-size drones and super-accurate (and super-small) atomic clocks — two projects already spearheaded by DARPA.
(Phys.org) — Wi-Fi makes all kinds of things possible. We can send and receive messages, make phone calls, browse the Internet, even play games with people who are miles away, all without the cords and wires to tie us down. At UC Santa Barbara, researchers are now using this versatile, everyday signal to do something different and powerful: looking through solid walls and seeing every square inch of what’s on the other side. Built into robots, the technology has far-reaching possibilities.
“This is an exciting time to be doing this kind of research,” said Yasamin Mostofi, professor of electrical and computer engineering at UCSB. For the past few years, she and her team have been busy realizing this X-ray vision, enabling robots to see objects and humans behind thick walls through the use of radio frequency signals. The patented technology allows users to see the space on the other side and identify not only the presence of occluded objects, but also their position and geometry, without any prior knowledge of the area. Additionally, it has the potential to classify the material type of each occluded object such as human, metal or wood.
The combination of imaging technology and automated mobility can make these robots useful in situations where human access is difficult or risky, and the ability to determine what is in a given occluded area is important, such as search and rescue operations for natural or man-made disasters.
Soldiers practically inhabiting the mechanical bodies of androids, who will take the humans’ place on the battlefield. Or sophisticated tech that spots a powerful laser ray, then stops it from obliterating its target.
If you’ve got Danger Room’s taste in movies, you’ve probably seen both ideas on the big screen. Now Darpa, the Pentagon’s far-out research arm, wants to bring ’em into the real world.
In the agency’s $2.8 billion budget for 2013, unveiled on Monday, they’ve allotted $7 million for a project titled “Avatar.” The project’s ultimate goal, not surprisingly, sounds a lot like the plot of the same-named (but much more expensive) flick.