As humans, we rely on multiple senses to navigate the world: sight, sound, touch, taste, and smell. Until now, AI-driven robots have relied largely on a single sense, visual perception. Brand-new research from Duke University, called WildFusion, goes beyond vision alone by combining it with touch and vibration.
The four-legged robot used by the research team includes microphones and tactile sensors in addition to the standard cameras commonly found in state-of-the-art robots. The WildFusion robot uses sound to assess the character of a surface (dry leaves, wet sand) and pressure and resistance readings to calibrate its balance and stability. All of this data is gathered and combined, or fused, into a single representation that improves over time with experience. The research team plans to enhance the robot's capabilities by enabling it to gauge things like heat and humidity.
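To make the idea of fusing several sensing modalities into one representation concrete, here is a minimal sketch in PyTorch. It is an illustration of the general concept only, not the Duke team's actual WildFusion architecture: the module names, feature dimensions, and simple concatenation-based fusion are all assumptions.

```python
# Illustrative sketch only: fusing per-modality sensor features (camera,
# contact-microphone audio, tactile pressure) into a single embedding.
# Encoders, sizes, and concatenation fusion are assumptions for illustration,
# not the WildFusion model described by the Duke researchers.
import torch
import torch.nn as nn


class SimpleFusion(nn.Module):
    def __init__(self, vision_dim=512, audio_dim=128, touch_dim=32, fused_dim=256):
        super().__init__()
        # One small encoder per sensing modality.
        self.vision_enc = nn.Sequential(nn.Linear(vision_dim, 256), nn.ReLU())
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, 64), nn.ReLU())
        self.touch_enc = nn.Sequential(nn.Linear(touch_dim, 32), nn.ReLU())
        # Fuse the per-modality features into one shared representation.
        self.fuse = nn.Linear(256 + 64 + 32, fused_dim)

    def forward(self, vision, audio, touch):
        feats = torch.cat(
            [self.vision_enc(vision), self.audio_enc(audio), self.touch_enc(touch)],
            dim=-1,
        )
        return self.fuse(feats)


# Example: fuse one batch of pre-extracted sensor features.
model = SimpleFusion()
fused = model(torch.randn(1, 512), torch.randn(1, 128), torch.randn(1, 32))
print(fused.shape)  # torch.Size([1, 256])
```

In a real system, a fused representation like this would feed downstream estimates of footing and traversability and would be refined as the robot gathers more experience.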
As the types of data used to interact with the environment become richer and more integrated, AI moves inexorably closer to true AGI.