{"id":216825,"date":"2025-06-28T21:15:56","date_gmt":"2025-06-29T02:15:56","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2025\/06\/these-two-game-changing-breakthroughs-advance-us-toward-artificial-general-intelligence"},"modified":"2025-06-28T21:15:56","modified_gmt":"2025-06-29T02:15:56","slug":"these-two-game-changing-breakthroughs-advance-us-toward-artificial-general-intelligence","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2025\/06\/these-two-game-changing-breakthroughs-advance-us-toward-artificial-general-intelligence","title":{"rendered":"These two game-changing breakthroughs advance us toward artificial general intelligence"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/these-two-game-changing-breakthroughs-advance-us-toward-artificial-general-intelligence2.jpg\"><\/a><\/p>\n<p>As humans, we rely on all sorts of stimuli to navigate in the world, including our senses: sight, sound, touch, taste, smell. Until now, AI devices have been solely reliant on a single sense\u2014visual impressions. Brand-new <a href=\"https:\/\/pratt.duke.edu\/news\/wildfusion-robot-navigation\/\" target=\"_blank\" rel=\"noreferrer noopener\">research from Duke University<\/a> goes beyond reliance only on visual perception. It\u2019s called <em>WildFusion<\/em>, combining vision with touch and vibration.<\/p>\n<p>The four-legged robot used by the research team includes microphones and tactile sensors in addition to the standard cameras commonly found in state-of-the-art robots. The <em>WildFusion<\/em> robot can use sound to assess the quality of a surface (dry leaves, wet sand) as well as pressure and resistance to calibrate its balance and stability. All of this data is gathered and combined or <em>fused<\/em>, into a single data representation that improves over time with experience. The research team plans enhance the robot\u2019s capabilities by enabling it to gauge things like heat and humidity.<\/p>\n<p>As the types of data used to interact with the environment become richer and more integrated, AI moves inexorably closer to true AGI.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>As humans, we rely on all sorts of stimuli to navigate in the world, including our senses: sight, sound, touch, taste, smell. Until now, AI devices have been solely reliant on a single sense\u2014visual impressions. Brand-new research from Duke University goes beyond reliance only on visual perception. 
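For readers who want a concrete picture of what "fusing" sensor streams can look like, here is a minimal illustrative sketch in Python. It is not the Duke team's implementation: the encoders, dimensions, and fusion scheme below are hypothetical stand-ins, chosen only to show the general pattern of encoding each modality (vision, sound, touch) into a shared representation and combining them.

```python
# A minimal, hypothetical sketch of multimodal sensor fusion, loosely inspired
# by the WildFusion idea described above. All names, dimensions, and the
# fusion scheme are illustrative assumptions, not the Duke implementation.
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 32  # size of the shared representation all modalities map into

def make_encoder(input_dim: int) -> np.ndarray:
    """Return a random linear projection standing in for a learned encoder."""
    return rng.normal(scale=1.0 / np.sqrt(input_dim), size=(input_dim, LATENT_DIM))

W_vision  = make_encoder(64)   # e.g. flattened camera features
W_audio   = make_encoder(16)   # e.g. contact-sound spectrum (dry leaves vs. wet sand)
W_tactile = make_encoder(8)    # e.g. foot pressure / resistance readings

def fuse(vision: np.ndarray, audio: np.ndarray, tactile: np.ndarray) -> np.ndarray:
    """Late fusion: encode each modality, then average into one representation."""
    z = np.stack([vision @ W_vision, audio @ W_audio, tactile @ W_tactile])
    return z.mean(axis=0)

# Simulated sensor readings for a single footstep.
fused = fuse(rng.normal(size=64), rng.normal(size=16), rng.normal(size=8))

# The fused vector could then drive a traversability estimate that a robot
# refines with experience (e.g. via a running average over many footsteps).
traversability = float(np.tanh(fused.sum()))
print(f"fused shape: {fused.shape}, traversability estimate: {traversability:+.3f}")
```

In the real system, learned models trained on field data would play the role these random projections merely gesture at; the sketch only shows why a single fused representation, rather than separate per-sensor pipelines, is the natural object for a robot to refine over time.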