{"id":219055,"date":"2025-07-30T12:12:33","date_gmt":"2025-07-30T17:12:33","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2025\/07\/robot-know-thyself-new-vision-based-system-teaches-machines-to-understand-their-bodies"},"modified":"2025-07-30T12:12:33","modified_gmt":"2025-07-30T17:12:33","slug":"robot-know-thyself-new-vision-based-system-teaches-machines-to-understand-their-bodies","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2025\/07\/robot-know-thyself-new-vision-based-system-teaches-machines-to-understand-their-bodies","title":{"rendered":"Robot, know thyself: New vision-based system teaches machines to understand their bodies"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/robot-know-thyself-new-vision-based-system-teaches-machines-to-understand-their-bodies.jpg\"><\/a><\/p>\n<p>In an office at MIT\u2019s Computer Science and Artificial Intelligence Laboratory (CSAIL), a soft robotic hand carefully curls its fingers to grasp a small object. The intriguing part isn\u2019t the mechanical design or embedded sensors\u2014in fact, the hand contains none. Instead, the entire system relies on a single camera that watches the robot\u2019s movements and uses that visual data to control it.<\/p>\n<p>This capability comes from a new system CSAIL scientists developed, offering a different perspective on robotic control. Rather than using hand-designed models or complex sensor arrays, it allows robots to learn how their bodies respond to control commands, solely through vision. 
The approach, called Neural Jacobian Fields (NJF), gives robots a kind of bodily self-awareness.<\/p>\n<p>A <a href=\"https:\/\/www.nature.com\/articles\/s41586-025-09170-0\" target=\"_blank\">paper about the work<\/a> was published in Nature.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In an office at MIT\u2019s Computer Science and Artificial Intelligence Laboratory (CSAIL), a soft robotic hand carefully curls its fingers to grasp a small object. The intriguing part isn\u2019t the mechanical design or embedded sensors\u2014in fact, the hand contains none. Instead, the entire system relies on a single camera that watches the robot\u2019s movements and [\u2026]<\/p>\n","protected":false},"author":396,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-219055","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/219055","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/396"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=219055"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/219055\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=219055"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=219055"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=219055"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{r
el}","templated":true}]}}
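At its core, a Jacobian describes how small control commands map to observed motion. The sketch below is a toy illustration of that one idea, not the authors' method (NJF learns a dense neural field of such sensitivities from raw video): it estimates a single linear command-to-motion Jacobian from simulated camera observations via least squares. All names and numbers here are invented for illustration.

```python
import numpy as np

# Toy setup: a hidden "body" maps 3 control commands to 2D motion of a
# tracked visual feature, dx ≈ J @ u. A camera would observe (u, dx) pairs.
rng = np.random.default_rng(0)
J_true = rng.normal(size=(2, 3))  # unknown true sensitivity of motion to commands

# Simulated observations: random commands and the noisy motions they produce.
U = rng.normal(size=(100, 3))                         # commands issued
X = U @ J_true.T + 0.01 * rng.normal(size=(100, 2))   # motions the camera sees

# Recover the Jacobian by least squares: find J minimizing ||U @ J.T - X||.
B, *_ = np.linalg.lstsq(U, X, rcond=None)
J_est = B.T

print("max abs error in estimated Jacobian:", np.abs(J_est - J_true).max())
```

With the Jacobian in hand, control runs the other way: to produce a desired motion, solve for the command that best achieves it (e.g. via the pseudoinverse, `np.linalg.pinv(J_est) @ dx_desired`).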