{"id":171082,"date":"2023-09-01T16:22:59","date_gmt":"2023-09-01T21:22:59","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/09\/deepminds-chatgpt-like-brain-for-robots-lets-them-learn-from-the-internet"},"modified":"2023-09-01T16:22:59","modified_gmt":"2023-09-01T21:22:59","slug":"deepminds-chatgpt-like-brain-for-robots-lets-them-learn-from-the-internet","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/09\/deepminds-chatgpt-like-brain-for-robots-lets-them-learn-from-the-internet","title":{"rendered":"DeepMind\u2019s ChatGPT-Like Brain for Robots Lets Them Learn From the Internet"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/deepminds-chatgpt-like-brain-for-robots-lets-them-learn-from-the-internet2.jpg\"><\/a><\/p>\n<p>Examples the team gives include choosing an object to use as a hammer when there\u2019s no hammer available (the robot chooses a rock) and picking the best drink for a tired person (the robot chooses an energy drink).<\/p>\n<p>\u201cRT-2 shows improved generalization capabilities and semantic and visual understanding beyond the robotic data it was exposed to,\u201d the researchers wrote in a Google <a href=\"https:\/\/blog.google\/technology\/ai\/google-deepmind-rt2-robotics-vla-model\/\">blog post<\/a>. \u201cThis includes interpreting new commands and responding to user commands by performing rudimentary reasoning, such as reasoning about object categories or high-level descriptions.\u201d<\/p>\n<p>The dream of <a href=\"https:\/\/singularityhub.com\/2023\/05\/22\/silicon-valley-is-reviving-the-dream-of-general-purpose-humanoid-robots\/\">general-purpose robots<\/a> that can help humans with whatever may come up\u2014whether in a home, a commercial setting, or an industrial setting\u2014won\u2019t be achievable until robots can learn on the go. 
What seems like the most basic instinct to us is, for robots, a complex combination of understanding context, being able to reason through it, and taking action to solve problems no one anticipated. Programming robots in advance for every unplanned scenario is impossible, so they need to be able to generalize and learn from experience, just like humans do.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Examples the team gives include choosing an object to use as a hammer when there\u2019s no hammer available (the robot chooses a rock) and picking the best drink for a tired person (the robot chooses an energy drink). \u201cRT-2 shows improved generalization capabilities and semantic and visual understanding beyond the robotic data it was exposed [\u2026]<\/p>\n","protected":false},"author":367,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[418,6],"tags":[],"class_list":["post-171082","post","type-post","status-publish","format-standard","hentry","category-internet","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/171082","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/367"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=171082"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/171082\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=171082"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=171082"},{"taxonomy":"pos
t_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=171082"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}