{"id":140643,"date":"2022-06-15T15:40:14","date_gmt":"2022-06-15T20:40:14","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2022\/06\/lamda-and-the-sentient-ai-trap"},"modified":"2022-06-15T15:40:14","modified_gmt":"2022-06-15T20:40:14","slug":"lamda-and-the-sentient-ai-trap","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2022\/06\/lamda-and-the-sentient-ai-trap","title":{"rendered":"LaMDA and the Sentient AI Trap"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/lamda-and-the-sentient-ai-trap3.jpg\"><\/a><\/p>\n<p>\u201cQuite a large gap exists between the current narrative of AI and what it can actually do,\u201d says Giada Pistilli, an ethicist at Hugging Face, a startup focused on language models. \u201cThis narrative provokes fear, amazement, and excitement simultaneously, but it is mainly based on lies to sell products and take advantage of the hype.\u201d<\/p>\n<p>The consequence of speculation about sentient AI, she says, is an increased willingness to make claims based on subjective impression instead of scientific rigor and proof. It distracts from \u201ccountless ethical and social justice questions\u201d that AI systems pose. While every researcher has the freedom to research what they want, she says, \u201cI just fear that focusing on this subject makes us forget what is happening while looking at the moon.\u201d<\/p>\n<p>What Lemoine experienced is an example of what author and futurist David Brin has called the \u201crobot empathy crisis.\u201d At an AI conference in San Francisco in 2017, Brin predicted that in three to five years, people would claim AI systems were sentient and insist that they had rights. 
Back then, he thought those appeals would come from a virtual agent that took the appearance of a woman or child to maximize human empathic response, not \u201csome guy at Google,\u201d he says.<\/p>\n<hr>\n<p>Arguments over whether Google\u2019s large language model has a soul distract from the real-world problems that plague artificial intelligence.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>\u201cQuite a large gap exists between the current narrative of AI and what it can actually do,\u201d says Giada Pistilli, an ethicist at Hugging Face, a startup focused on language models. \u201cThis narrative provokes fear, amazement, and excitement simultaneously, but it is mainly based on lies to sell products and take advantage of the hype.\u201d [\u2026]<\/p>\n","protected":false},"author":672,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6,8],"tags":[],"class_list":["post-140643","post","type-post","status-publish","format-standard","hentry","category-robotics-ai","category-space"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/140643","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/672"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=140643"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/140643\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=140643"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=140643"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/l
ifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=140643"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}