{"id":140607,"date":"2022-06-14T17:03:37","date_gmt":"2022-06-14T22:03:37","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2022\/06\/google-engineer-believes-chatbot-ai-is-sentient"},"modified":"2022-06-14T17:03:37","modified_gmt":"2022-06-14T22:03:37","slug":"google-engineer-believes-chatbot-ai-is-sentient","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2022\/06\/google-engineer-believes-chatbot-ai-is-sentient","title":{"rendered":"Google engineer believes chatbot AI is sentient"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/google-engineer-believes-chatbot-ai-is-sentient.jpg\"><\/a><\/p>\n<p>One such competitor, LaMDA, is the work of Google\u2019s AI division. Built on Transformer \u2013 the company\u2019s open-source neural network architecture \u2013 it can produce non-generic, open-ended dialogue after training on 1.56 trillion words of public dialogue data and web text. By contrast, a typical chatbot is dependent on topic-specific datasets and has a limited conversation flow. LaMDA has 137 billion parameters, which can be thought of as the individual \u201csynapses\u201d combining to form the AI.<\/p>\n<p>The sheer scale and complexity of models like LaMDA are leading some experts to ask profound questions about the nature of AI. In February, the Chief Scientist and Co-Founder of OpenAI, one of the leading research labs for artificial intelligence, claimed that the latest generation of neural networks is now large enough to be \u201c<a href=\"https:\/\/www.futuretimeline.net\/blog\/2022\/02\/17-ai-may-be-slightly-conscious.htm\" target=\"_blank\">slightly conscious<\/a>\u201d.<\/p>\n<p>This month, another expert in machine learning has spoken out. 
Blake Lemoine, Senior Software Engineer at Google, believes that a form of self-awareness might be starting to emerge from the billions of connected parameters.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>One such competitor, LaMDA, is the work of Google\u2019s AI division. Built on Transformer \u2013 the company\u2019s open-source neural network architecture \u2013 it can produce non-generic, open-ended dialogue after training on 1.56 trillion words of public dialogue data and web text. By contrast, a typical chatbot is dependent on topic-specific datasets and has a limited [\u2026]<\/p>\n","protected":false},"author":566,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-140607","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/140607","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/566"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=140607"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/140607\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=140607"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=140607"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=140607"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}