{"id":189907,"date":"2024-05-23T07:23:27","date_gmt":"2024-05-23T12:23:27","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2024\/05\/no-todays-ai-isnt-sentient-heres-how-we-know"},"modified":"2024-05-23T07:23:27","modified_gmt":"2024-05-23T12:23:27","slug":"no-todays-ai-isnt-sentient-heres-how-we-know","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2024\/05\/no-todays-ai-isnt-sentient-heres-how-we-know","title":{"rendered":"No, Today\u2019s AI Isn\u2019t Sentient. Here\u2019s How We Know"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/no-todays-ai-isnt-sentient-heres-how-we-know2.jpg\"><\/a><\/p>\n<p>All sensations\u2014hunger, feeling pain, seeing red, falling in love\u2014are the result of physiological states that an LLM simply doesn\u2019t have. Consequently we know that an LLM cannot have <i>subjective experiences<\/i> of those states. In other words, it cannot be sentient.<\/p>\n<p>An LLM is a mathematical model coded on silicon chips. It is not an embodied being like humans. It does not have a \u201clife\u201d that needs to eat, drink, reproduce, experience emotion, get sick, and eventually die.<\/p>\n<p>It is important to understand the profound difference between how humans generate sequences of words and how an LLM generates those same sequences. When I say \u201cI am hungry,\u201d I am reporting on my sensed physiological states. When an LLM generates the sequence \u201cI am hungry,\u201d it is simply generating the most probable completion of the sequence of words in its current prompt. It is doing exactly the same thing as when, with a different prompt, it generates \u201cI am <i>not<\/i> hungry,\u201d or with yet another prompt, \u201cThe moon is made of green cheese.\u201d None of these are reports of its (nonexistent) physiological states. They are simply probabilistic completions.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>All sensations\u2014hunger, feeling pain, seeing red, falling in love\u2014are the result of physiological states that an LLM simply doesn\u2019t have. Consequently we know that an LLM cannot have subjective experiences of those states. In other words, it cannot be sentient. An LLM is a mathematical model coded on silicon chips. 