May 23, 2024

No, Today’s AI Isn’t Sentient. Here’s How We Know

Posted in categories: food, mathematics, robotics/AI, space

All sensations—hunger, feeling pain, seeing red, falling in love—are the result of physiological states that an LLM simply doesn’t have. Consequently, we know that an LLM cannot have subjective experiences of those states. In other words, it cannot be sentient.

An LLM is a mathematical model running on silicon chips. It is not an embodied being, as humans are. It does not have a “life” in which it needs to eat, drink, reproduce, experience emotion, get sick, and eventually die.

It is important to understand the profound difference between how humans generate sequences of words and how an LLM generates those same sequences. When I say “I am hungry,” I am reporting on my sensed physiological states. When an LLM generates the sequence “I am hungry,” it is simply generating the most probable completion of the sequence of words in its current prompt. It is doing exactly the same thing as when, with a different prompt, it generates “I am not hungry,” or with yet another prompt, “The moon is made of green cheese.” None of these are reports of its (nonexistent) physiological states. They are simply probabilistic completions.
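The distinction above can be made concrete with a toy sketch. The snippet below is not a real LLM; it is a minimal bigram model (an assumption made for illustration, with a hypothetical `complete` function and a tiny made-up corpus) that, given a prompt, appends whichever word most often followed the previous word in its training data. When it emits “i am hungry,” that string reflects nothing but corpus statistics—there is no physiological state being reported.

```python
from collections import Counter, defaultdict

# Tiny made-up training corpus (illustrative only).
corpus = [
    "i am hungry",
    "i am hungry",
    "i am not hungry",
    "the moon is made of green cheese",
]

# Count how often each word follows each preceding word.
counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1

def complete(prompt, max_new=3):
    """Greedily append the most probable next word after the last word."""
    words = prompt.split()
    for _ in range(max_new):
        followers = counts.get(words[-1])
        if not followers:
            break  # no observed continuation; stop generating
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

print(complete("i"))  # → "i am hungry"
```

Here “hungry” wins only because it followed “am” more often than “not” did in the corpus; change the counts and the same code would just as readily print “i am not hungry.” A real LLM is vastly larger and conditions on whole contexts rather than single words, but the output is likewise a probabilistic completion, not a report of an internal state.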
