![](https://lifeboat.com/blog.images/if-we-want-artificial-superintelligence-it-may-need-to-feel-pain.jpg)
“It might be that to get superhuman intelligence, you do need some level of sentience. We can’t rule that out either; it’s entirely possible. Some people argue that that kind of real intelligence requires sentience and that sentience requires embodiment. Now, there is a view in philosophy, called computational functionalism, that [argues] sentience, sapience, and selfhood could just be the computations they perform rather than the body they’re situated in. And if that view is correct, then it’s entirely possible that by recreating the computations the brain performs in AI systems, we also thereby recreate the sentience as well.”
Birch is saying three things here. First, it’s reasonable to suggest that “superintelligence” requires sentience. Second, we could potentially recreate sentience in AI through certain computations. Therefore, if we want AI to reach “superintelligence,” we may need it to be sentient. We would need AI to feel things. ChatGPT would need to know pain. Gemini would need to experience euphoria.
The fact underlying Birch’s book and our conversation is that intelligence is not some deus ex machina dropped from the sky. It is not some curious alien artifact uncovered in a long-lost tomb. It’s nested within an unfathomably long evolutionary chain. It’s the latest word in a long sentence. But the question Birch raises is: Where does AI fit in the book of evolved intelligence?