In a recently published article, I reviewed over 100 years of neuroscience research to see whether some brain regions matter more than others for consciousness. What I found suggests that scientists who study consciousness may have been undervaluing the most ancient regions of the human brain.
Consciousness is usually defined by neuroscientists as the ability to have subjective experience, such as the experience of tasting an apple or of seeing the redness of its skin.
The leading theories of consciousness suggest that the outer layer of the human brain, called the cortex (in blue in figure 1), is fundamental to consciousness. The cortex is mostly composed of the neocortex, which emerged more recently in our evolutionary history.
The human subcortex (figure 1, brown/beige), which sits underneath the neocortex, has changed little over the past 500 million years. It is thought to be like electricity for a TV: necessary for consciousness, but not sufficient on its own.
There is a third part of the brain that some neuroscientific theories state is irrelevant for consciousness: the cerebellum, which is also older than the neocortex and looks like a little brain tucked in the back of the skull (figure 1, purple). Yet brain activity and brain networks are disrupted during unconsciousness (as in a coma), and these changes can be seen in the cortex, subcortex and cerebellum alike.
As part of my analysis, I looked at studies showing what happens to consciousness when brain activity is changed, for example by applying electrical currents or magnetic pulses to particular brain regions.
These experiments in humans and animals showed that altering activity in any of these three parts of the brain can change consciousness. Changing the activity of the neocortex, for example, can alter your sense of self, make you hallucinate, or affect your judgment.