Imperial researchers have found that variability between brain cells might speed up learning and improve the performance of the brain and future artificial intelligence (AI).
The new study found that by varying the electrical properties of individual cells in simulated brain networks, the networks learned faster than those built from identical cells.
They also found that the varied networks needed fewer cells to achieve the same results, making the method less energy-intensive than models built from identical cells.
The research is published in Nature Communications.
Why is a neuron like a snowflake?
The brain is made up of billions of cells called neurons, which are connected by vast ‘neural networks’ that allow us to learn about the world. Neurons are like snowflakes: they look the same from a distance but on further inspection it’s clear that no two are exactly alike.
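The idea of cell-to-cell variability can be illustrated with a toy simulation. The sketch below is not the study's actual model; the leaky-integrator equation, the time-constant range, and the population sizes are all illustrative assumptions. It simply shows how giving each simulated neuron its own time constant produces a diverse spread of responses to the same input, whereas identical neurons all respond alike.

```python
import random

def simulate(tau, input_current=1.0, dt=1.0, steps=50):
    """Euler-step a simple leaky integrator: dv/dt = (-v + I) / tau.

    tau is the neuron's membrane time constant -- the 'electrical
    property' varied per cell in this illustration (an assumption,
    not the paper's exact parameterisation).
    """
    v = 0.0
    for _ in range(steps):
        v += dt / tau * (-v + input_current)
    return v

random.seed(0)

# Homogeneous population: every neuron shares one time constant.
homogeneous = [simulate(tau=20.0) for _ in range(10)]

# Heterogeneous population: each neuron draws its own time constant,
# loosely mimicking the cell-to-cell variability described above.
heterogeneous = [simulate(tau=random.uniform(5.0, 40.0)) for _ in range(10)]

print(len(set(homogeneous)))    # identical cells give identical responses
print(len(set(heterogeneous)))  # varied cells give a spread of responses
```

A diverse set of response timescales gives a learning algorithm a richer repertoire to draw on, which is one intuition for why the varied networks in the study learned faster.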