The human brain is nature's most powerful processor, so it's not surprising that developing computers that mimic it has been a long-term goal. Neural networks, artificial intelligence systems that learn by strengthening connections much the way the brain does, are the closest models we have, and now Stanford scientists have developed an organic artificial synapse that inches us closer to making computers more efficient learners.
In a biological brain, neurons send electrical signals to one another to process and store information. The signals cross small junctions between neurons called synapses, and each crossing strengthens the connection, so that later signals need less energy to get through. That strengthening of connections is how the brain learns, and the fact that processing information also stores it is what makes the brain such a lean, mean, learning machine.
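To make that idea concrete, here is a minimal sketch in Python; it is purely illustrative, not a model of the Stanford device's physics, and every name and number in it is a made-up assumption. It shows a toy synapse whose strength grows a little every time a signal passes through it, so that using the connection and storing the memory are one and the same operation.

```python
class Synapse:
    """Toy synapse: transmitting a signal also strengthens the connection,
    so processing and storage happen in the same place (hypothetical numbers)."""

    def __init__(self, strength=0.1, boost=0.05):
        self.strength = strength  # assumed starting "conductance"
        self.boost = boost        # how much each crossing strengthens it

    def transmit(self, signal):
        output = signal * self.strength
        self.strength += self.boost * signal  # use it, strengthen it
        return output

syn = Synapse()
print([round(syn.transmit(1.0), 2) for _ in range(5)])
# [0.1, 0.15, 0.2, 0.25, 0.3] -- each crossing passes more easily than the last
```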
Neural networks model this at the software level. These AI systems are great at handling huge amounts of data, and like the human brain that inspired them, the more information they're fed, the better they get at their job. Recognizing and sorting images and sounds is their main area of expertise at the moment, and these systems are driving autonomous cars, beating humanity's best Go players, creating trippy works of art and even teaching each other. The problem is that these intelligent software systems still run on traditional computer hardware, which keeps processing and memory in separate places; every piece of data has to be shuttled back and forth between them, so the systems aren't as energy efficient as they could be.
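As a rough illustration of what "modeling this in software" means, the sketch below trains a single artificial neuron in plain Python. Its weights play the role of synapse strengths and get nudged each time a training example passes through; the task (learning the logical AND function) and all the numbers are assumptions chosen for brevity, not anything from the Stanford work.

```python
# One artificial neuron learning AND: weights are the software stand-in
# for synapse strengths, adjusted a little with every example it sees.
weights = [0.0, 0.0]
bias = 0.0
lr = 0.1  # learning rate (illustrative value)

def predict(x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 if s > 0 else 0.0

examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
for _ in range(20):  # each pass over the data refines the "synapses"
    for x, target in examples:
        error = target - predict(x)
        weights = [w + lr * error * xi for w, xi in zip(weights, x)]
        bias += lr * error

print([predict(x) for x, _ in examples])  # [0.0, 0.0, 0.0, 1.0]
```

Notice that every one of those weight updates on a conventional chip means ferrying numbers between the processor and memory; doing the update inside the synapse itself is precisely the shortcut a hardware synapse like Stanford's is meant to take.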