Could computers ever learn more like humans do, without relying on artificial intelligence (AI) systems that must undergo extremely expensive training?
Neuromorphic computing might be the answer. This emerging technology features brain-inspired computer hardware that could perform AI tasks far more efficiently than conventional systems, requiring fewer training computations and consuming much less power. Consequently, neuromorphic computers also have the potential to reduce reliance on energy-intensive data centers and bring AI inference and learning to mobile devices.
Dr. Joseph S. Friedman, associate professor of electrical and computer engineering at The University of Texas at Dallas, and his team of researchers in the NeuroSpinCompute Laboratory have taken an important step toward building a neuromorphic computer by creating a small-scale prototype that learns patterns and makes predictions using fewer training computations than conventional AI systems. Their next challenge is to scale up the proof of concept.
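To give a sense of why brain-inspired learning can need far fewer training computations than conventional AI, the sketch below uses a classic Hopfield-style associative memory with a Hebbian learning rule: each connection weight is set in a single local pass over the data, rather than through many iterations of gradient descent. This is a generic textbook illustration of the principle, not the UT Dallas team's hardware or algorithm.

```python
import numpy as np

def train_hebbian(patterns):
    """One-shot Hebbian learning: each weight is updated once per
    stored pattern (W += outer(p, p)), with no iterative training loop.
    Illustrative only; not the NeuroSpinCompute Laboratory's method."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # neurons have no self-connections
    return W

def recall(W, cue, steps=10):
    """Predict the stored pattern by letting the network settle
    from a noisy cue, thresholding each neuron's input to +/-1."""
    s = cue.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s

# Store one 8-neuron pattern of +1/-1 activity in a single pass.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train_hebbian(pattern[None, :])

# Corrupt two bits of the pattern, then recover it from memory.
cue = pattern.copy()
cue[:2] *= -1
print(np.array_equal(recall(W, cue), pattern))  # → True
```

Because every weight update is local (it depends only on the two neurons the connection joins), this style of learning maps naturally onto hardware in which synapse-like devices adjust themselves in place, which is the efficiency argument behind neuromorphic designs.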