
Jul 28, 2024

Breakthrough CRAM technology ditches von Neumann model, makes AI 1,000x more energy efficient

Posted by in categories: particle physics, robotics/AI

Futurology: The global demand for AI computing has data centers consuming electricity like frat houses chug beer. But researchers at the University of Minnesota may have a solution to curb AI’s growing thirst for power: a radical new device that promises vastly superior energy efficiency.

The researchers have designed a new “computational random-access memory” (CRAM) prototype chip that could reduce energy needs for AI applications by a mind-boggling 1,000 times or more compared to current methods. In one simulation, the CRAM tech showed an incredible 2,500x energy savings.

Traditional computing relies on the decades-old von Neumann architecture of separate processor and memory units, which requires constantly moving data back and forth in an energy-intensive process. The Minnesota team’s CRAM completely upends that model by performing computations directly within the memory itself using spintronic devices called magnetic tunnel junctions (MTJs).
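To see why cutting out the memory-to-processor shuttle matters so much, here is a minimal back-of-the-envelope sketch in Python. The per-operation energy figures are hypothetical placeholders chosen only to illustrate the accounting, not measurements from the Minnesota CRAM work; the point is simply that when data movement dominates the energy bill, removing it shifts the total by orders of magnitude.

```python
# Toy energy model contrasting a von Neumann pipeline with compute-in-memory.
# All per-operation energy figures are illustrative assumptions (hypothetical,
# order-of-magnitude only), not numbers reported by the CRAM researchers.

ENERGY_DRAM_TRANSFER_PJ = 100.0   # moving one operand between memory and processor
ENERGY_ALU_OP_PJ = 1.0            # performing one arithmetic operation in the processor
ENERGY_IN_MEMORY_OP_PJ = 1.0      # performing the same operation inside the memory array


def von_neumann_energy(num_ops: int) -> float:
    """Energy when every operand pair is fetched from memory,
    processed in the ALU, and the result written back."""
    transfers_per_op = 3  # two operand reads + one result write
    return num_ops * (transfers_per_op * ENERGY_DRAM_TRANSFER_PJ + ENERGY_ALU_OP_PJ)


def in_memory_energy(num_ops: int) -> float:
    """Energy when the operation happens where the data already lives,
    so the memory-to-processor transfers disappear."""
    return num_ops * ENERGY_IN_MEMORY_OP_PJ


if __name__ == "__main__":
    ops = 1_000_000  # e.g. multiply-accumulates in one inference step
    vn = von_neumann_energy(ops)
    im = in_memory_energy(ops)
    print(f"von Neumann : {vn / 1e6:.1f} microjoules")
    print(f"in-memory   : {im / 1e6:.1f} microjoules")
    print(f"ratio       : {vn / im:.0f}x less energy for compute-in-memory")
```

With these placeholder numbers the in-memory path comes out roughly 300x cheaper; the actual savings the team reports depend on the specific workload and on the physics of the MTJ devices themselves.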
