The team turned AlphaEvolve loose on Google's Borg cluster management system for its data centers. The AI suggested a change to the scheduling heuristics, which Google has since put into production; it recovers about 0.7 percent of the company's computing resources worldwide. For a company the size of Google, that's a significant financial benefit.
AlphaEvolve may also be able to make generative AI more efficient, which is necessary if anyone is ever going to make money on the technology. The internal workings of generative systems rely heavily on matrix multiplication operations. The best known method for multiplying 4×4 complex-valued matrices traced back to mathematician Volker Strassen's 1969 algorithm, which does the job in 49 scalar multiplications, and that record held for decades. DeepMind says AlphaEvolve has now discovered an algorithm that needs only 48. DeepMind has worked on this problem before with narrowly trained AI agents like AlphaTensor, but AlphaTensor's improvement for 4×4 matrices applied only to binary arithmetic. Despite being a general-purpose system, AlphaEvolve came up with a better solution for complex-valued matrices than the specialized AlphaTensor.
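To see what "fewer scalar multiplications" means in practice, here is an illustrative sketch (not AlphaEvolve's new algorithm, which is more intricate) of Strassen's 1969 scheme: it multiplies two matrices split into 2×2 blocks using seven block products instead of the obvious eight, and applying it recursively to a 4×4 complex matrix costs 7 × 7 = 49 scalar multiplications, the count AlphaEvolve trims to 48.

```python
# Strassen's 1969 scheme: 7 block products instead of 8 for a 2x2 block split.
# Applied recursively to a 4x4 complex matrix, it uses 7 * 7 = 49 scalar
# multiplications -- the baseline that AlphaEvolve's new algorithm beats.
import numpy as np

def strassen_2x2_blocks(A, B, multiply):
    """Multiply two matrices split into 2x2 blocks using 7 block products."""
    n = A.shape[0] // 2
    a11, a12, a21, a22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]
    b11, b12, b21, b22 = B[:n, :n], B[:n, n:], B[n:, :n], B[n:, n:]

    m1 = multiply(a11 + a22, b11 + b22)
    m2 = multiply(a21 + a22, b11)
    m3 = multiply(a11, b12 - b22)
    m4 = multiply(a22, b21 - b11)
    m5 = multiply(a11 + a12, b22)
    m6 = multiply(a21 - a11, b11 + b12)
    m7 = multiply(a12 - a22, b21 + b22)

    return np.block([[m1 + m4 - m5 + m7, m3 + m5],
                     [m2 + m4,           m1 - m2 + m3 + m6]])

count = 0
def scalar_mul(x, y):
    """1x1 base case: one scalar multiplication."""
    global count
    count += 1
    return x * y

def strassen_4x4(A, B):
    # Recurse once: 7 block products of 2x2 matrices, each computed with
    # Strassen again (7 scalar multiplications apiece), for 49 in total.
    return strassen_2x2_blocks(
        A, B, lambda X, Y: strassen_2x2_blocks(X, Y, scalar_mul))

A = np.random.rand(4, 4) + 1j * np.random.rand(4, 4)
B = np.random.rand(4, 4) + 1j * np.random.rand(4, 4)
assert np.allclose(strassen_4x4(A, B), A @ B)
print(count)  # 49 scalar multiplications
```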
Google’s next-generation Tensor processing hardware will also benefit from AlphaEvolve. DeepMind reports that the AI proposed a change to the Verilog code describing the chip's hardware, dropping unnecessary bits from an arithmetic circuit to increase efficiency. Google is still working to verify the change but expects it to make its way into an upcoming processor.
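DeepMind hasn't published the Verilog itself, but the idea of "dropping unnecessary bits" can be illustrated with a deliberately simplified, hypothetical example: if an arithmetic block's inputs are known to fit in 8 bits, its product never needs more than 16 bits, so any wider register bits are provably always zero and can be removed without changing the circuit's output.

```python
# Hypothetical illustration (not the actual TPU circuit): trimming a result
# register that is wider than the arithmetic ever requires.
from itertools import product

def multiply_wide(a, b, width=32):
    # Multiplier modeled with a generously wide 32-bit result register.
    return (a * b) & ((1 << width) - 1)

def multiply_trimmed(a, b, width=16):
    # Same multiplier with the result register trimmed to 16 bits --
    # enough to hold any product of two 8-bit operands.
    return (a * b) & ((1 << width) - 1)

# Exhaustively confirm the trimmed version is equivalent for all 8-bit inputs.
assert all(multiply_wide(a, b) == multiply_trimmed(a, b)
           for a, b in product(range(256), repeat=2))
print("16-bit result register is sufficient for 8-bit operands")
```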