EPFL researchers have developed an algorithm to train an analog neural network just as accurately as a digital one, enabling the development of more efficient alternatives to power-hungry deep learning hardware.
With their ability to process vast amounts of data through algorithmic ‘learning’ rather than traditional programming, it often seems like the potential of deep neural networks like ChatGPT is limitless. But as the scope and impact of these systems have grown, so have their size, complexity, and energy consumption, the last of which is significant enough to raise concerns about their contribution to global carbon emissions.
While we often think of technological advancement in terms of shifting from analog to digital, researchers are now looking for answers to this problem in physical alternatives to digital deep neural networks. One such researcher is Romain Fleury of EPFL’s Laboratory of Wave Engineering in the School of Engineering.