
In Neuromorphic Computing Part 2, we dive deeper into mapping neuromorphic concepts onto chips built from silicon. Given the current state of modern neuroscience and chip design, the tools the industry is working with are simply too different from biology. Mike Davies, Senior Principal Engineer and Director of Intel’s Neuromorphic Computing Lab, explains the process and challenges of creating a chip that can replicate some of the form and function of biological neural networks.

Mike’s leadership in this specialized field allows him to share the latest insights into the promising future of neuromorphic computing here at Intel. Let’s explore how nature’s circuit design, refined over a billion years of evolution, meets today’s CMOS semiconductor manufacturing technology to support incredible computing efficiency, speed and intelligence.

Architecture All Access Season 2 is a master class technology series featuring Senior Intel Technical Leaders taking an educational approach to explaining the historical impact and future innovations in their technical domains. Here at Intel, our mission is to create world-changing technology that improves the life of every person on earth. If you would like to learn more about AI, Wi-Fi, Ethernet and Neuromorphic Computing, subscribe and hit the bell to get instant notifications of new episodes.


Computer design has always been inspired by biology, especially the brain. In this episode of Architecture All Access, Mike Davies, Senior Principal Engineer and Director of Intel’s Neuromorphic Computing Lab, explains how neuromorphic computing and an understanding of the principles of brain computation at the circuit level are enabling next-generation intelligent devices and autonomous systems.

Discover the history and influence of the secrets nature has evolved over a billion years to support incredible computing efficiency, speed and intelligence.



Computer simulations of complex systems provide an opportunity to study their time evolution under user control. Simulations of neural circuits are an established tool in computational neuroscience. Through systematic simplification on spatial and temporal scales, they provide important insights into the time evolution of networks, which in turn leads to an improved understanding of brain functions such as learning, memory or behavior. Simulations of large networks exploit the concept of weak scaling, where the massively parallel biological network structure is naturally mapped onto computers with very large numbers of compute nodes. However, this approach suffers from fundamental limitations. Power consumption is approaching prohibitive levels and, more seriously, the bridging of time-scales from milliseconds to years, present in the neurobiology of plasticity, learning and development, is inaccessible to classical computers. In the keynote I will argue that these limitations can be overcome by extreme approaches to weak and strong scaling based on brain-inspired computing architectures.
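To make the abstract’s scaling argument concrete, here is a minimal Python sketch (my illustration, not code from the keynote) of the kind of fixed-time-step neural-circuit simulation it describes; the network size and all parameter values are assumptions chosen for readability.

```python
import numpy as np

# A small leaky integrate-and-fire (LIF) network advanced at a fixed time
# step, the style of simulation the abstract refers to. Values illustrative.
rng = np.random.default_rng(0)

n = 1000                                # neurons
dt = 1e-4                               # time step: 0.1 ms of biological time
tau = 0.02                              # membrane time constant: 20 ms
v_thresh, v_reset = 1.0, 0.0
w = rng.normal(0.0, 0.05, size=(n, n))  # random recurrent weights

v = np.zeros(n)                         # membrane potentials
spikes = np.zeros(n)                    # spike indicators from previous step

for step in range(int(0.1 / dt)):       # 0.1 s of biological time
    i_ext = rng.normal(0.03, 0.02, size=n)        # noisy external drive
    v += (dt / tau) * (-v) + i_ext + w @ spikes   # leak + external + recurrent
    spikes = (v >= v_thresh).astype(float)        # threshold crossings
    v[spikes > 0] = v_reset                       # reset fired neurons

# The time-scale problem in one line: a single simulated year at this dt
# requires ~3.15e11 sequential steps, however many compute nodes share them.
print(f"steps per simulated year: {365 * 24 * 3600 / dt:.2e}")
```

The last line is the abstract’s argument in miniature: weak scaling spreads the neurons across more nodes, but the sequential step count, and with it the wall-clock gap to the time-scales of plasticity and development, does not shrink.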

Bio: Karlheinz Meier received his PhD in physics in 1984 from Hamburg University in Germany. He has more than 25 years of experience in experimental particle physics, with contributions to four major experiments at particle colliders at DESY in Hamburg and CERN in Geneva. For the ATLAS experiment at the Large Hadron Collider (LHC), he led a 15-year effort to design, build and operate an electronic data-processing system providing on-the-fly data reduction by three orders of magnitude, enabling, among other achievements, the discovery of the Higgs boson. Following scientific staff positions at DESY and CERN, he was appointed full professor of physics at Heidelberg University in 1992. In Heidelberg he co-founded the Kirchhoff-Institute for Physics and a laboratory for the development of microelectronic circuits for science experiments. In particle physics he took a leading international role in shaping the future of the field as president of the European Committee for Future Accelerators (ECFA). Around 2005 he gradually shifted his scientific interests towards large-scale electronic implementations of brain-inspired computer architectures. His group pioneered several innovations in the field, such as the conception of a description language for neural circuits (PyNN), time-compressed mixed-signal neuromorphic computing systems, and wafer-scale integration for their implementation. He led two major European initiatives, FACETS and BrainScaleS, both of which demonstrated the rewarding interdisciplinary collaboration of neuroscience and information science. In 2009 he was one of the initiators of the European Human Brain Project (HBP), which was approved in 2013. In the HBP he leads the subproject on neuromorphic computing, with the goal of establishing brain-inspired computing paradigms as tools for neuroscience and generic methods for inference from large data volumes.

Does the Moon’s crust have more water than previously thought? That is the question behind a recent study published in Nature Astronomy, in which an international team of researchers investigated how the mineral apatite, found within a lunar meteorite, provides greater insight into how the Moon’s early crust, formed billions of years ago, could have held more water than scientists previously hypothesized. The study has the potential not only to help scientists better understand lunar history, but also to provide a gateway to unlocking lunar water for future astronaut missions.

“The discovery of apatite in the Moon’s early crust for the first time is incredibly exciting – as we can finally start to piece together this unknown stage of lunar history,” said Dr. Tara Hayden, who is a postdoctoral associate at Western University and lead author of the study. “We find the Moon’s early crust was richer in water than we expected, and its volatile stable isotopes reveal an even more complex history than we knew before.”

“The missing piece is AI,” he says.

AI has also shown promise in getting robots to respond to verbal commands and in helping them adapt to the often messy environments of the real world. For example, Google’s RT-2 system combines a vision-language-action model with a robot, allowing the robot to “see” and analyze the world and to respond to verbal instructions that make it move. And a new system from DeepMind called AutoRT uses a similar vision-language model to help robots adapt to unseen environments, together with a large language model that comes up with instructions for a fleet of robots.
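As a rough schematic of the pattern just described, rather than actual RT-2 or AutoRT code (whose internals are not public APIs), the sketch below shows the closed loop in which a vision-language-action model turns a camera image plus a verbal command into a low-level motion; `VLAModel`, `Robot` and `Action` are hypothetical stand-ins.

```python
from dataclasses import dataclass

@dataclass
class Action:
    dx: float       # end-effector displacement, metres (hypothetical encoding)
    dy: float
    dz: float
    gripper: float  # 0.0 = open, 1.0 = closed

class VLAModel:
    """Hypothetical stand-in for an RT-2-style vision-language-action model."""
    def predict(self, image, instruction: str) -> Action:
        # A real model encodes the image and instruction together and decodes
        # an action; this stub simply returns a no-op so the sketch runs.
        return Action(0.0, 0.0, 0.0, 0.0)

class Robot:
    """Hypothetical robot interface: a camera in, motion commands out."""
    def get_camera_image(self):
        return None  # stand-in for a camera frame

    def apply(self, action: Action) -> None:
        pass         # stand-in for sending the motion command to the hardware

def control_loop(robot: Robot, model: VLAModel, instruction: str,
                 steps: int = 50) -> None:
    # The closed perception-action loop: see, decide, move, repeat.
    for _ in range(steps):
        image = robot.get_camera_image()            # "see" the world
        action = model.predict(image, instruction)  # language-conditioned action
        robot.apply(action)                         # make the robot move

control_loop(Robot(), VLAModel(), "pick up the sponge")
```

An AutoRT-style fleet layer would sit one level above this loop, with a large language model generating a different `instruction` string for each robot.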

And now for the bad news: even the most cutting-edge robots still cannot do laundry. It’s a chore that is significantly harder for robots than for humans, because crumpled clothes form odd shapes that robots struggle to perceive and handle.

A newly invented fuel cell taps into the naturally present and ubiquitous microbes in soil to generate power.

This soil-powered device, about the size of a regular paperback book, offers a viable alternative to batteries in underground sensors used for precision agriculture.

Northwestern University highlighted the fuel cell’s durability, showcasing its ability to withstand a range of environmental conditions, from arid soil to flood-prone areas.