
In this episode, we welcome Prof. Dr.-Ing. Maurits Ortmanns, a leading expert in ASIC design and professor at the University of Ulm, Germany. With a distinguished career in microelectronics, Dr. Ortmanns has contributed extensively to the development of integrated circuits for biomedical applications. He shares insights into the critical role of ASIC (Application-Specific Integrated Circuit) design in advancing neurotech implants, focusing on low-power, high-speed circuits that are essential for optimizing the performance and reliability of these devices. Dr. Ortmanns also discusses the challenges and future of circuit integration in neurotechnology.

Top 3 Takeaways:

“Each ASIC is very low in cost because the development cost is spread across millions of units. The actual production cost is minimal; the primary expense lies in the development time until the first chips are produced and ready for manufacturing.”

“For an inexperienced engineer, it typically takes about six months to a year to design the blueprint for the chip. Then, depending on the manufacturer, it takes an additional four to six months for the actual fabrication of the ASIC. Finally, you would need another one to two months for testing, so the total turnaround time for a small chip is approximately one and a half years.”

“Let’s take the example of a neuromodulator. You need recordings or data from neurons and stimulation data going to the neurons, so you essentially have these two components. Then, you encounter challenges like stimulation artifacts. One person might focus on eliminating the stimulation artifact in the recording channel. That requires additional algorithms or hardware, and the data needs to be digitized, which is another task. You may also have someone working on a compression algorithm and building digital circuitry to compress the raw input data. Then, there’s the data interface, power management, and wireless energy delivery. Each person works on their specific innovation, and if everything is well-planned and lucky, all these pieces can come together to create a complete system. However, sometimes you simply don’t have a breakthrough idea for power management or communication.”

0:45 Do you want to introduce yourself better than I just did?

3:15 What is integrated circuit design?

7:30 What are ASICs? How are they used in neurotech?

10:15 How does the million-dollar fab cost get split across each chip?
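As a back-of-the-envelope illustration of the amortization Dr. Ortmanns describes (all figures below are invented assumptions, not numbers from the episode), the per-chip cost is the one-time development cost divided by the production volume, plus a small marginal cost per die:

```python
# Back-of-the-envelope ASIC cost amortization. All figures are
# invented assumptions, not numbers from the episode.

nre_cost = 2_000_000    # one-time development + mask costs, USD (assumed)
marginal_cost = 0.50    # per-die production cost in volume, USD (assumed)

for volume in (10_000, 100_000, 1_000_000):
    per_chip = nre_cost / volume + marginal_cost
    print(f"{volume:>9,} units -> ${per_chip:,.2f} per chip")
```

At a million units, the development cost all but vanishes into the unit price, which is the economics behind the first takeaway.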

Quantum computers have the potential to solve certain problems far more efficiently than classical computers. In a recent development, researchers have designed a quantum algorithm to simulate systems of coupled masses and springs, known as coupled oscillators. These systems are fundamental in modeling a wide range of physical phenomena, from molecules to mechanical structures like bridges.

To simulate these systems, the researchers first translated the behavior of the coupled oscillators into a form of the Schrödinger equation, which describes how the quantum state of a system evolves over time. They then used advanced Hamiltonian simulation techniques to model the system on a quantum computer.
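One standard construction of this kind (a sketch; the authors' exact encoding may differ in detail): write the oscillator network with unit masses as ẍ = −Ax for a positive-semidefinite coupling matrix A, factor A = BB†, and stack velocities and positions into one normalized state vector:

```latex
% Classical dynamics: \ddot{\vec{x}} = -A\vec{x}, with A = BB^\dagger.
\[
  |\psi(t)\rangle \propto
  \begin{pmatrix} \dot{\vec{x}}(t) \\ i\,B^\dagger \vec{x}(t) \end{pmatrix},
  \qquad
  i\,\partial_t\,|\psi(t)\rangle = H\,|\psi(t)\rangle,
  \qquad
  H = -\begin{pmatrix} 0 & B \\ B^\dagger & 0 \end{pmatrix}.
\]
% H is Hermitian, so the classical motion evolves under a genuine
% Schrödinger equation and is amenable to Hamiltonian simulation.
```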

Hamiltonian methods provide a framework for understanding how physical systems evolve, connecting principles of classical mechanics with those of quantum mechanics. By leveraging these techniques, the researchers were able to represent the dynamics of N coupled oscillators using only about log(N) quantum bits (qubits), a significant reduction compared to the resources required by classical simulations.
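To see where the log(N) comes from: amplitude encoding stores a length-2N vector (N velocities plus N position terms) in the amplitudes of roughly log2(2N) qubits. A minimal sketch, with made-up state values:

```python
import numpy as np

# Amplitude-encode the state of N coupled oscillators (velocities and
# position terms stacked) into the amplitudes of ~log2(2N) qubits.
# The state values here are made up purely for illustration.

N = 8
rng = np.random.default_rng(0)
v = rng.standard_normal(N)              # velocities (made up)
x = rng.standard_normal(N)              # position terms (made up)

state = np.concatenate([v, x]).astype(complex)
state /= np.linalg.norm(state)          # quantum states are unit vectors

n_qubits = int(np.ceil(np.log2(len(state))))
print(f"{N} oscillators -> vector of length {len(state)} -> {n_qubits} qubits")
```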

A trio of AI researchers at Google DeepMind, working with a colleague from the University of Toronto, report that the AI algorithm Dreamer can learn to self-improve by mastering Minecraft in a short amount of time. In their study published in the journal Nature, Danijar Hafner, Jurgis Pasukonis, Timothy Lillicrap and Jimmy Ba programmed the AI app to play Minecraft without being trained and to achieve an expert level in just nine days.

Over the past several years, computer scientists have learned a lot about how machine learning can be used to train AI applications to conduct seemingly intelligent activities such as answering questions. Researchers have also found that AI apps can be trained to play games and perform better than humans. That research has extended into machines playing against other machines, which may seem redundant, because what could you get from a computer playing another computer?

In this new study, the researchers found that this line of work can produce advances, such as helping an AI app learn to improve its abilities over a short period of time, which could give robots the tools they need to perform well in the real world.
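Dreamer itself learns a world model and trains an actor-critic inside imagined rollouts, which is far beyond a snippet. As a minimal sketch of the underlying idea, an agent improving purely through trial and error with no human training data, here is tabular Q-learning on a toy chain world (illustrative only, not DeepMind's code):

```python
import numpy as np

# Toy illustration of trial-and-error self-improvement (NOT DeepMind's
# Dreamer; just the core reinforcement-learning idea). Environment: a
# 1-D chain of 10 states with a reward only at the right end.

n_states, n_actions = 10, 2             # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))     # value table, learned from scratch
alpha, gamma, eps = 0.1, 0.95, 0.1      # learning rate, discount, exploration

rng = np.random.default_rng(0)
for episode in range(500):
    s = 0
    for _ in range(50):
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update: better estimates from experience alone.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next
        if r > 0:
            break                       # episode ends at the goal

# The terminal state is never updated, so report the non-terminal policy.
print("Learned policy (1 = move right):", Q[:-1].argmax(axis=1))
```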

A team of physicists has discovered a new approach that redefines the conception of black holes by mapping out their detailed structure, as shown in a research study recently published in the Journal of High Energy Physics.

The study details new theoretical structures called “supermazes” that offer the field of theoretical physics a more universal picture of black hole microstructure. Based in string theory, supermazes are pivotal to understanding the structure of black holes on a microscopic level.

“General relativity is a powerful theory for describing the large-scale structure of black holes, but it is a very, very blunt instrument for describing black-hole microstructure,” said Nicholas Warner, co-author of the study and professor of physics, astronomy and mathematics at the USC Dornsife College of Letters, Arts and Sciences. In a framework of theories extending beyond Einstein’s equations, supermazes provide a detailed portrait of the microscopic structure of brane black holes.

A centuries-old geometric puzzle dating back to the 17th century has finally been solved: Monash University researchers have extended Descartes’ Circle Theorem by finding a general equation for any number of tangent circles, using advanced mathematical tools inspired by physics.
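For reference, the classical four-circle case that the Monash result generalizes: if four mutually tangent circles have signed curvatures k_i = ±1/r_i, Descartes’ Circle Theorem states that

```latex
% Descartes' Circle Theorem: four mutually tangent circles with signed
% curvatures k_i = \pm 1/r_i satisfy
\[
  (k_1 + k_2 + k_3 + k_4)^2 = 2\,(k_1^2 + k_2^2 + k_3^2 + k_4^2),
\]
% so given three of the curvatures, the fourth follows from a quadratic:
\[
  k_4 = k_1 + k_2 + k_3 \pm 2\sqrt{k_1 k_2 + k_2 k_3 + k_3 k_1}.
\]
```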

Big data has gotten too big. Now, a research team with statisticians from Cornell has developed a data representation method inspired by quantum mechanics that handles large data sets more efficiently than traditional methods by simplifying them and filtering out noise.

This method could spur innovation in data-rich but statistically intimidating fields, such as epigenetics, where traditional data methods have thus far proved insufficient.

The paper is published in the journal Scientific Reports.
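The paper's exact construction isn't spelled out here, but quantum-inspired data representations of this general flavor treat a dataset's trace-normalized covariance like a density matrix and keep only its dominant eigenmodes, discarding low-weight directions as noise. A generic sketch of that idea (an assumption for illustration, not the Cornell team's algorithm):

```python
import numpy as np

# Generic "quantum-inspired" denoising sketch (an illustration, not the
# Cornell team's algorithm): treat the trace-normalized covariance of
# the data like a density matrix and keep its dominant eigenmodes.

rng = np.random.default_rng(1)
n_samples, n_features, true_rank = 500, 50, 3
signal = rng.standard_normal((n_samples, true_rank)) @ \
         rng.standard_normal((true_rank, n_features))
data = signal + 0.1 * rng.standard_normal((n_samples, n_features))

cov = np.cov(data, rowvar=False)
rho = cov / np.trace(cov)                  # trace-1, like a density matrix
evals, evecs = np.linalg.eigh(rho)         # eigenmodes, ascending eigenvalues

keep = evals > 0.01                        # drop low-weight (noisy) modes
P = evecs[:, keep] @ evecs[:, keep].T      # projector onto kept modes
denoised = (data - data.mean(0)) @ P + data.mean(0)

print(f"kept {keep.sum()} of {n_features} modes")   # expect 3
```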

A new large language model framework teaches LLMs to solve complex, multistep planning tasks by formulating them for an optimization solver. With the LLMFP framework, someone can input a natural-language description of their problem and receive a plan for reaching their desired goal.
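Hypothetically, a pipeline with that shape looks like the sketch below; the prompt, the call_llm stub, and the toy scheduling problem are all invented for illustration and are not LLMFP's actual interfaces:

```python
from scipy.optimize import linprog

# Hypothetical sketch of an "LLM formalizes, solver plans" pipeline.
# The prompt, the call_llm stub, and the toy problem are invented for
# illustration; they are not LLMFP's actual interfaces.

def call_llm(prompt: str) -> dict:
    """Stand-in for a real LLM call that would translate the user's
    natural-language request into an optimization problem."""
    # Pretend the model returned this formalization for: "Split 40 hours
    # between writing and experiments; experiments need at least 25
    # hours; writing is worth twice as much per hour."
    return {
        "objective": [-2.0, -1.0],          # maximize 2*writing + experiments
        "A_ub": [[1.0, 1.0]],               # writing + experiments <= 40
        "b_ub": [40.0],
        "bounds": [(0, None), (25, None)],  # experiments >= 25 hours
    }

spec = call_llm("Plan my 40-hour week ...")
result = linprog(spec["objective"], A_ub=spec["A_ub"],
                 b_ub=spec["b_ub"], bounds=spec["bounds"])
writing, experiments = result.x
print(f"Plan: {writing:.0f}h writing, {experiments:.0f}h experiments")
```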