
Visit https://brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.

Thanks to Mike Henry and everyone at Mythic for the analog computing tour! https://www.mythic-ai.com/
Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. https://the-analog-thing.org.
Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
Welch Labs’ ALVINN video: https://www.youtube.com/watch?v=H0igiP6Hg1k.

▀▀▀
References:
Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. – https://ve42.co/Crevier1993
Valiant, L. (2013). Probably Approximately Correct. HarperCollins. – https://ve42.co/Valiant2013
Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65, 386–408. – https://ve42.co/Rosenblatt1958
NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. – https://ve42.co/NYT1958
Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. – https://ve42.co/Mason1958
Alvinn driving NavLab footage – https://ve42.co/NavLab.
Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, 1, 305–313. – https://ve42.co/Pomerleau1989
ImageNet website – https://ve42.co/ImageNet.
Russakovsky, O., Deng, J. et al. (2015). ImageNet Large Scale Visual Recognition Challenge. – https://ve42.co/ImageNetChallenge.
AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, 25, 1097–1105. – https://ve42.co/AlexNet.
Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. – https://ve42.co/Karpathy2014
Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. – https://ve42.co/MythicBlog.
Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1–69. – https://ve42.co/Jin2019
Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. – https://ve42.co/Demler2018
Aspinity (2021). Blog post: 5 Myths About AnalogML. – https://ve42.co/Aspinity.
Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549–555. – https://ve42.co/Wright2022
Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144–147. – https://ve42.co/Waldrop2016

▀▀▀

▶ Check out Brilliant with this link to receive a 20% discount! https://brilliant.org/NewMind/

The millennia-old idea of expressing signals and data as a series of discrete states ignited a revolution in the semiconductor industry during the second half of the 20th century. This new information age thrived on the robust and rapidly evolving field of digital electronics, where abundant automation and tooling made it relatively manageable to scale designs in complexity and performance as demand grew. However, the power consumed by AI and machine-learning applications cannot feasibly continue to grow as it has on existing processing architectures.

THE MAC
In a digital neural-network implementation, the weights and input data are stored in system memory and must be fetched and stored continuously throughout the sea of multiply-accumulate operations within the network. As a result, most of the power is dissipated in moving model parameters and input data to and from the arithmetic logic unit of the CPU, where the actual multiply-accumulate operation takes place. A typical multiply-accumulate operation within a general-purpose CPU consumes more than two orders of magnitude more energy in this data movement than in the computation itself.
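The arithmetic itself is trivially cheap; the cost lies in hauling every weight and input to it. A minimal sketch of the multiply-accumulate (MAC) chain at the heart of a neural-network layer, in plain Python with no hardware modeling:

```python
def mac_dot(weights, inputs):
    """Multiply-accumulate chain: the core operation of a neural-network layer.

    In a digital implementation, each weight and each input must first be
    fetched from memory -- and that data movement, not the cheap multiply
    and add below, dominates the energy budget.
    """
    acc = 0.0
    for w, x in zip(weights, inputs):
        acc += w * x  # one multiply-accumulate (MAC) per weight
    return acc

# A neuron's pre-activation is just one MAC chain over its inputs:
print(mac_dot([0.5, -1.0, 2.0], [1.0, 2.0, 3.0]))  # 0.5 - 2.0 + 6.0 = 4.5
```

Analog approaches like Mythic's sidestep this bottleneck by performing the multiply and the accumulate in place, inside the memory array that already holds the weights.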

GPUS


During the Middle Ages, the concept of the perpetual motion machine developed. The first law of thermodynamics, known as the Law of Conservation of Energy, prohibits the existence of such a machine, since energy can be neither created nor destroyed within an isolated system.

MAXWELL’S DEMON

In 1867, James Clerk Maxwell, the Scottish pioneer of electromagnetism, conceived of a thermodynamic thought experiment that exhibited a key characteristic of a thermal perpetual motion machine. In it, a tiny being guards a door between two gas chambers, letting only fast molecules pass into one chamber and only slow molecules into the other. Because faster molecules are hotter, the being’s actions cause one chamber to warm up and the other to cool down, seemingly reversing the action of a heat engine without adding energy.
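The demon’s sorting can be illustrated with a toy Monte Carlo sketch. This is a deliberate simplification: it ignores the door mechanics and the information cost of measuring each molecule, and simply partitions one gas into fast and slow halves to show that one side warms while the other cools with no energy added:

```python
import random

def demon_sort(n=10000, seed=0):
    """Toy Maxwell's demon: split one gas at uniform temperature into a
    fast chamber A and a slow chamber B, then compare average kinetic
    energies (a stand-in for temperature)."""
    rng = random.Random(seed)
    molecules = [rng.gauss(0.0, 1.0) for _ in range(2 * n)]  # 1-D velocities
    median_speed = sorted(abs(v) for v in molecules)[n]
    a = [v for v in molecules if abs(v) >= median_speed]  # fast -> chamber A
    b = [v for v in molecules if abs(v) < median_speed]   # slow -> chamber B
    mean_ke = lambda vs: sum(v * v for v in vs) / (2 * len(vs))
    return mean_ke(a), mean_ke(b), mean_ke(molecules)

hot, cold, total = demon_sort()
print(hot > total > cold)  # True: A warmed, B cooled, total energy unchanged
```

Resolving why this does not actually violate the second law (the demon must record and erase information about each molecule, which costs entropy) is the heart of the paradox.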

As the poet Dylan Thomas once explained, it is “the force that through the green fuse drives the flower.”

Organic photochemistry brings life to Earth, allowing plants to “eat” sunlight. Harnessing that same power of light to make new molecules, from fuels to pharmaceuticals, in the lab instead of the leaf is one of the grand challenges of photochemical research.

What is old is new again. Sometimes gaining new insight requires a return to old tools, with a modern twist. Now, a collaborative team from the National Renewable Energy Laboratory (NREL) and Princeton University has resurrected a century-old microwave technique to reveal a surprising feature of well-established light-driven chemistry.

Researchers from the Max Planck Institute for Polymer Research have developed a drug that disrupts the adaptability of cancer cells!

Abstract: In Situ Assembly of Platinum(II)-Metallopeptide Nanostructures Disrupts Energy Homeostasis and Cellular Metabolism.

https://pubs.acs.org/doi/10.1021/jacs.2c03215

Max Planck Institute for Polymer Research Press Release: https://www.mpip-mainz.mpg.de/en/press/pr-2022-09


The mainframe, the hardware stalwart that has existed for decades, continues to be a force in the modern era.

Among the vendors that still build mainframes is IBM, which today announced the latest iteration of its Linux-focused mainframe system, dubbed the LinuxONE Emperor 4. IBM has been building LinuxONE systems since 2015, when the first Emperor mainframe made its debut, and has updated the platform on a roughly two-year cadence.



A grim future awaits the United States if it loses the competition with China on developing key technologies like artificial intelligence in the near future, the authors of a special government-backed study told reporters on Monday.

If China wins the technological competition, it can use its advancements in artificial intelligence and biological technology to enhance its own country’s economy, military and society to the detriment of others, said Bob Work, former deputy defense secretary and co-chair of the Special Competitive Studies Project, which examined international artificial intelligence and technological competition. Work is the chair of the U.S. Naval Institute Board of Directors.

Losing, in Work’s opinion, means that U.S. security will be threatened as China is able to establish global surveillance, companies will lose trillions of dollars and America will be reliant on China or other countries under Chinese influence for core technologies.

Microsoft’s augmented reality headset, the HoloLens, has been in the works for years now, but it’s been a while since we’ve heard any news. We were seeing demos of it way back in 2015, but Microsoft has been pretty quiet on the tech in recent years when it comes to a consumer release.

What we’ve heard tons about is Microsoft’s deal to supply the United States Army with HoloLens tech. We first got wind of the deal back in 2018, with talks of a $480 million contract to help “increase lethality” in combat missions. It wasn’t until 2021 that Microsoft officially signed a much pricier $22 billion contract with the Army for a military-grade HoloLens supply.

A team of scientists is using the tools offered by the HBP’s digital research infrastructure EBRAINS to address one of the oldest enigmas in neuroscience: the dichotomy of brain structure and function.

Every human brain is different. But even with structural differences, individual brains function in similar ways. In other words, brains with completely different structural configurations can function alike. At the same time, a structural change may cause loss of function in one brain, but have no consequences in another individual. Or a drug cocktail may be effective for one patient, and have no effect on another.