Editor’s Note: EDRM is proud to publish Ralph Losey’s advocacy and analysis. The opinions and positions are Ralph Losey’s copyrighted work. All images in the article are by Ralph Losey using AI. This article is published here with permission.
The Nobel Prize in Physics was just awarded to quantum physics pioneers John Clarke, Michel H. Devoret, and John M. Martinis for discoveries they made at UC Berkeley in the 1980s. They proved that quantum tunneling, where subatomic particles can break through seemingly impenetrable barriers, can also occur in the macroscopic world of electrical circuits. So yes, Schrödinger’s cat really could die.
Two Sydney PhD students have pulled off a remarkable space science feat from Earth—using AI-driven software to correct image blurring in NASA’s James Webb Space Telescope. Their innovation, called AMIGO, fixed distortions in the telescope’s infrared camera, restoring its ultra-sharp vision without the need for a space mission.
Like humans, artificial intelligence learns by trial and error, but traditionally it requires humans to set the ball rolling by designing the algorithms and rules that govern the learning process. As AI technology advances, however, machines are increasingly taking on this design work themselves. An example is a new AI system developed by researchers that invented its own way to learn, resulting in an algorithm that outperformed human-designed algorithms on a series of complex tasks.
For decades, human engineers have designed the algorithms that agents use to learn, especially reinforcement learning (RL), where an AI learns by receiving rewards for successful actions. While learning comes naturally to humans and animals, thanks to millions of years of evolution, it has to be explicitly taught to AI. This process is often slow and laborious and is ultimately limited by human intuition.
Taking their cue from evolution, a process of random variation and selection, the researchers created a large digital population of AI agents. These agents tried to solve numerous tasks in many different, complex environments, each using a particular learning rule.
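The two ideas above, reward-driven learning and evolutionary search over the learning rule itself, can be illustrated with a deliberately small sketch. This is not the researchers' actual system: the task (a three-armed bandit), the rule parameterization (just an exploration rate and a learning rate), and all function names are assumptions made for illustration.

```python
import random

rng = random.Random(42)

def evaluate(rule, steps=300):
    """Fitness of a learning rule: total reward an agent collects on a
    3-armed bandit while learning with that rule (epsilon, learning rate)."""
    epsilon, lr = rule
    true_means = [0.2, 0.8, 0.5]   # hidden average payout of each arm
    estimates = [0.0, 0.0, 0.0]    # the agent's learned value of each arm
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(3)                           # explore
        else:
            arm = max(range(3), key=lambda a: estimates[a])  # exploit
        reward = rng.gauss(true_means[arm], 0.5)             # noisy reward
        estimates[arm] += lr * (reward - estimates[arm])     # the learning rule
        total += reward
    return total

def evolve(pop_size=20, generations=15):
    """Evolve the learning rule itself: keep the rules whose agents earn
    the most reward, then mutate them to form the next generation."""
    population = [(rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=evaluate, reverse=True)
        parents = ranked[: pop_size // 2]                    # selection
        children = [
            (min(1.0, max(0.0, eps + rng.gauss(0, 0.05))),   # mutation,
             min(1.0, max(0.0, lr + rng.gauss(0, 0.05))))    # clamped to [0, 1]
            for eps, lr in parents
        ]
        population = parents + children
    return max(population, key=evaluate)

best_epsilon, best_lr = evolve()
```

In this toy setup, selection tends to push the exploration rate down and keep a nonzero learning rate, because rules that quickly identify the high-payout arm accumulate more reward. The actual research replaces these two scalars with entire learning algorithms evaluated across many environments, but the selection-and-mutation loop is the same shape.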
Would you allow a stranger to drive a camera-equipped computer around your living room? You might have already done so without even realizing it.
It all started innocently enough. I had recently bought an iLife A11 smart vacuum—a sleek, affordable, and technologically advanced robot promising effortless cleaning and intelligent navigation. As a curious engineer, I was fascinated by its inner workings. After letting it run for an entire year, my curiosity got the better of me.
I’m a bit paranoid—the good kind of paranoid. So, I decided to monitor its network traffic, as I would with any so-called smart device.
*This video was recorded at ‘Paths to Progress’ at LabWeek hosted by Protocol Labs & Foresight Institute.*
Protocol Labs and Foresight Institute are excited to invite you to apply to a 5-day mini workshop series to celebrate LabWeek, PL’s decentralized conference to further public goods. The theme of the series, Paths to Progress, is aimed at (re)igniting long-overdue progress in longevity bio, molecular nanotechnology, neurotechnology, crypto & AI, and space through emerging decentralized, open, and technology-enabled funding mechanisms.
*This mini-workshop is focused on Paths to Progress in Molecular Nanotechnology.*

Molecular manufacturing, in its most ambitious incarnation, would use programmable tools to bring molecules together into precisely bonded components, building larger structures from the ground up. This would enable general-purpose manufacturing of new materials and machines at a fraction of current waste and cost. We are currently nowhere near this ambitious goal. However, recent progress in sub-fields such as DNA nanotechnology, protein engineering, STM, and AFM provides possible building blocks for the construction of a v1 of molecular manufacturing: the molecular 3D printer. Let’s explore the state of the art and what types of innovation mechanisms could bridge the valley of death. How might we update the original nanotech roadmap; is a tech tree enough? How might we fund the highly interdisciplinary progress needed to succeed: FRO vs. DAO?
Public sentiment shifted from slightly against to strongly in favor: in 2023, half of voters opposed; by 2025, only 29% did. People fear new technology… until it is no longer new.
Expect this to happen with things like cellular agriculture (lab-grown meat), nanobots, and the like. Most people are not ideologically opposed to them; they just want enough time for these technologies to prove themselves safe.
“Opposition to autonomous vehicles is on the decline, the poll showed: In 2023, more than 50% of voters opposed driverless cars; now, it’s 29%.”
And:
“Two-thirds of voters said they support allowing fully autonomous vehicles to operate in San Francisco. It’s a significant increase from 2023, when fewer than half agreed with the sentiment.”
Founded in 1920, the NBER is a private, non-profit, non-partisan organization dedicated to conducting economic research and to disseminating research findings among academics, public policy makers, and business professionals.
Meet the caretaker AIs: guardians of planets, habitats, and civilizations. What happens when machines become the spirit and soul of the worlds they protect?
Grab one of our new SFIA mugs and make your morning coffee a little more futuristic — available now on our Fourthwall store! https://isaac-arthur-shop.fourthwall…
Visit our Website: http://www.isaacarthur.net
Join Nebula: https://go.nebula.tv/isaacarthur
Support us on Patreon: / isaacarthur
Support us on Subscribestar: https://www.subscribestar.com/isaac-a…
Facebook Group: / 1583992725237264
Reddit: / isaacarthur
Twitter: / isaac_a_arthur (and RT our future content)
SFIA Discord Server: / discord

Credits: Caretaker AI & Genus Loci 2025 Edition
Written, Produced & Narrated by: Isaac Arthur
Editors: Ludwig Luska
Graphics: Bryan Versteeg, Jeremy Jozwik, Ken York (YD Visual), Kris Holland (Mafic Studios)
Select imagery/video supplied by Getty Images
Music courtesy of Epidemic Sound: http://epidemicsound.com/creator