Competition between the U.S. and China in quantum computing revolves, in part, around the role such a system could play in breaking the encryption that makes things secure on the internet.
Truly useful quantum computing applications could be as much as a decade away, Aaronson says. Initially, these tools would be highly specialized.
“The way I put it is that we’re now entering the very, very early, vacuum-tube era of quantum computers,” he says.
Massive-scale particle physics produces correspondingly large amounts of data – and this is particularly true of the Large Hadron Collider (LHC), the world’s largest particle accelerator, which is housed at the European Organization for Nuclear Research (CERN) in Switzerland. In 2026, the LHC will receive a massive upgrade through the High Luminosity LHC (HL-LHC) Project. This will increase the LHC’s data output by five to seven times – billions of particle events every second – and researchers are scrambling to prepare big data computing for this deluge of particle physics data. Now, researchers at Lawrence Berkeley National Laboratory are working to tackle high volumes of particle physics data with quantum computing.
When a particle accelerator runs, particle detectors record data points marking where particles crossed certain thresholds in the accelerator. Researchers then attempt to reconstruct precisely how the particles traveled through the accelerator, typically using some form of computer-aided pattern recognition.
This project, which is led by Heather Gray, a professor at the University of California, Berkeley, and a particle physicist at Berkeley Lab, is called Quantum Pattern Recognition for High-Energy Physics (or HEP.QPR). In essence, HEP.QPR aims to use quantum computing to speed this pattern recognition process. HEP.QPR also includes Berkeley Lab scientists Wahid Bhimji, Paolo Calafiura and Wim Lavrijsen.
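To make the pattern-recognition step concrete: classically, it amounts to grouping individual detector hits into candidate particle tracks. The toy sketch below is not the HEP.QPR algorithm; the detector model (equally spaced layers, one-dimensional hit positions, straight-line tracks) is an illustrative assumption chosen to keep the example short.

```python
# Toy track reconstruction: group detector hits into straight-line tracks.
# The detector model (equally spaced layers, 1-D hit positions) is an
# illustrative assumption, not the HEP.QPR setup.

def reconstruct_tracks(hits, tol=0.1):
    """hits: list of (layer, x) tuples. Returns hits grouped by track."""
    remaining = set(hits)
    all_layers = {h[0] for h in hits}
    tracks = []
    for seed in sorted(h for h in hits if h[0] == 0):
        if seed not in remaining:
            continue
        # Pair the seed with a layer-1 hit to define a line x = x0 + slope*layer
        for partner in sorted(h for h in remaining if h[0] == 1):
            slope = partner[1] - seed[1]
            candidate = [h for h in remaining
                         if abs(h[1] - (seed[1] + slope * h[0])) <= tol]
            # Accept the candidate only if it picks up a hit in every layer
            if {h[0] for h in candidate} == all_layers:
                tracks.append(sorted(candidate))
                remaining -= set(candidate)
                break
    return tracks

# Two straight tracks crossing four detector layers
hits = [(l, 1.0 + 0.5 * l) for l in range(4)] + \
       [(l, 3.0 - 0.25 * l) for l in range(4)]
print(reconstruct_tracks(hits))  # two groups of four hits each
```

Real LHC reconstruction works in three dimensions with curved tracks in a magnetic field and billions of events, which is why the combinatorics explode and why the HEP.QPR team is exploring quantum approaches.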
It is in this second phase that Darwinian evolutionary rivers will merge with the rivers of intelligent designers, represented by scientists, programmers and engineers, who will fuse organic natural biology, synthetic biology, and digital technology into a unified whole that future generations will deem their anatomy. The merger will serve to afford greater intelligence and longer, healthier lives. In exchange, we will relinquish actual autonomy for apparent autonomy, where what was once considered “free will” will be supplanted by the deterministic logic of machinery somewhere in the mainstream of our unconscious.
Although in-the-body technology will have an explosive effect on commerce, entertainment, and employment, in the near term the concentration will be on medical devices, such as the innocuous pacemaker (essentially a working silicon-based computer, with sensors, memories, and a stimulation device with telecommunications to the outside world). In a second epoch, these devices will be gradually downsized by advances in synthetic DNA and molecular- and nano-sized processors, each deployed alongside and within cells and organs as permanent non-organic, internal adjuncts to our anatomy for use as nano-prosthetics, nano-stimulators/suppressors, artificial organ processors, metabolic and cognitive enhancers, and permanent diagnostic tools to ensure our physical and psychological well-being as we head toward a practically interminable lifetime.[6]
Will a widespread practice of installing technology into the body fundamentally change human essence? Our sense of self-sufficiency, authenticity, or individual identity? Will it change our numerical identity, the one “I” as some static aspect of ourselves (self-consciousness as idealized by Locke)? Or will it change our narrative identity, our unseen internal human form, to eventually redefine what it means to be human?[7]
Now, five years later, their gamble appears to have paid off. Not only did New Horizons achieve a next-to-flawless flyby of Arrokoth, the most distant object ever visited, but buried in its gigabytes of data—which have been trickling back to Earth ever since the New Year’s Day 2019 rendezvous—lies empirical evidence that strikes against a classic theory of how planets form. The New Horizons team published their latest analysis of the ancient body and how it came to be in a trio of papers appearing in Science last week.
Labs around the world are racing to develop new computing and sensing devices that operate on the principles of quantum mechanics and could offer dramatic advantages over their classical counterparts. But these technologies still face several challenges, and one of the most significant is how to deal with “noise”—random fluctuations that can eradicate the data stored in such devices.
A new approach developed by researchers at MIT could provide a significant step forward in quantum error correction. The method involves fine-tuning the system to address the kinds of noise that are the most likely, rather than casting a broad net to try to catch all possible sources of disturbance.
The analysis is described in the journal Physical Review Letters, in a paper by MIT graduate student David Layden, postdoc Mo Chen, and professor of nuclear science and engineering Paola Cappellaro.
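The intuition behind tailoring error correction to the dominant noise can be illustrated with a classical analogy. This sketch is not the MIT scheme; it simply shows how a repetition code spends all of its redundancy guarding against the one error type assumed most likely, rather than against every possible disturbance.

```python
# Classical analogy for noise-tailored error correction (not the MIT scheme):
# a 3-bit repetition code corrects any single bit flip -- the dominant error
# in this toy noise model -- by majority vote.

def encode(bit):
    return [bit] * 3            # redundancy aimed at the expected noise

def apply_bit_flip(codeword, position):
    flipped = codeword.copy()
    flipped[position] ^= 1      # the "most likely" error in this model
    return flipped

def decode(codeword):
    return 1 if sum(codeword) >= 2 else 0   # majority vote

# Every single bit flip, at every position, is corrected
for bit in (0, 1):
    for pos in range(3):
        noisy = apply_bit_flip(encode(bit), pos)
        assert decode(noisy) == bit
print("all single bit flips corrected")
```

The quantum case is harder because errors are continuous and measurement disturbs the state, but the design principle is the same: match the code to the noise you actually expect.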
Astronomers are rediscovering how calculations made by the ‘human computer’ Elizabeth Williams contributed to the first observations of Pluto 90 years ago.
After 16 years of blindness, Bernardeta Gomez can see again—without using her biological eyes. Scientists plugged a bionic eye directly into her brain through a computer port surgically embedded in her skull.
The vision system is being honed by neuroengineer Eduardo Fernandez in his lab at Miguel Hernández University, and it comprises a few different parts, according to MIT Technology Review.
A pair of glasses fitted with a camera connects to a computer, which translates the live video feed into electronic signals. Those signals travel via a cable to the port surgically embedded in the back of Gomez’s skull, which in turn connects to an implant in the visual cortex of her brain.
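The signal chain described above can be sketched in miniature. The electrode-grid size and the simple thresholding below are illustrative assumptions, not details of Fernandez’s actual device: a camera frame is reduced to a coarse on/off stimulation pattern for a cortical electrode array.

```python
# Toy sketch of a camera-to-implant signal chain (grid size and thresholding
# are illustrative assumptions, not the actual device):
# camera frame -> computer translation -> stimulation pattern for the implant.

def frame_to_stimulation(frame, grid=10, threshold=128):
    """Downsample a grayscale frame (list of pixel rows) to a grid x grid
    on/off stimulation pattern for a cortical electrode array."""
    rows, cols = len(frame), len(frame[0])
    pattern = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            # Average the block of pixels that maps to this electrode
            ys = range(gy * rows // grid, (gy + 1) * rows // grid)
            xs = range(gx * cols // grid, (gx + 1) * cols // grid)
            block = [frame[y][x] for y in ys for x in xs]
            row.append(1 if sum(block) / len(block) >= threshold else 0)
        pattern.append(row)
    return pattern

# A bright square on a dark background becomes a cluster of active electrodes
frame = [[255 if 20 <= x < 60 and 20 <= y < 60 else 0 for x in range(100)]
         for y in range(100)]
pattern = frame_to_stimulation(frame)
print(sum(map(sum, pattern)), "electrodes active")  # prints: 16 electrodes active
```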