There is great interest in using quantum computers to efficiently simulate a quantum system’s dynamics, a task that is intractable for existing classical computers. Little attention, however, has been given to quantum simulation of a classical nonlinear continuum system such as a viscous fluid, even though this too is hard for classical computers. Such fluids obey the Navier–Stokes nonlinear partial differential equations, whose solution is essential to the aerospace industry, weather forecasting, plasma magneto-hydrodynamics, and astrophysics. Here we present a quantum algorithm for solving the Navier–Stokes equations. We test the algorithm by using it to find the steady-state inviscid, compressible flow through a convergent-divergent nozzle, both with and without a shockwave present.
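For reference (standard compressible-flow material, not taken from the paper itself): in one common form, with constant viscosity $\mu$ and Stokes' hypothesis, the momentum balance of the Navier–Stokes system reads

$$\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \frac{\mu}{3}\,\nabla(\nabla\cdot\mathbf{u}),$$

supplemented by mass and energy conservation. In the inviscid, quasi-one-dimensional limit relevant to the nozzle test case, steady flow is governed by the classical area–Mach relation

$$\frac{A}{A^{*}} = \frac{1}{M}\left[\frac{2}{\gamma+1}\left(1+\frac{\gamma-1}{2}M^{2}\right)\right]^{\frac{\gamma+1}{2(\gamma-1)}},$$

where $A^{*}$ is the throat area, $M$ the local Mach number, and $\gamma$ the ratio of specific heats.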

In 2015, the Laser Interferometer Gravitational-Wave Observatory (LIGO) made history with the first direct detection of gravitational waves—ripples in space and time—produced by a pair of colliding black holes.

Since then, LIGO and its sister detector in Europe, Virgo, have detected gravitational waves from dozens of black hole mergers as well as from collisions between neutron stars, a related class of stellar remnants. At the heart of LIGO’s success is its ability to measure the stretching and squeezing of the fabric of space-time on scales 10 thousand trillion times smaller than the width of a human hair.
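To put that figure in perspective with a rough back-of-the-envelope check (taking a hair's width to be about $10^{-4}$ m, a round-number assumption): $10^{-4}\,\mathrm{m} / 10^{16} = 10^{-20}\,\mathrm{m}$, roughly a hundred-thousandth of the diameter of a proton ($\sim 10^{-15}$ m).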

As incomprehensibly small as these measurements are, LIGO’s precision has continued to be limited by the laws of quantum physics. At subatomic scales, empty space is filled with a faint crackling of quantum noise, which interferes with LIGO’s measurements and restricts how sensitive the observatory can be.

When scientists measure a particle, it seems to collapse to one fixed state. Yet no one can be sure what’s causing collapse, also called reduction of the state. Some scientists and philosophers even think that wave function collapse is an elaborate illusion. This debate is called the measurement problem in quantum mechanics.

The measurement problem has led many physicists and philosophers to believe that a conscious observer is somehow acting on quantum particles. One proposal is that a conscious observer causes collapse. Another is that a conscious observer causes the universe to split apart, spiralling out alternate realities. These worlds would be parallel yet inaccessible to us, so that we only ever see things in one single state in whichever possible world we’re stuck in. This is the Multiverse, or Many Worlds, theory.

“The point of view that it is consciousness that reduces the state is really an absurdity,” says Penrose, adding that a belief in Many Worlds is a phase that every physicist, including himself, eventually outgrows. “I shouldn’t be so blunt, because very distinguished people seem to have taken that view.” Penrose demurs: he politely but unequivocally waves off the idea that a conscious observer collapses wave functions by looking at them. Likewise, he dismisses the view that a conscious observer spins off near-infinite universes with a glance. “That’s making consciousness do the job of collapsing the wave function without having a theory of consciousness,” says Penrose. “I’m turning it around and I’m saying whatever consciousness is, for quite different reasons, I think it does depend on the collapse of the wave function. On that physical process.”

What’s causing collapse? “It’s an objective phenomenon,” insists Penrose. He’s convinced this objective phenomenon has to be driven by a fundamental force: gravity. Gravity is a central player in all of classical physics, yet it is conspicuously missing from quantum mechanics.
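For context, Penrose's published objective-reduction proposal (not spelled out in the interview itself) attaches a concrete timescale to gravitationally induced collapse:

$$\tau \approx \frac{\hbar}{E_{G}},$$

where $E_{G}$ is the gravitational self-energy of the difference between the mass distributions of the two superposed states. The more mass a superposition displaces, the larger $E_{G}$ and the sooner the state should spontaneously reduce.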

The Chicago region has been named an official US Regional Technology and Innovation Hub for quantum technologies by the Biden-Harris administration, a designation that opens the door to new federal funding and recognizes the growing strength of an ecosystem poised to become the heart of the nation’s quantum economy. The Bloch Tech Hub (pronounced “block”), a coalition of industry, academic, government, and nonprofit stakeholders led by the Chicago Quantum Exchange, was one of 31 designees from nearly 400 applications across the country.

The selection, announced Monday morning by the White House and the US Department of Commerce’s Economic Development Administration (EDA), is the first phase of a federal initiative designed to “supercharge” innovation economies that have the potential to become global leaders in a critical technology within a decade. As a recipient of the US Tech Hubs designation, The Bloch is now eligible to apply for the program’s second phase, which could include millions of dollars in funding to implement the hub’s activities. It was one of two US Tech Hubs designated in Illinois, the other focused on biomanufacturing.

“Home to world-class institutions and first-rate research centers, Illinois is transforming technology, biomanufacturing, and innovation at every turn,” said Illinois Governor JB Pritzker. “I couldn’t be prouder that the Biden Administration has selected the Chicago Quantum Exchange’s The Bloch and the University of Illinois at Urbana-Champaign’s iFAB Hub as two of just 31 inaugural tech hubs — opening the door for even more investment, advancement, and discovery. There’s no doubt that the rest of the nation has caught on to our great state’s status as an innovation powerhouse — and our future couldn’t be brighter.”

MIT researchers expand the range of quantum behaviors that can be replicated in fluidic systems, offering a new perspective on wave-particle duality.

The headline challenge for building a quantum computer is well known: the quantum states exhibited by such a computer’s computational building blocks—its qubits—must be long-lived and robust against disruption by the environment. But even the most resilient qubits are useless for quantum computing if they can’t be combined in sufficient numbers. Maciej Malinowski at Oxford Ionics, UK, and his colleagues have now tackled this problem with a more efficient architecture for controlling qubits [1]. Applying their “Wiring using Integrated Switching Electronics” (WISE) approach to trapped-ion qubits specifically, they present a design for a quantum computer with 1,000 qubits—far more than the few tens of qubits found in the largest trapped-ion devices commercially available today.

Trapped-ion quantum computers share much of their solid-state chip technology with modern classical computers, but they have added complexity. Whereas the bits in a classical computer are written and read using simple signals sent via a small number of electrodes, the qubits in a trapped-ion computer are controlled using subtler, more varied signals, which are delivered by as many as ten separate electrodes per qubit. As the number of qubits in a quantum computer increases, fitting these electrodes and signal generators on the chip—not to mention dissipating the heat that they generate—gets more difficult.

In their WISE approach, Malinowski and his colleagues use fewer signal generators and move them off the chip. Instead of every individual qubit having its own dedicated control structure, the signal from one signal generator is relayed to multiple qubits via a small number of local switches. Malinowski says that a trapped-ion quantum computer employing their control method could be built using existing semiconductor fabrication techniques.
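To make the scaling argument concrete, here is a minimal counting sketch in Python. The grouping scheme and every constant in it are illustrative assumptions chosen for this sketch, not details of the actual WISE design:

```python
# Rough comparison of control-line counts for a trapped-ion chip:
# a dedicated-generator layout versus a shared-generator layout in
# which one off-chip signal generator feeds a group of qubits through
# on-chip switches. All constants are illustrative assumptions.

ELECTRODES_PER_QUBIT = 10  # "as many as ten" electrodes per qubit (from the article)

def dedicated_lines(n_qubits: int) -> int:
    """Every qubit has its own generator and full electrode set."""
    return n_qubits * ELECTRODES_PER_QUBIT

def switched_lines(n_qubits: int, group_size: int = 50) -> int:
    """One generator per group of qubits, relayed via local switches.

    Assumes one shared signal bus per group plus one switch-select
    line per qubit; the real design may apportion these differently.
    """
    n_groups = -(-n_qubits // group_size)  # ceiling division
    return n_groups * ELECTRODES_PER_QUBIT + n_qubits

for n in (10, 100, 1000):
    print(f"{n:5d} qubits: dedicated = {dedicated_lines(n):6d} lines, "
          f"switched = {switched_lines(n):6d} lines")
```

Even with these toy numbers, the dedicated layout’s line count grows roughly ten times faster than the switched one, which is the essence of the case for moving signal generators off the chip.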

Some have described the last several millennia of human dominion over the Earth’s resources as the Anthropocene, from the Greek “anthropo,” meaning human, and “cene,” meaning recent. Recent decades in particular have been dubbed the fourth industrial revolution, owing to the accelerating pace of technological innovation that followed the advent of computers in the middle of the 20th century.

In the past seventy years, computation has transformed every aspect of society, enabling efficient production at an accelerated rate, displacing human labour from chiefly production to services, and exponentially augmenting information storage, generation, and transmission through telecommunications.

How did we get here? Fundamentally, technological advancement draws on existing science. Without an understanding of the nature of electromagnetism and the structure of atoms, we wouldn’t have electricity or the integrated circuitry that powers computers. It was only a matter of time, then, before we thought of exploiting quantum mechanics, our most accurate and fundamental description of physical reality, for computation.

Quantum mechanics is full of weird phenomena, but perhaps none as weird as the role measurement plays in the theory. Since a measurement tends to destroy the “quantumness” of a system, it seems to be the mysterious link between the quantum and classical worlds.

Furthermore, when dealing with a vast system of quantum data units called “qubits,” the impact of measurements can lead to profoundly different outcomes, even driving the emergence of entirely new phases of quantum information.

This happens when two competing effects come to a head: interactions and measurement. In a quantum system, when the qubits interact with one another, their information becomes shared nonlocally in an “entangled state.”
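A small numerical toy can make this competition concrete. The sketch below is a generic statevector experiment (random two-qubit unitaries applied in a brick-wall pattern, interleaved with probabilistic single-qubit measurements; the chain length, depth, and rates are all illustrative choices, not taken from any particular study). When interactions dominate, the half-chain entanglement entropy saturates near its maximum; when measurement dominates, it stays close to zero:

```python
import numpy as np

# Toy illustration of interactions competing with measurements in a
# small qubit chain. All parameters are illustrative assumptions.
rng = np.random.default_rng(0)
N = 8  # number of qubits in the chain

def apply_two_qubit(state, U, i):
    """Apply a 4x4 unitary U to qubits (i, i+1) of an N-qubit statevector."""
    psi = state.reshape([2] * N)
    psi = np.moveaxis(psi, [i, i + 1], [0, 1]).reshape(4, -1)
    psi = (U @ psi).reshape([2, 2] + [2] * (N - 2))
    return np.moveaxis(psi, [0, 1], [i, i + 1]).reshape(-1)

def measure_qubit(state, i):
    """Projectively measure qubit i in the Z basis; collapse and renormalize."""
    psi = np.moveaxis(state.reshape([2] * N), i, 0).reshape(2, -1).copy()
    p0 = np.linalg.norm(psi[0]) ** 2
    outcome = 0 if rng.random() < p0 else 1
    psi[1 - outcome] = 0.0
    psi /= np.linalg.norm(psi)
    return np.moveaxis(psi.reshape([2] + [2] * (N - 1)), 0, i).reshape(-1)

def half_chain_entropy(state):
    """Von Neumann entanglement entropy (bits) of the left half of the chain."""
    s = np.linalg.svd(state.reshape(2 ** (N // 2), -1), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

def random_unitary():
    """Haar-random 4x4 unitary via QR of a complex Gaussian matrix."""
    z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))

for p_meas in (0.0, 0.2, 0.8):  # measurement rate per qubit per layer
    state = np.zeros(2 ** N, dtype=complex)
    state[0] = 1.0  # start in |00...0>
    for layer in range(20):
        for i in range(layer % 2, N - 1, 2):  # brick-wall entangling layer
            state = apply_two_qubit(state, random_unitary(), i)
        for i in range(N):
            if rng.random() < p_meas:
                state = measure_qubit(state, i)
    print(f"measurement rate {p_meas:.1f}: half-chain entropy "
          f"{half_chain_entropy(state):.2f} bits")
```

Running it shows the qualitative effect described above: with no measurements the entropy climbs toward its maximum, while frequent measurements keep the chain close to an unentangled product state.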