Researchers from the University of Copenhagen have developed a new technique that keeps quantum bits of light stable at room temperature instead of only at −270 degrees Celsius. Their discovery saves power and money and marks a breakthrough in quantum research.

As almost all of our private information is digitized, it is increasingly important that we find ways to protect our data and ourselves from being hacked.

Quantum cryptography is the researchers’ answer to this problem, and more specifically a certain kind of qubit consisting of single photons: particles of light.

Given the importance of the Kirkwood–Dirac quasiprobability’s nonclassical values, two natural questions arise: Under what conditions does this quasiprobability behave anomalously? And how anomalous can its behaviour get? That’s what we wanted to explore.

What did you do in the paper?

We pinned down conditions under which the Kirkwood–Dirac quasiprobability assumes nonclassical values. Using these conditions, one can calculate which experiments can exhibit certain types of quantum advantages. We also put a “ceiling” on how much nonclassicality one Kirkwood–Dirac quasiprobability distribution can contain.
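A concrete, minimal sketch of the object under discussion may help here. The following NumPy snippet (our illustration, not code from the paper) computes the Kirkwood–Dirac distribution q(a, b) = ⟨b|a⟩⟨a|ρ|b⟩ for a single qubit, using the Z eigenbasis for {|a⟩} and the X eigenbasis for {|b⟩}; the state is chosen arbitrarily so that one entry goes negative, the kind of nonclassical value the conditions above concern:

```python
import numpy as np

# Kirkwood-Dirac distribution q(a, b) = <b|a><a|rho|b> for a qubit.
# Illustration only (not the paper's code): Z eigenbasis {|a>},
# X eigenbasis {|b>}, and a state picked so one entry is negative.
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]          # |0>, |1>
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2),                   # |+>
           np.array([1.0, -1.0]) / np.sqrt(2)]                  # |->

psi = np.array([np.cos(3 * np.pi / 8), np.sin(3 * np.pi / 8)])  # arbitrary pure state
rho = np.outer(psi, psi.conj())

kd = np.zeros((2, 2), dtype=complex)
for i, a in enumerate(z_basis):
    for j, b in enumerate(x_basis):
        kd[i, j] = (b.conj() @ a) * (a.conj() @ rho @ b)

print(np.round(kd.real, 3))      # one entry is negative (nonclassical)...
print(np.isclose(kd.sum(), 1))   # ...yet the entries still sum to 1
```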

In the search for a unifying quantum gravity theory that would reconcile general relativity with quantum theory, it turns out that quantum theory is more fundamental after all. Quantum mechanical principles, some physicists argue, apply to all of reality (not only the realm of the ultra-tiny), and numerous experiments confirm that assumption. After a century of Einsteinian relativistic physics going unchallenged, a new kid on the block, Computational Physics, one of the frontrunners for quantum gravity, states that spacetime is a flat-out illusion and that what we call physical reality is actually a construct of information within [quantum neural] networks of conscious agents. In light of the physics of information, computational physicists eye a new theory as an “It from Qubit” offspring, necessarily incorporating consciousness into the new theoretic models and deeming spacetime, mass-energy, and gravity emergent from information processing.

In fact, I expand on the foundations of such a new physics of information, also referred to as [Quantum] Computational Physics, Quantum Informatics, Digital Physics, and Pancomputationalism, in my recent book The Syntellect Hypothesis: Five Paradigms of the Mind’s Evolution. The Cybernetic Theory of Mind I’m currently developing is based on reversible quantum computing and projective geometry at large. This ontological model, a “theory of everything” of mine, agrees with certain quantum gravity contenders, such as M-Theory on fractal dimensionality and Emergence Theory on the code-theoretic ontology, but admittedly goes beyond all current models by treating spacetime, mass-energy, and gravity as emergent from information processing within a holographic, multidimensional matrix, with the Omega Singularity as the source.

There are plenty of cosmological anomalies of late that make us question the traditional interpretation of relativity. First off, the cosmological constant, which Albert Einstein (1879–1955) himself called “the biggest blunder” of his scientific career, is tied to the rate of the expansion of our Universe, or the Hubble constant, whose value is the subject of a very important discrepancy: it changes based on how scientists try to measure it. New results from the Hubble Space Telescope have now “raised the discrepancy beyond a plausible level of chance,” according to one of the latest papers published in the Astrophysical Journal. We are stumbling ever more often on all kinds of discrepancies in relativistic physics and the standard cosmological model. Not only is the Hubble constant “constantly” called into question, but even the speed of light, on which Einsteinian theories are based, shows such discrepancies when measured by different methods and turns out to be not really “constant.”

Circa 2019


As quantum computing enters the industrial sphere, questions about how to manufacture qubits at scale are becoming more pressing. Here, Fernando Gonzalez-Zalba, Tsung-Yeh Yang and Alessandro Rossi explain why decades of engineering may give silicon the edge.

In the past two decades, quantum computing has evolved from a speculative playground into an experimental race. The drive to build real machines that exploit the laws of quantum mechanics, and to use such machines to solve certain problems much faster than is possible with traditional computers, will have a major impact in several fields. These include speeding up drug discovery by efficiently simulating chemical reactions; better uses of “big data” thanks to faster searches in unstructured databases; and improved weather and financial-market forecasts via smart optimization protocols.
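To make the “faster searches in unstructured databases” claim concrete, here is a short classical simulation (our sketch, not the authors’ code) of Grover’s algorithm, the standard source of that quadratic speed-up, finding one marked item among 16 in O(√N) oracle calls rather than O(N):

```python
import numpy as np

# Classical simulation of Grover's search over N = 16 items (4 qubits).
# Illustration only: plain NumPy state vectors, not quantum hardware.
N = 16                     # database size
marked = 11                # index of the item we are searching for

state = np.full(N, 1 / np.sqrt(N))          # uniform superposition

iterations = int(np.pi / 4 * np.sqrt(N))    # ~3 for N = 16, O(sqrt(N)) overall
for _ in range(iterations):
    state[marked] *= -1                     # oracle: flip the marked amplitude
    state = 2 * state.mean() - state        # diffusion: inversion about the mean

print(f"P(marked) = {state[marked] ** 2:.3f}")   # ~0.96 after 3 iterations
```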

We are still in the early stages of building these quantum information processors. Recently, a team at Google reportedly demonstrated a quantum machine that outperforms classical supercomputers, although this so-called “quantum supremacy” is expected to be too limited for useful applications. It is nonetheless an important milestone for the field, a testament to the fact that progress has become substantial and fast-paced. The prospect of significant commercial revenues has now attracted the attention of large computing corporations. By channelling their resources into collaborations with academic groups, these firms aim to push research forward at a faster pace than either sector could accomplish alone.

Toshiba’s Cambridge Research Laboratory has achieved quantum communications over optical fibres exceeding 600 km in length, three times further than the previous world record distance.

The breakthrough will enable long-distance, quantum-secured information transfer between metropolitan areas and is a major advance towards building a future Quantum Internet.

The term “Quantum Internet” describes a global network of quantum computers, connected by long-distance quantum communication links. This technology will improve the current Internet by offering several major benefits, such as the ultra-fast solving of complex optimisation problems in the cloud, a more accurate global timing system, and ultra-secure communications. Personal data, medical records, bank details, and other information will be physically impossible for hackers to intercept. Several large government initiatives to build a Quantum Internet have been announced in China, the EU and the USA.

German nanotechnology specialist attocube says its attoDRY800 cryostat enables quantum scientists to “reclaim the optical table” and focus on their research, not the experimental set-up.

Twin-track innovations in cryogenic cooling and optical table design are “creating the space” for fundamental scientific breakthroughs in quantum communications, allowing researchers to optimize the performance of secure, long-distance quantum key distribution (QKD) using engineered single-photon-emitting light sources.

In a proof-of-concept study last year, Tobias Heindel and colleagues in the Institute of Solid State Physics at the Technische Universität (TU) Berlin, Germany, implemented a basic QKD testbed in their laboratory. The experimental set-up uses a semiconductor quantum-dot emitter to send single-photon pulses along an optical fibre to a four-port receiver that analyses the polarization state of the transmitted qubits.
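For readers unfamiliar with how such a testbed turns polarized single photons into a shared secret key, here is a minimal classical simulation (our illustration; we are assuming a BB84-style protocol, which the four-port polarization receiver suggests) of the basis-sifting step:

```python
import numpy as np

# BB84-style basis sifting, classically simulated (illustration only;
# assumes the testbed runs a BB84-like polarization protocol).
rng = np.random.default_rng(seed=1)
n = 20                                    # number of single-photon pulses

alice_bits = rng.integers(0, 2, n)        # Alice's raw key bits
alice_bases = rng.integers(0, 2, n)       # 0 = rectilinear (+), 1 = diagonal (x)
bob_bases = rng.integers(0, 2, n)         # Bob picks bases independently

# If the bases match, Bob recovers the bit; otherwise his outcome is random.
match = alice_bases == bob_bases
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# Bases are compared publicly; only the matching rounds (~half) are kept.
sifted_alice = alice_bits[match]
sifted_bob = bob_bits[match]
print("sifted key bits:", sifted_alice)
print("Alice and Bob agree:", np.array_equal(sifted_alice, sifted_bob))
```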