Producing photons one at a time on demand at room temperature is a key requirement for the rollout of a quantum internet—and the practical quantum computers that would undergird that network. The photons can be used as quantum bits (qubits), the quantum equivalent of classical computing’s 0s and 1s. Labs around the world have devised various ways to generate single photons, but these can involve complex engineering, such as doped carbon nanotubes, or costly cryogenic cooling. Simpler techniques, such as using traditional light sources, do not provide the level of control over single-photon emission that quantum networks and computers require.

Now, researchers from Tokyo University of Science (TUS) and the Okinawa Institute of Science and Technology have collaborated to develop a prototype room-temperature single-photon light source using standard materials and methods. The team described the fabrication of the prototype and its results in a recent issue of the journal Physical Review Applied.

“Our single-photon light source … increases the potential to create quantum networks—a quantum internet—that are cost-effective and accessible.” —Kaoru Sanaka, Tokyo University of Science.

Over the past twenty years, many companies, including Google, Microsoft, and IBM, have invested in quantum computing development. Investors have contributed over $5 billion to this cause. The aim is to use quantum physics properties to process information in ways that traditional computers cannot. Quantum computing could impact various fields, including drug discovery, cryptography, finance, and supply-chain logistics. However, the excitement around this technology has led to a mix of claims, making it hard to gauge the actual progress.

The main challenge in developing quantum computers is managing the ‘noise’ that can interfere with these sensitive systems. Quantum systems can be disrupted by disturbances like stray photons from heat, random signals from nearby electronics, or physical vibrations. This noise can cause errors or stop a quantum computation. Regardless of the processor size or the technology’s potential uses, a quantum computer will not surpass a classical computer unless the noise is controlled.
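
To see why this matters, consider a toy model (not drawn from the research described here): the short Python sketch below applies a weak depolarizing error, a standard textbook noise model, to a single qubit at every step, and the fidelity of a simple superposition state steadily decays.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Single-qubit depolarizing channel: with probability p the state is
    hit by a random Pauli error (X, Y, or Z with equal likelihood)."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

# Start in the superposition state |+> = (|0> + |1>) / sqrt(2)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

# A little noise at every step is enough to steadily wash out the state
for step in range(1, 11):
    rho = depolarize(rho, p=0.02)
    fidelity = np.vdot(plus, rho @ plus).real
    print(f"step {step:2d}  fidelity with |+> = {fidelity:.3f}")
```

Even with only a 2% error probability per step, the fidelity falls by more than ten percent after ten steps, which is why per-operation error rates must be driven extremely low, or actively corrected, before long computations become feasible.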

For a while, researchers thought they might have to tolerate some noise in their quantum systems, at least temporarily. They looked for applications that could still work effectively with this constraint. However, recent theoretical and experimental advances suggest that the noise issue might soon be resolved. A mix of hardware and software strategies is showing potential for reducing and correcting quantum errors. Earl Campbell, vice president of quantum science at Riverlane, a UK-based quantum computing company, believes there is growing evidence to be hopeful about quantum computing’s future.

Scientific knowledge can progress rapidly, yet its social, economic, and political impacts often unfold at a painstakingly slow pace. The medicine of the 21st century draws upon genetic and embryological breakthroughs of the 19th century. Our current technology is firmly grounded in quantum physics, which was formulated a century ago. And the topic of the day, artificial intelligence (AI), traces its origins to the secret weapons research during World War II.

In 1935, the brilliant British mathematician Alan Turing envisioned a conceptual computer. His genius would later lead him to crack the Enigma code used by German submarines for secret communications during the war. Turing’s contributions extended beyond cryptography: he introduced fundamental concepts of AI, including the training of artificial neural networks. Benedict Cumberbatch portrayed Turing in the 2014 film The Imitation Game, which won the Academy Award for Best Adapted Screenplay. All this historical context brings us to the heart of the current AI revolution.

AI uses neural networks, also known as artificial neural networks, which are composed of multiple layers of artificial neurons. Each neuron receives numerous inputs from the layer below and produces a single output to the layer above, similar to the dendrites and axon of natural neurons. As information progresses through each layer, it gradually becomes more abstract, resembling the processing that occurs in the visual cortex of our brains.
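
To make the layered picture concrete, here is a minimal two-layer network in Python with numpy. It is purely illustrative: the layer sizes and the ReLU nonlinearity are arbitrary choices, and the weights are random and untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights, biases):
    """One layer of artificial neurons: each neuron forms a weighted sum of
    all its inputs, adds a bias, and passes the result through a simple
    nonlinearity (ReLU), producing a single output value."""
    return np.maximum(0.0, inputs @ weights + biases)

# A tiny untrained network: 8 input features -> 4 hidden neurons -> 2 outputs
x = rng.normal(size=8)                          # "raw" input, e.g. pixel values
w1, b1 = rng.normal(size=(8, 4)), np.zeros(4)   # weights and biases, layer 1
w2, b2 = rng.normal(size=(4, 2)), np.zeros(2)   # weights and biases, layer 2

hidden = layer(x, w1, b1)        # first, lower-level representation
output = layer(hidden, w2, b2)   # second, more abstract representation
print("hidden:", hidden)
print("output:", output)
```

Training consists of adjusting the weights so the final outputs match desired answers; the layered structure itself is what the paragraph above describes.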

Almost 11 million internet-exposed SSH servers are vulnerable to the Terrapin attack that threatens the integrity of some SSH connections.

The Terrapin attack targets the SSH protocol, affecting both clients and servers, and was developed by academic researchers from Ruhr University Bochum in Germany.

It manipulates sequence numbers during the handshake process to compromise the integrity of the SSH channel, particularly when specific encryption modes like ChaCha20-Poly1305 or CBC with Encrypt-then-MAC are used.
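
The researchers’ own scanner is not reproduced here. As a rough illustration only, the Python sketch below (a hypothetical, simplified probe: no multi-line banners, no IPv6 handling, no error recovery) reads the algorithm lists a server advertises in its SSH_MSG_KEXINIT and flags the combination Terrapin needs: ChaCha20-Poly1305, or a CBC cipher together with an Encrypt-then-MAC MAC, offered by a server that does not also advertise the strict key exchange countermeasure (kex-strict-s-v00@openssh.com).

```python
import socket
import struct

def read_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed early")
        buf += chunk
    return buf

def read_name_list(payload, offset):
    """Parse one SSH name-list: uint32 length + comma-separated names."""
    (length,) = struct.unpack(">I", payload[offset:offset + 4])
    names = payload[offset + 4:offset + 4 + length].decode().split(",")
    return names, offset + 4 + length

def check_terrapin_exposure(host, port=22):
    with socket.create_connection((host, port), timeout=5) as sock:
        # Identification string exchange (RFC 4253): read the server's banner,
        # then send our own so the server proceeds to key exchange.
        banner = b""
        while not banner.endswith(b"\n"):
            banner += read_exact(sock, 1)
        sock.sendall(b"SSH-2.0-TerrapinCheck\r\n")

        # The first binary packet from the server should be SSH_MSG_KEXINIT (20).
        packet_len = struct.unpack(">I", read_exact(sock, 4))[0]
        packet = read_exact(sock, packet_len)
        padding_len = packet[0]
        payload = packet[1:packet_len - padding_len]
        if payload[0] != 20:
            raise ValueError("expected SSH_MSG_KEXINIT")

        offset = 1 + 16  # skip the message type and the 16-byte random cookie
        kex, offset = read_name_list(payload, offset)      # kex algorithms
        _, offset = read_name_list(payload, offset)        # host key algorithms
        enc_c2s, offset = read_name_list(payload, offset)  # ciphers client->server
        enc_s2c, offset = read_name_list(payload, offset)  # ciphers server->client
        mac_c2s, offset = read_name_list(payload, offset)  # MACs client->server
        mac_s2c, offset = read_name_list(payload, offset)  # MACs server->client

        ciphers, macs = set(enc_c2s + enc_s2c), set(mac_c2s + mac_s2c)
        chacha = "chacha20-poly1305@openssh.com" in ciphers
        cbc_etm = any(c.endswith("-cbc") for c in ciphers) and \
                  any(m.endswith("-etm@openssh.com") for m in macs)
        strict_kex = "kex-strict-s-v00@openssh.com" in kex  # countermeasure

        return (chacha or cbc_etm) and not strict_kex

print("potentially exposed:", check_terrapin_exposure("example.com"))
```

A server flagged by a check like this is only potentially exposed; whether a given connection is actually at risk also depends on which modes the client negotiates.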

In the modern digital age, where data flows freely and sensitive information is constantly in transit, secure communication has become essential. Traditional encryption methods, while effective, are not immune to the evolving threat landscape. This is where quantum key distribution (QKD) emerges as a revolutionary solution, offering unmatched security for transmitting sensitive data.

The idea of quantum key distribution (QKD) dates back to Stephen Wiesner’s concept of quantum conjugate coding at Columbia University in the 1970s. Charles H. Bennett and Gilles Brassard later built on this idea, introducing the first QKD protocol, BB84, in 1984; it encodes information in nonorthogonal quantum states. Since then, QKD has matured into one of the most established quantum technologies, commercially available for over 15 years.
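
The protocol’s core idea, basis sifting, is simple enough to sketch in a few lines of Python. This is a toy model, not from the article: the quantum channel is idealized, with no eavesdropper, photon loss, error correction, or privacy amplification.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 32  # number of photons sent before sifting

# Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal)
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# Bob measures each photon in a randomly chosen basis.
# If his basis matches Alice's, he recovers her bit; otherwise his outcome is random.
bob_bases = rng.integers(0, 2, n)
bob_bits = np.where(bob_bases == alice_bases,
                    alice_bits,
                    rng.integers(0, 2, n))

# Sifting: the two publicly compare bases and keep only the matching positions
keep = alice_bases == bob_bases
sifted_alice = alice_bits[keep]
sifted_bob = bob_bits[keep]

print("sifted key length:", int(keep.sum()))
print("keys agree:", np.array_equal(sifted_alice, sifted_bob))
```

In the real protocol, each bit is carried by a photon prepared in Alice’s chosen basis; an eavesdropper who measures in the wrong basis disturbs those states and shows up as errors when Alice and Bob compare a sample of their sifted key.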

When the theoretical physicist Leonard Susskind encountered a head-scratching paradox about black holes, he turned to an unexpected place: computer science. In nature, most self-contained systems eventually reach thermodynamic equilibrium… but not black holes. The interior volume of a black hole appears to keep expanding without limit. But why? Susskind had a suspicion that a concept called computational complexity, which underpins everything from cryptography to quantum computing to the blockchain and AI, might provide an explanation.

He and his colleagues believe that the complexity of quantum entanglement continues to evolve inside a black hole long past the point of what’s called “heat death.” Now Susskind and his collaborator, Adam Brown, have used this insight to propose a new law of physics: the second law of quantum complexity, a quantum analogue of the second law of thermodynamics.

Quantum computers have the potential to outperform conventional computers on some tasks, including complex optimization problems. However, quantum computers are also vulnerable to noise, which can lead to computational errors.

Engineers have been trying to devise fault-tolerant approaches that are more resistant to noise and can thus be scaled up more reliably. One common route to fault tolerance is the preparation of magic states, which are consumed to implement so-called non-Clifford gates.
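
The team’s specific scheme is not reproduced here, but the role of a magic state can be shown with the textbook injection circuit. Below is a minimal numpy sketch (ideal and noiseless, so only illustrative) in which one magic state, a CNOT, a measurement, and a Clifford S correction together apply the non-Clifford T gate to an arbitrary qubit.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
S = np.diag([1, 1j])                              # Clifford phase gate
T = np.diag([1, np.exp(1j * np.pi / 4)])          # non-Clifford T gate (the goal)

# T-type magic state |A> = (|0> + e^{i*pi/4}|1>) / sqrt(2)
magic = np.array([1, np.exp(1j * np.pi / 4)]) / np.sqrt(2)

# CNOT with the data qubit (first) as control and the magic-state qubit as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def inject_t_gate(psi, outcome):
    """Apply T to |psi> using only a magic state, a CNOT, a measurement,
    and (for outcome 1) a Clifford S correction -- gate teleportation."""
    state = CNOT @ np.kron(psi, magic)            # entangle data with magic state
    # Project the ancilla (second qubit) onto the measured outcome |0> or |1>
    ket = np.eye(2)[outcome]
    state = np.kron(I2, np.outer(ket, ket)) @ state
    prob = np.vdot(state, state).real             # probability of this outcome
    state = state / np.sqrt(prob)
    data = state.reshape(2, 2)[:, outcome]        # remaining data-qubit state
    if outcome == 1:
        data = S @ data                           # Clifford correction
    return data, prob

psi = np.array([0.6, 0.8], dtype=complex)         # arbitrary input state
for outcome in (0, 1):
    result, prob = inject_t_gate(psi, outcome)
    overlap = abs(np.vdot(T @ psi, result))       # 1.0 means equal up to a phase
    print(f"outcome {outcome}: prob {prob:.2f}, overlap with T|psi> = {overlap:.3f}")
```

The catch is that this only works as well as the magic state is clean, which is why preparing logical magic states with high fidelity, as in the work described next, matters so much.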

Researchers at the University of Science and Technology of China, the Henan Key Laboratory of Quantum Information and Cryptography, and the Hefei National Laboratory recently demonstrated the preparation of a logical magic state with fidelity beyond the distillation threshold on a superconducting quantum processor. Their paper, published in Physical Review Letters, outlines a viable and effective strategy for generating high-fidelity logical magic states, a step toward realizing fault-tolerant quantum computing.

Computer-generated holography (CGH) represents a cutting-edge technology that employs computer algorithms to dynamically reconstruct virtual objects. This technology has found extensive applications across diverse fields such as three-dimensional display, optical information storage and processing, entertainment, and encryption.

Despite the broad application spectrum of CGH, contemporary techniques predominantly rely on projection devices such as spatial light modulators (SLMs) and digital micromirror devices (DMDs). These devices inherently face limitations in display capability, often resulting in a narrow field of view and multilevel diffraction in projected images.
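
The article does not name a particular hologram-computation algorithm; one standard choice for a phase-only SLM is the iterative Gerchberg-Saxton method, sketched below in Python under the simplifying assumption that the far field of the hologram plane is just a 2D Fourier transform.

```python
import numpy as np

def gerchberg_saxton(target_intensity, iterations=50, seed=0):
    """Iteratively compute a phase-only hologram whose far-field (Fourier)
    intensity approximates the target image (Gerchberg-Saxton)."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    # Start from a random phase pattern in the hologram (SLM) plane.
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        # Propagate hologram plane -> image plane (far field ~ 2D FFT).
        image_field = np.fft.fft2(np.exp(1j * phase))
        # Keep the computed phase, impose the desired amplitude.
        image_field = target_amp * np.exp(1j * np.angle(image_field))
        # Propagate back and keep only the phase (phase-only constraint).
        phase = np.angle(np.fft.ifft2(image_field))
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((128, 128))
target[48:80, 48:80] = 1.0
hologram_phase = gerchberg_saxton(target)

# Reconstruction check: how close is the far-field intensity to the target?
reconstruction = np.abs(np.fft.fft2(np.exp(1j * hologram_phase))) ** 2
print("correlation with target:",
      np.corrcoef(reconstruction.ravel(), target.ravel())[0, 1])
```

The resulting phase map is what would be written to the SLM pixels; in principle, a phase profile of the same kind can instead be encoded into a nanostructured surface, which is where metasurfaces come in.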

In recent developments, metasurfaces composed of an array of subwavelength nanostructures have demonstrated exceptional capabilities in modulating electromagnetic waves. By introducing abrupt changes to fundamental wave properties like amplitude and phase through nanostructuring at subwavelength scales, metasurfaces enable modulation effects that are challenging to achieve with traditional devices.