
The Universe’s Topology May Not Be Simple

Most models for the overall shape and geometry of the Universe—including some exotic ones—are compatible with the latest cosmic observations.

Is the Universe simply connected like a sphere, or does it contain holes like a doughnut or an even more complicated structure? The topology of the Universe—the way its space is globally connected—remains far from settled, according to a collaboration of cosmologists. Despite past claims that observations of the cosmic microwave background (CMB) rule out various topologies, the researchers contend that many of these shapes, including some strange ones, have not been contradicted by the evidence [1].
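To see what a nontrivial topology means in practice, consider the simplest candidate, a flat 3-torus: a cube whose opposite faces are identified, so that traveling straight out one side brings you back in the other. The sketch below (an illustrative toy, not part of the study) shows how distances wrap around in such a space:

```python
import numpy as np

def torus_distance(a, b, L):
    """Shortest separation between two points in a flat 3-torus whose
    fundamental cell is a cube of side L. Each coordinate difference
    wraps around, so the path 'through' an identified face may be
    shorter than the direct one."""
    d = np.abs(np.asarray(a, float) - np.asarray(b, float)) % L
    d = np.minimum(d, L - d)          # take the wrapped route if shorter
    return float(np.linalg.norm(d))

# In ordinary, simply connected flat space these points are 9 units
# apart; on a torus with side L = 10 the wrapped path is only 1 unit.
print(torus_distance([0, 0, 0], [9, 0, 0], 10.0))
```

In a Universe with this topology, light from a distant galaxy could reach us along more than one such path, which is exactly the kind of repeated-pattern signature cosmologists search for in the CMB.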

The overall geometry of the Universe is thought to have been determined by quantum processes that unfolded in the initial moment of the big bang. Identifying the topology of the Universe would provide researchers with an important clue as to the nature of those quantum processes and could help them sift through the many proposed theories of the early Universe.

Compact Quantum Light Processing: Time-Bending Optical Computing Breakthrough

An international collaboration of researchers, led by Philip Walther at the University of Vienna, has achieved a significant breakthrough in quantum technology: the successful demonstration of quantum interference among several single photons using a novel resource-efficient platform. The work, published in the journal Science Advances, represents a notable advance in optical quantum computing and paves the way for more scalable quantum technologies.

Interference among photons, a fundamental phenomenon in quantum optics, serves as a cornerstone of optical quantum computing. It involves harnessing the properties of light, such as its wave-particle duality, to induce interference patterns, enabling the encoding and processing of quantum information.

In traditional multi-photon experiments, spatial encoding is commonly employed, wherein photons are manipulated in different spatial paths to induce interference. These experiments require intricate setups with numerous components, making them resource-intensive and challenging to scale.

New method of measuring qubits promises ease of scalability in a microscopic package

Scaling up qubit counts in quantum computers is at the core of achieving quantum supremacy.


Among the troublesome hurdles of this scaling-up race is refining how qubits are measured. Devices called parametric amplifiers are traditionally used to do these measurements. But as the name suggests, the device amplifies weak signals picked up from the qubits to conduct the readout, which causes unwanted noise and can lead to decoherence of the qubits unless they are protected by additional large components. More importantly, the bulky amplification chain becomes technically challenging to accommodate as qubit counts grow inside size-limited refrigerators.

Cue the Aalto University research group Quantum Computing and Devices (QCD). They have a hefty track record of showing how thermal bolometers can be used as ultrasensitive detectors, and they just demonstrated in an April 10 Nature Electronics paper that bolometer measurements can be accurate enough for single-shot qubit readout.

Scientists tune the entanglement structure in an array of qubits

Entanglement is a form of correlation between quantum objects, such as particles at the atomic scale. The laws of classical physics cannot explain this uniquely quantum phenomenon, yet it is one of the properties that explain the macroscopic behavior of quantum systems.

Because entanglement is central to the way quantum systems work, understanding it better could give scientists a deeper sense of how information is stored and processed efficiently in such systems.

Qubits, or quantum bits, are the building blocks of a quantum computer. However, it is extremely difficult to make specific entangled states in many-qubit systems, let alone investigate them. There are also a variety of entangled states, and telling them apart can be challenging.
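The simplest illustration of "telling entangled states apart" is the two-qubit case: the four maximally entangled Bell states all look identical to single-qubit measurements, but joint correlation measurements distinguish them. A small sketch (generic textbook example, not the experiment's protocol):

```python
import numpy as np

# The four Bell states in the |00>, |01>, |10>, |11> basis.
s = 1 / np.sqrt(2)
bell = {
    "phi+": np.array([s, 0, 0,  s]),
    "phi-": np.array([s, 0, 0, -s]),
    "psi+": np.array([0, s,  s, 0]),
    "psi-": np.array([0, s, -s, 0]),
}

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def expval(state, op_a, op_b):
    """<state| A ⊗ B |state>: a joint two-qubit correlation (real states)."""
    return float(state @ np.kron(op_a, op_b) @ state)

# Each Bell state gives a different pair (<Z⊗Z>, <X⊗X>), so two joint
# correlation measurements are enough to tell all four apart.
for name, psi in bell.items():
    print(name, round(expval(psi, Z, Z)), round(expval(psi, X, X)))
```

Every single-qubit expectation value is zero for all four states; only the joint correlations carry the distinguishing information, which is why characterizing entanglement structure in many-qubit arrays is so demanding.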

The Big Quantum Chill: NIST Scientists Modify Common Lab Refrigerator to Cool Faster With Less Energy

From stabilizing qubits (the basic unit of information in a quantum computer) to maintaining the superconducting properties of materials and keeping NASA’s James Webb Space Telescope cool enough to observe the heavens, ultracold refrigeration is essential to the operation of many devices and sensors. For decades, the pulse tube refrigerator (PTR) has been the workhorse device for achieving temperatures as cold as the vacuum of outer space.

These refrigerators cyclically compress (heat) and expand (cool) high-pressure helium gas to achieve the “Big Chill,” broadly analogous to the way a household refrigerator uses the transformation of Freon from liquid to vapor to remove heat. For more than 40 years, the PTR has proven its reliability, but it is also power-hungry, consuming more electricity than any other component of an ultralow-temperature experiment.
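The compress-heats, expand-cools relationship follows from basic gas thermodynamics: for an ideal monatomic gas such as helium (γ = 5/3) undergoing a reversible adiabatic pressure change, T₂ = T₁ (P₂/P₁)^((γ−1)/γ). A quick sketch with illustrative numbers (a real pulse tube cycle is far more complicated than a single adiabatic stroke):

```python
# Adiabatic temperature change of an ideal monatomic gas (helium).
GAMMA = 5.0 / 3.0

def adiabatic_temperature(t1_kelvin, p1, p2):
    """Final temperature after a reversible adiabatic change from
    pressure p1 to p2 (same units), starting at t1_kelvin."""
    return t1_kelvin * (p2 / p1) ** ((GAMMA - 1.0) / GAMMA)

print(round(adiabatic_temperature(300.0, 1.0, 2.0), 1))  # doubling pressure heats the gas
print(round(adiabatic_temperature(300.0, 2.0, 1.0), 1))  # halving pressure cools it
```

Repeating this cycle while rejecting the compression heat to the environment is what lets a PTR pump heat out of its cold stage.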

Demonstration of heralded three-photon entanglement on a photonic chip

Photonic quantum computers are computational tools that leverage quantum physics and utilize particles of light (i.e., photons) as units of information processing. These computers could eventually outperform conventional quantum computers in terms of speed, while also transmitting information across longer distances.

Despite their promise, photonic quantum computers have not yet reached the desired results, partly due to the inherently weak interactions between individual photons. In a paper published in Physical Review Letters, researchers at the University of Science and Technology of China demonstrated heralded three-photon entanglement, a building block of the larger cluster states that could facilitate quantum computation in a photonic system.
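The canonical three-photon entangled state is the GHZ state (|000⟩ + |111⟩)/√2, whose correlations are genuinely three-party: discard any one photon and the remaining pair is left only classically correlated. A minimal numerical sketch of this property (an idealized qubit model, not the chip demonstration itself):

```python
import numpy as np

# Three-qubit GHZ state (|000> + |111>)/√2 in the computational basis.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

rho = np.outer(ghz, ghz)  # density matrix of the pure GHZ state

# Trace out the third qubit. Index the 8x8 matrix as (a, b, c, a', b', c')
# and sum over c = c'; the quantum cross terms |000><111| drop out.
rho_ab = rho.reshape(2, 2, 2, 2, 2, 2).trace(axis1=2, axis2=5).reshape(4, 4)

# What remains is the diagonal mixture diag(1/2, 0, 0, 1/2): a 50/50
# classical mixture of |00> and |11>, with no coherence left.
print(np.round(rho_ab, 3))
```

That the entanglement survives only while all three photons are kept is what makes heralded generation (knowing the state is there before using it) so valuable for building larger cluster states.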

“Photonic quantum computing holds promise due to its operational advantages and minimal decoherence,” Hui Wang, co-author of the paper, told Phys.org.
