
Researchers from Austria and the U.S. have designed a new type of quantum computer that uses fermionic atoms to simulate complex physical systems. The processor uses programmable neutral atom arrays and is capable of simulating fermionic models in a hardware-efficient manner using fermionic gates.

The team led by Peter Zoller demonstrated how the new quantum processor can efficiently simulate fermionic models from quantum chemistry and particle physics. The paper is published in the journal Proceedings of the National Academy of Sciences.

Fermionic atoms are atoms that obey the Pauli exclusion principle, which means that no two of them can occupy the same quantum state simultaneously. This makes them ideal for simulating systems where fermionic statistics play a crucial role, such as molecules, superconductors and quark-gluon plasmas.
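The exclusion principle follows from the antisymmetry of fermionic wavefunctions: swapping two fermions flips the sign of the state, so placing two of them in the same mode leaves no state at all. A minimal numpy sketch of that bookkeeping (an illustration of the statistics only, not of the processor described above):

```python
import numpy as np

def two_fermion_state(phi_a, phi_b):
    """Antisymmetrized two-fermion wavefunction (a Slater determinant):
    psi(x1, x2) = [phi_a(x1) phi_b(x2) - phi_b(x1) phi_a(x2)] / sqrt(2)."""
    return (np.outer(phi_a, phi_b) - np.outer(phi_b, phi_a)) / np.sqrt(2)

# Two orthogonal single-particle modes on a 4-site lattice.
phi_0 = np.array([1.0, 0.0, 0.0, 0.0])
phi_1 = np.array([0.0, 1.0, 0.0, 0.0])

psi = two_fermion_state(phi_0, phi_1)
print(np.allclose(psi, -psi.T))  # True: exchanging the particles flips the sign

# Pauli exclusion: both fermions in the same mode gives the null state.
print(np.allclose(two_fermion_state(phi_0, phi_0), 0))  # True
```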

What happened before the Big Bang? In two of our previous films we examined cyclic cosmologies and time travel universe models. Specifically, the Gott and Li model (https://www.youtube.com/watch?v=79LciHWV4Qs) and Penrose’s Conformal Cyclic Cosmology (https://www.youtube.com/watch?v=FVDJJVoTx7s). Recently Beth Gould and Niayesh Afshordi of the Perimeter Institute for Theoretical Physics have fused these two models together to create a startling new vision of the universe. In this film they explain their new proposal, known as Periodic Time Cosmology.

0:00 Introduction
0:45 Niayesh’s story
1:15 Beth’s story
2:25 Relativity
3:26 Gott & Li model
6:23 Origins of the PTC model
8:17 PTC: periodic time cosmology
10:55 Penrose cyclic model
13:01 Sir Roger Penrose
14:19 CCC and PTC
15:45 Conformal rescaling and the CMB
17:28 Assumptions
18:41 Why a time loop?
20:11 Empirical test
23:96 Predictions
26:19 Inflation vs PTC
30:22 Gravitational waves
31:40 Cycles and the 2nd law
32:54 Paradoxes
34:08 Causality
35:17 Immortality in a cyclic universe
38:02 Eternal return
39:21 Quantum gravity
39:57 Conclusion

Elizabeth Gould has asked to make this clarification in the written text: “Despite the availability of infinite time in the periodic time model, this doesn’t lead to thermalization in a typical time-evolution scenario, and therefore doesn’t, strictly speaking, solve the problem related to thermalization in the power spectrum. The reason for this is that, unlike bounce models with a net expansion each cycle, our model has an effective contraction during the conformal phases. Periodic time, therefore, has a unique character in which it reuses the power spectrum from the previous cycles, which is confined to a given form due to the constraints of the system, rather than removing the old power spectrum and needing to produce a new one.”

Since the start of the quantum race, Microsoft has placed its bets on the elusive but potentially game-changing topological qubit. Now the company claims its Hail Mary has paid off, saying it could build a working processor in less than a decade.

Today’s leading quantum computing companies have predominantly focused on qubits—the quantum equivalent of bits—made out of superconducting electronics, trapped ions, or photons. These devices have achieved impressive milestones in recent years, but are hampered by errors that mean a quantum computer able to outperform classical ones still appears some way off.

Microsoft, on the other hand, has long championed topological quantum computing. Rather than encoding information in the states of individual particles, this approach encodes information in the overarching structure of the system. In theory, that should make the devices considerably more tolerant of background noise from the environment and therefore far less prone to errors.
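As a loose illustration of storing information in a global property rather than in any single particle, the toy sketch below hides a logical bit in the overall parity of a string of physical bits, so that no single bit reveals or defines it. This is only an analogy: Microsoft’s devices aim to encode qubits in Majorana zero modes, not in classical parities.

```python
import random

def encode(logical_bit, n=5):
    """Store a logical bit in the *global parity* of n physical bits.
    Each physical bit is random on its own; only the whole string
    carries the information. (Toy classical analogy only -- real
    topological qubits use Majorana modes, not classical parity.)"""
    bits = [random.randint(0, 1) for _ in range(n - 1)]
    bits.append((logical_bit - sum(bits)) % 2)
    return bits

def decode(bits):
    return sum(bits) % 2

word = encode(1)
print(word, "->", decode(word))  # the global parity recovers the logical 1
print(word[0])                   # any single bit alone tells you nothing
```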

The teams pitted IBM’s 127-qubit Eagle chip against supercomputers at Lawrence Berkeley National Lab and Purdue University for increasingly complex tasks. With easier calculations, Eagle matched the supercomputers’ results every time—suggesting that even with noise, the quantum computer could generate accurate responses. But where it shone was in its ability to tolerate scale, returning results that are—in theory—far more accurate than what’s possible today with state-of-the-art silicon computer chips.

At the heart is a post-processing technique that reduces noise. The method works like stepping back from a large painting: rather than scrutinizing each individual brush stroke, it focuses on small portions of the painting and captures the general “gist” of the artwork.
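The article doesn’t name the technique, but the IBM experiment it describes relied on zero-noise extrapolation: the same circuit is run while the hardware noise is deliberately amplified by known factors, and the measured expectation values are extrapolated back to the zero-noise limit. A minimal numpy sketch of the extrapolation step, with made-up illustrative numbers (IBM’s experiment used a more elaborate exponential extrapolation):

```python
import numpy as np

# Expectation values of one observable, measured while the hardware
# noise is deliberately amplified by known factors (1.0 = native noise).
noise_factors = np.array([1.0, 1.5, 2.0, 3.0])
measured      = np.array([0.71, 0.62, 0.54, 0.41])  # illustrative data only

# Fit a low-order polynomial and extrapolate back to zero noise.
coeffs = np.polyfit(noise_factors, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"estimated noiseless value: {zero_noise_estimate:.3f}")
```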

The study, published in Nature, isn’t chasing quantum advantage, the point at which quantum computers solve problems beyond the practical reach of conventional computers. Rather, it shows that today’s quantum computers, even when imperfect, may become part of scientific research—and perhaps our lives—sooner than expected. In other words, we’ve now entered the realm of quantum utility.

The prospect of a quantum internet, connecting quantum computers and capable of highly secure data transmission, is enticing, but making it poses a formidable challenge. Transporting quantum information requires working with individual photons rather than the light sources used in conventional fiber optic networks.

To produce and manipulate single photons, scientists are turning to quantum light emitters, also known as color centers. These atomic-scale defects in semiconductor materials can emit single photons of fixed wavelength or color and allow photons to interact with electron spin properties in controlled ways.

A team of researchers has recently demonstrated a more effective technique for creating quantum emitters using pulsed ion beams, deepening our understanding of how color centers are formed. The work was led by Department of Energy Lawrence Berkeley National Laboratory (Berkeley Lab) researchers Thomas Schenkel, Liang Tan, and Boubacar Kanté, who is also an associate professor of electrical engineering and computer sciences at the University of California, Berkeley.

The researchers had to look at light through a mechanical lens before similarities with properties usually seen in quantum states began to emerge.

In 1673, Christiaan Huygens wrote a book on pendulums and how they work. A mechanical theorem mentioned in the book was used 350 years later by researchers at the Stevens Institute of Technology to explain the complex behaviors of light, a university statement said.

Although light has been known to us for eons, humanity has found it difficult to explain its very nature. For centuries scientists were divided on whether to call it a wave or a particle, and just when there seemed to be some agreement on what light could actually be, quantum physics threw a new curveball by suggesting that it exists as both at once.

You may have heard of light as both particles and waves, but have you ever imagined the secret dance within? Researchers from the University of Ottawa and Sapienza University in Rome have just uncovered a groundbreaking technique that enables the real-time visualization of the wave function of entangled photons — the fundamental components of light.

Imagine choosing a random shoe from a pair. If it’s a “left” shoe, you immediately know the other shoe you’ve yet to unbox is meant to go on your right foot. This knowledge is instantaneous whether the shoe box is within arm’s reach or 4.3 light-years away on some planet in the Alpha Centauri system.

This analogy, though not perfect, captures the essence of quantum entanglement. At its core, quantum entanglement refers to the phenomenon where two or more particles become so deeply interconnected that their properties are correlated regardless of the spatial separation between them. This means that measuring the state of one particle instantly tells you about the state of the other, even if they are light-years apart.
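As a minimal illustration of such correlations, the numpy sketch below samples joint measurement outcomes from the Bell state (|00⟩ + |11⟩)/√2: each individual result is random, yet the two results always agree, just like the paired shoes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>) / sqrt(2) over the basis 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # Born rule: joint outcome probabilities

outcomes = rng.choice(4, size=10, p=probs)
for o in outcomes:
    a, b = divmod(o, 2)            # results for particle A and particle B
    print(a, b)                    # each pair matches: 0,0 or 1,1 at random
```

Unlike the shoes, though, the quantum correlations persist in every measurement basis, which is what makes entanglement stronger than any classical analogy.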

The ancient Greek philosopher Aristotle wrote in his manuscript on Physics 2,373 years ago: “If everything that exists has a place, place too will have a place, and so on ad infinitum.” Is the notion of space being continuous ‘without limit’ justified?

Before elementary particles were discovered, water was thought to be a continuous fluid. This is a good approximation on large scales but not on molecular scales where the interactions among elementary particles matter.

Similarly, spacetime has been thought to be a continuum since ancient times. While this notion appears consistent with all experimental data on large spatial or temporal scales, it may not be valid on tiny scales where quantum effects of gravity matter. An analogy can be made with the illusion of a movie which appears continuous when the frame rate is high enough and the spatial pixels are small enough for our brain to process the experience as seamless. Since our brain is made of elementary particles, the temporal and spatial resolution by which it senses reality is coarser by many orders of magnitude than any fundamental scale by which spacetime is discretized.

Photonic qudits are emerging as an essential resource for environment-resilient quantum key distribution, quantum simulation and quantum imaging and metrology1. The availability of unbounded photonic degrees of freedom, such as time-bins, temporal modes, orbital angular momentum (OAM) and radial number1, allows for encoding large amounts of information in fewer photons than would be required by qubit-based protocols (for example, when using only polarization). At the same time, the large dimensionality of these states, such as those emerging from the generation of photon pairs, poses an intriguing challenge for their measurement. The number of projective measurements necessary for a full-state tomography scales quadratically with the dimensionality of the Hilbert space under consideration2. This issue can be tackled with adaptive tomographic approaches3,4,5 or compressive techniques6,7, which are, however, constrained by a priori hypotheses on the quantum state under study. Moreover, quantum state tomography via projective measurement becomes challenging when the dimension of the quantum state is not a power of a prime number8. Here we tackle the tomographic challenge, in the specific context of spatially correlated biphoton states, with an interferometric approach inspired by digital holography9,10,11, familiar from classical optics. We show that the coincidence imaging of the superposition of two biphoton states, one unknown and one used as a reference state, allows retrieving the spatial distribution of phase and amplitude of the unknown biphoton wavefunction. Coincidence imaging can be achieved with modern electron-multiplying charge-coupled device cameras12,13, single-photon avalanche diode arrays14,15,16 or time-stamping cameras17,18. These technologies are commonly exploited in quantum imaging, such as ghost imaging experiments19 or quantum super-resolution20,21, as well as for fundamental applications, including characterizing two-photon correlations13,22, imaging of high-dimensional Hong–Ou–Mandel interference23,24,25, and visualization of the violation of Bell inequalities26. Holography techniques have recently been proposed in the context of quantum imaging27,28,29: phase-shifting digital holography has been demonstrated in a coincidence imaging regime using polarization entanglement27, and induced coherence, that is, the reconstruction of phase objects through digital holography of undetected photons, has been exploited28.
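The classical digital-holography idea the authors build on can be sketched compactly: interfere an unknown field with a known reference at several controlled phase shifts, and the standard phase-shifting formula recovers the unknown field’s amplitude and phase from the recorded intensities. A one-dimensional numpy illustration with classical fields (not the biphoton case treated in the paper):

```python
import numpy as np

x = np.linspace(-1, 1, 512)
E = np.exp(-x**2 / 0.1) * np.exp(1j * 4 * np.pi * x)  # unknown field
R = np.ones_like(x)                                    # known plane-wave reference

# Record intensities with the reference phase shifted by 0, pi/2, pi, 3pi/2.
I = [np.abs(E + R * np.exp(1j * t))**2 for t in (0, np.pi/2, np.pi, 3*np.pi/2)]

# Four-step phase-shifting formula: recovers E * conj(R) from the frames.
C = ((I[0] - I[2]) + 1j * (I[1] - I[3])) / 4
E_rec = C * R / np.abs(R)**2                           # divide out the reference

print(np.allclose(E_rec, E))                           # True: amplitude and phase recovered
```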

In this work, we focus on the specific problem of reconstructing the quantum state (in the transverse coordinate basis) of two photons emerging from degenerate spontaneous parametric down-conversion (SPDC). These states are characterized by strong correlations in the transverse position (considered on the plane where the two-photon generation happens), which can be observed in other kinds of photon sources such as cold atoms30. In these sources, the two-photon wavefunction strongly depends on the shape of the pump laser used to induce the down-conversion process31. The most commonly used approach in the literature to reconstruct the biphoton state emitted by a nonlinear crystal is based on projective techniques32,33,34. This method has drawbacks concerning measurement times (as it needs successive measurements on non-orthogonal bases) and signal loss due to diffraction. We propose an imaging-based procedure capable of overcoming both of the issues mentioned above while giving the full-state reconstruction of the unknown state. The core idea lies in assuming the SPDC state induced by a plane wave as known, and in superimposing this state with the unknown biphoton state. Unless the superposition is achieved directly on the crystal plane, a full analysis of the four-dimensional distribution of coincidences is necessary to retrieve the interference between the two wavefunctions. This information can be visualized by observing coincidence images, defined as marginals of the coincidence distribution obtained by integrating over the coordinates of one of the two photons. In fact, obtaining coincidence images after post-selecting specific spatial correlations allows retrieval of the phase information, even in cases in which the state does not exhibit sharp spatial correlations. We demonstrate this technique for pump beams in different spatial modes, including Laguerre–Gaussian (LG) and Hermite–Gaussian (HG) modes. We investigate several physical effects from the reconstructed states, such as OAM conservation, the generation of high-dimensional Bell states, parity conservation and radial correlations. Remarkably, we show how, from a simple measurement, one can retrieve information about two-photon states in arbitrary spatial mode bases without the efficiency and alignment issues that affect previously implemented projective characterization techniques. Depending on the source brightness and the required number of detection events, the measurement time can be of the order of tens of seconds, whereas the previously implemented projective techniques required several hours and were limited to the exploration of a small subspace of spatial modes. As a final example, we give a proof-of-principle demonstration of the use of this technique for quantum imaging applications.
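A hedged sketch of the coincidence-image idea described above: starting from a toy position-correlated biphoton probability distribution, the marginal integrates out one photon’s coordinate, while post-selecting near-coincident positions keeps the information carried by correlated detection events. One-dimensional transverse coordinates and an invented correlation width, not the paper’s actual data pipeline:

```python
import numpy as np

x = np.linspace(-2, 2, 256)
X1, X2 = np.meshgrid(x, x, indexing="ij")

# Toy position-correlated biphoton amplitude: photons born at the same
# spot (narrow in x1 - x2), with the pump profile setting the envelope.
pump = np.exp(-((X1 + X2) / 2) ** 2)              # pump shape at the crystal
psi = pump * np.exp(-((X1 - X2) ** 2) / 0.01)     # tight spatial correlation
coinc = np.abs(psi) ** 2                          # coincidence distribution

marginal = coinc.sum(axis=1)                      # integrate over photon 2
post_selected = np.diagonal(coinc)                # keep only x1 == x2 events

print(marginal.shape, post_selected.shape)        # both are 1D images over x
```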