Strong-field quantum optics is a rapidly emerging research field that merges elements of nonlinear photoemission rooted in strong-field physics with the well-established realm of quantum optics. While the photon-number distributions of both classical and non-classical light sources have been widely characterized, the impact of these distributions on photoemission processes remains poorly understood.
Experiments generate quantum entanglement over optical fibres across three real cities, marking progress towards networks that could have revolutionary applications.
Quantum theorists at the University of British Columbia have proposed a new approach to studying stacking ferroelectricity—spontaneous electric polarization—in layered, two-dimensional lab-grown materials.
Perturbative expansion is a valuable mathematical technique widely used to break down descriptions of complex quantum systems into simpler, more manageable parts. Perhaps most importantly, it enabled the development of quantum field theory (QFT): a theoretical framework that combines principles from classical, quantum, and relativistic physics, and serves as the foundation of the Standard Model of particle physics.
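The core idea can be stated in one standard formula (Rayleigh–Schrödinger perturbation theory, shown here as a generic illustration rather than anything specific to the work described): split the Hamiltonian into an exactly solvable part and a small correction, $H = H_0 + \lambda V$, and expand the energies in powers of $\lambda$:

```latex
E_n(\lambda) = E_n^{(0)}
  + \lambda \,\langle \psi_n^{(0)} | V | \psi_n^{(0)} \rangle
  + \lambda^2 \sum_{m \neq n}
      \frac{\bigl|\langle \psi_m^{(0)} | V | \psi_n^{(0)} \rangle\bigr|^2}
           {E_n^{(0)} - E_m^{(0)}}
  + \mathcal{O}(\lambda^3)
```

Each successive term is a smaller, more manageable correction to the solvable $H_0$ problem, which is the sense in which the expansion "breaks down" a complex system into simpler parts.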
Researchers at the University of Bristol have made an important breakthrough in scaling quantum technology by integrating the world’s tiniest quantum light detector onto a silicon chip. The paper, “A Bi-CMOS electronic photonic integrated circuit quantum light detector,” was published in Science Advances.
For quantum computers to go from research curiosities to practically useful devices, researchers need to get their errors under control. New research from Microsoft and Quantinuum has now taken a major step in that direction.
Today’s quantum computers are stuck firmly in the “noisy intermediate-scale quantum” (NISQ) era. While companies have had some success stringing large numbers of qubits together, those qubits are highly susceptible to noise, which can quickly degrade their quantum states. This makes it impossible to carry out computations with enough steps to be practically useful.
While some have claimed that these noisy devices could still be put to practical use, the consensus is that quantum error correction will be vital to realizing the technology’s full potential. But error correction is harder in quantum computers than in classical ones, because directly reading the quantum state of a qubit causes it to collapse; errors must instead be detected indirectly, without measuring the encoded information itself.
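The indirect-detection idea can be illustrated with the simplest quantum code, the three-qubit bit-flip repetition code, here reduced to a toy classical simulation (the function names and error scenario are illustrative assumptions, not from the article): parity checks between neighboring qubits reveal where a flip occurred without revealing the encoded logical value.

```python
def encode(bit):
    # Repetition code: one logical bit -> three physical bits
    return [bit, bit, bit]

def syndrome(qubits):
    # Parity checks (Z1Z2 and Z2Z3 in the quantum version): they locate
    # an error but never expose the encoded logical value itself
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

def correct(qubits):
    # Map each syndrome pattern to the qubit that needs flipping
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(qubits))
    if flip is not None:
        qubits[flip] ^= 1
    return qubits

def decode(qubits):
    # Majority vote recovers the logical bit after correction
    return int(sum(qubits) >= 2)

word = encode(1)
word[2] ^= 1                   # single bit-flip error on the third qubit
print(syndrome(word))          # (0, 1) -> error localized to qubit 2
print(decode(correct(word)))   # 1 -> logical bit survives
```

In a real quantum device the parities are extracted by entangling the data qubits with ancilla qubits and measuring only the ancillas, so the encoded superposition never collapses; the classical sketch above captures only the bookkeeping.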
Genomics is revolutionizing medicine and science, but current approaches still struggle to capture the breadth of human genetic diversity. Pangenomes that incorporate many people’s DNA could be the answer, and a new project thinks quantum computers will be a key enabler.
When the Human Genome Project published its first reference genome in 2001, it was based on DNA from just a handful of humans. While less than one percent of our DNA varies from person to person, this can still leave important gaps and limit what we can learn from genomic analyses.
That’s why the concept of a pangenome has become increasingly popular. This refers to a collection of genomic sequences from many different people that have been merged to cover a much greater range of human genetic possibilities.
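Why a pangenome captures more than a single reference can be shown with a toy sketch (the sequences are made up and the k-mer representation is one simplification among many real pangenome data structures): pooling the subsequences seen across several individuals covers variation a single reference misses.

```python
def kmers(seq, k=3):
    # All length-k substrings of a sequence
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Hypothetical short sequences from three individuals (illustrative only)
individuals = ["ACGTAC", "ACGGAC", "ACTTAC"]

reference = kmers(individuals[0])                       # single-reference view
pangenome = set().union(*(kmers(s) for s in individuals))  # merged view

print(len(reference), len(pangenome))  # 4 10
```

The merged set holds k-mers from variants that the first individual's genome alone would never represent, which is precisely the gap a pangenome is meant to close.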
The tension between quantum mechanics and relativity has long been a central split in modern physics, and developing a theory of quantum gravity that reconciles the two remains one of the discipline's great outstanding challenges. But as we collect more data, it sheds more light on potential solutions, even when some of that data yields negative results.
That happened recently when researchers at the University of Texas at Arlington reviewed data collected at IceCube, a neutrino detector embedded in the Antarctic ice sheet. They looked for signs that gravity could vary by even a minuscule amount due to quantum mechanical fluctuations. And, to put it bluntly, they didn’t find any evidence of that happening.
To check for these minuscule fluctuations, they analyzed more than 300,000 neutrinos that IceCube had captured. IceCube is an impressive engineering feat, with thousands of sensors embedded throughout a cubic kilometer of ice. When a detector is triggered by one of the trillions of neutrinos streaming through the ice every second, the researchers can examine whether the particle was affected by any perturbations in the local gravity of that area.
A British consortium with funding from the UK government has successfully tested what it calls “un-jammable” quantum navigation tech in flight.
Geopolitical tensions and warfare have introduced GPS jamming as a means of messing with enemy communication and navigation. This can cause disturbances for both military and civilian transportation and location services.
The quantum-based system provides positioning, navigation, and timing (PNT), the capability normally supplied by GPS. It was developed by quantum technology firm Infleqtion’s UK subsidiary in collaboration with aerospace company BAE Systems and defence tech contractor QinetiQ, among others.
A quantum internet would be essentially unhackable, because any attempt to eavesdrop on entangled particles disturbs their state and exposes the intrusion. In the future, sensitive information—financial or national security data, for instance, as opposed to memes and cat pictures—would travel through such a network in parallel to a more traditional internet.
Of course, building and scaling systems for quantum communications is no easy task. Scientists have been steadily chipping away at the problem for years. A Harvard team recently took another noteworthy step in the right direction. In a paper published this week in Nature, the team says they’ve sent entangled photons between two quantum memory nodes 22 miles (35 kilometers) apart on existing fiber optic infrastructure under the busy streets of Boston.
“Showing that quantum network nodes can be entangled in the real-world environment of a very busy urban area is an important step toward practical networking between quantum computers,” Mikhail Lukin, who led the project and is a physics professor at Harvard, said in a press release.