
DARPA’s Laser Leap Proves “Energy can fly like data” as 800-Watt Beam Sets Distance Record and Opens Door to UAV and Space Uses

IN A NUTSHELL
🚀 A DARPA-led team set a new record by transmitting 800 watts over 5.3 miles using optical power beaming.
⚡ Power beaming could revolutionize energy delivery to remote locations and reduce logistical challenges.
🔬 The breakthrough involved a customized receiver and a high-energy optical laser to maximize efficiency.
🌍 Future phases aim …

Passive Demultiplexed Two-photon State Generation from a Quantum Dot

High-purity multi-photon states are essential for photonic quantum computing. Among existing platforms, semiconductor quantum dots offer a promising route to scalable and deterministic multi-photon state generation. However, to fully realize their potential we require a suitable optical excitation method. Current approaches to multi-photon generation rely on active polarization-switching elements (e.g., electro-optic modulators, EOMs) to spatio-temporally demultiplex single photons. Yet the achievable multi-photon rate is fundamentally limited by the switching speed of the EOM. Here, we introduce a fully passive demultiplexing technique that leverages a stimulated two-photon excitation process to achieve switching rates limited only by the quantum dot lifetime. We demonstrate this method by generating two-photon states from a single quantum dot without requiring any active switching elements. Our approach significantly reduces the cost of demultiplexing while shifting it to the excitation stage, enabling loss-free demultiplexing and effectively doubling the achievable multi-photon generation rate when combined with existing active demultiplexing techniques.
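As a rough illustration of the rate argument, the Python sketch below uses assumed figures (a ~1 ns quantum dot lifetime and a ~1 MHz effective EOM switching rate, neither taken from the paper) to compare the lifetime-limited switching rate of a passive scheme against an EOM-limited active scheme, and applies the 2x gain the abstract quotes for combining the two.

# Illustrative comparison of demultiplexing rate limits (all numbers assumed).
# Active demultiplexing: the EOM switching rate caps how fast photons can be
# routed into separate spatial modes. Passive demultiplexing: the cap is set
# by the quantum dot's radiative lifetime instead.

qd_lifetime_s = 1e-9          # assumed ~1 ns radiative lifetime
eom_switch_rate_hz = 1e6      # assumed ~1 MHz effective EOM switching rate

passive_limit_hz = 1 / qd_lifetime_s   # ~1 GHz, lifetime-limited
active_limit_hz = eom_switch_rate_hz

print(f"active (EOM-limited) demux rate:  {active_limit_hz:.1e} Hz")
print(f"passive (lifetime-limited) rate:  {passive_limit_hz:.1e} Hz")

# Per the abstract, combining the passive technique with an existing active
# scheme effectively doubles the achievable multi-photon generation rate.
combined_gain = 2
print(f"combined scheme rate gain: {combined_gain}x")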

I. Introduction

Photonic quantum computing offers a unique advantage over other quantum platforms due to the long coherence time of photons, enabling robust quantum communication, quantum information processing, and quantum simulations. A critical requirement for these applications is the reliable generation of high-purity multi-photon states, i.e., n indistinguishable photons in n spatial modes, which serve as fundamental building blocks for quantum algorithms, error correction, quantum simulations, and advanced photonic networks. Multi-photon states are also essential for probing quantum optical phenomena such as multi-photon interference. The most widely used sources of multi-photon quantum states rely on parametric down-conversion or four-wave mixing in nonlinear crystals. However, their scalability is limited by the probabilistic nature of photon emission and the resource overhead required for computing and boson-sampling applications.
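To make the scalability argument concrete, here is a minimal sketch with assumed, representative numbers only: the n-photon generation probability of a probabilistic pair source, which must be kept at low pair probability to suppress multi-pair noise, versus a deterministic emitter with a fixed end-to-end efficiency.

# Minimal sketch (assumed figures) of why probabilistic pair sources scale
# poorly against deterministic emitters for n-photon state generation.
# SPDC/FWM: pair probability p per pulse is kept low to suppress multi-pair
# emission, so n simultaneous photons occur with probability ~p**n.
# Deterministic quantum dot: each trigger yields a photon with end-to-end
# efficiency eta, so n photons occur with probability ~eta**n.

p_pair = 0.05    # assumed SPDC pair probability per pulse
eta = 0.5        # assumed end-to-end efficiency of a quantum dot source

for n in (2, 4, 8):
    print(f"n={n}: SPDC ~ {p_pair**n:.2e}   quantum dot ~ {eta**n:.2e}")
# The gap widens exponentially with n, which is the scalability point above.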

Constructor Theory Explains Origin of Time

Go to https://ground.news/sabine to get 40% off the Vantage plan and see through sensationalized reporting. Stay fully informed on events around the world with Ground News.

Most physicists believe that time fundamentally doesn’t exist, because the concept of time is incompatible with a model of physics where quantum mechanics and general relativity coexist. David Deutsch and Chiara Marletto have now shown that “constructor theory” can be used to construct time. Let’s take a look.

Paper: https://arxiv.org/pdf/2505.

🤓 Check out my new quiz app ➜ http://quizwithit.com/
📚 Buy my book ➜ https://amzn.to/3HSAWJW
💌 Support me on Donorbox ➜ https://donorbox.org/swtg.
📝 Transcripts and written news on Substack ➜ https://sciencewtg.substack.com/
👉 Transcript with links to references on Patreon ➜ / sabine.
📩 Free weekly science newsletter ➜ https://sabinehossenfelder.com/newsle…
👂 Audio only podcast ➜ https://open.spotify.com/show/0MkNfXl
🔗 Join this channel to get access to perks ➜ / @sabinehossenfelder

#science #sciencenews #physics #time

Practical blueprint for low-depth photonic quantum computing with quantum dots

Abstract:

Fusion-based quantum computing is an attractive model for fault-tolerant computation based on photonics, requiring only finite-sized entangled resource states followed by linear-optics operations and photon measurements. Large-scale implementations have so far been limited by access to only probabilistic photon sources, vulnerability to photon loss, and the need for massive multiplexing. Deterministic photon sources offer an alternative and resource-efficient route. By synergistically integrating deterministic photon emission, adaptive repeat-until-success fusions, and an optimised architectural design, we propose a complete blueprint for a photonic quantum computer using quantum dots and linear optics. It features time-bin qubit encoding, reconfigurable entangled-photon sources, and a fusion-based architecture with low optical connectivity, significantly reducing the required optical depth per photon and resource overheads. We present in detail the hardware required for resource-state generation and fusion networking, experimental pulse sequences, and exact resource estimates for preparing a logical qubit. We estimate that one logical clock cycle of error correction can be executed within microseconds, with a duration that scales linearly with the code distance. We also simulate error thresholds for fault-tolerance by accounting for a full catalogue of intrinsic error sources found in real-world quantum dot devices. Our work establishes a practical blueprint for a low-optical-depth, emitter-based fault-tolerant photonic quantum computer.
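As a hedged back-of-envelope companion to the abstract's two quantitative claims, the sketch below uses assumed parameters (a per-attempt fusion success probability and a per-layer time, neither taken from the paper) to show the expected attempt count of a repeat-until-success fusion and a logical cycle time that grows linearly with code distance.

# Back-of-envelope sketch (all parameters assumed, not from the paper) of the
# two scalings quoted in the abstract: repeat-until-success fusions succeed
# after a geometric number of attempts, and one logical clock cycle scales
# linearly with the code distance d.

p_fusion = 0.5                     # assumed per-attempt fusion success probability
expected_attempts = 1 / p_fusion   # mean of a geometric distribution

t_layer_us = 0.1                   # assumed microseconds per syndrome-extraction layer
for d in (7, 15, 31):
    t_cycle_us = d * t_layer_us    # linear-in-distance logical cycle time
    print(f"d={d}: ~{t_cycle_us:.1f} us per logical cycle, "
          f"~{expected_attempts:.0f} attempts per fusion")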



Aging Can Spread Through Your Body Via a Single Protein, Study Finds

Take note of the name: ReHMGB1. A new study pinpoints this protein as a driver that spreads the wear and tear of time as it quietly travels through the bloodstream, a finding that adds significantly to our understanding of aging.

Short for reduced high mobility group box 1, ReHMGB1 triggers senescence in cells, permanently shutting down their ability to divide. It doesn’t just do this locally; it can send damaging signals throughout the body, particularly in response to injury or disease.

“An important question in aging research is why senescent cells increase with age,” write the study authors, led by researchers from the Korea University College of Medicine.

Missing messenger RNA fragments could be key to new immunotherapy for hard-to-treat brain tumors

A new study, led by researchers at Children’s Hospital of Philadelphia (CHOP), identified tiny pieces of messenger RNA that are missing in pediatric high-grade glioma tumors but not in normal brain tissues. Preclinical research indicates that these missing RNA fragments can make difficult-to-treat tumors more responsive to immunotherapy. The findings were recently published in the journal Cell Reports.

One of the biggest challenges facing the field is the need to find safe and effective therapies for the most aggressive types of brain tumors. Adoptive immunotherapies with CAR-T cells are promising; however, they often also target healthy cells, which share most surface proteins with tumor cells. While this might be tolerable in patients with certain types of blood cancer, in the brain, wiping out healthy neurons is unacceptable. This means that deep knowledge of gene expression patterns exclusive to tumor cells is critical.

A potential means of discovering new therapeutic targets for brain tumors may lie in alternative splicing, a process whereby a single gene produces multiple proteins by rearranging exons, the building blocks of messenger RNA, in different combinations. Researchers suspected that splicing in glioma cells may differ from splicing in normal brain cells, which could help devise new therapeutic interventions.
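To illustrate what alternative splicing means combinatorially, here is a toy Python sketch. The exon names and the rule that the terminal exons are always retained are hypothetical; it simply shows how one gene can yield several distinct mRNA isoforms, one of which might turn out to be tumor-exclusive.

# Toy illustration of alternative splicing: one gene's exons can be joined in
# different combinations, each yielding a distinct mRNA isoform (and protein).
# Exon names and inclusion rules here are hypothetical.
from itertools import combinations

# Keep exon order fixed; vary which internal exons are included (E1 and E4
# act as fixed terminal exons in this toy model).
internal = ["E2", "E3"]
isoforms = []
for r in range(len(internal) + 1):
    for included in combinations(internal, r):
        isoforms.append("-".join(["E1", *included, "E4"]))

print(isoforms)   # ['E1-E4', 'E1-E2-E4', 'E1-E3-E4', 'E1-E2-E3-E4']
# A splice variant present in tumor cells but absent from normal brain tissue
# would be a candidate immunotherapy target, which is the logic of the study above.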
