
While classical physics presents a deterministic universe in which cause must precede effect, quantum mechanics and relativity theory paint a more nuanced picture. Relativity already offers well-known examples such as wormholes, which are valid solutions of Einstein's field equations. Quantum mechanics, for its part, offers the non-classical state of entanglement (the "spooky action at a distance" that troubled Einstein), which demonstrates that quantum systems can maintain instantaneous correlations across space and, potentially, time.

Perhaps most intriguingly, the protocol suggests that quantum entanglement can be used to effectively send information about optimal measurement settings “back in time”—information that would normally only be available after an experiment is complete. This capability, while probabilistic in nature, could revolutionize quantum computing and measurement techniques. Recent advances in multipartite hybrid entanglement even suggest these effects might be achievable in real-world conditions, despite environmental noise and interference. The realization of such a retrocausal quantum computational network would, effectively, be the construction of a time machine, defined in general as a system in which some phenomenon characteristic only of chronology violation can reliably be observed.

This article explores the theoretical foundations, experimental proposals, significant improvements, and potential applications of the retrocausal teleportation protocol. From its origins in quantum mechanics and relativity theory to its implications for our understanding of causality and the nature of time itself, we examine how this cutting-edge research challenges our classical intuitions while opening new possibilities for quantum technology. As we delve into these concepts, we’ll see how the seemingly fantastic notion of time travel finds a subtle but profound expression in the quantum realm, potentially revolutionizing our approach to quantum computation and measurement while deepening our understanding of the universe’s temporal fabric.

The brain is sometimes called the most complex machine in the known universe. But the thoughts that it outputs putter along at a trifling 10 bits per second, the pace of a conversation.

By Rachel Nuwer

People tend to have the sense that their inner thoughts and feelings are much richer than what they are capable of expressing in real time. Elon Musk has spoken publicly about this “bandwidth problem,” as he described it to podcaster Joe Rogan. Musk is so bothered by this, in fact, that he has made it one of his long-term goals to create an interface that allows the human brain to communicate directly with a computer, unencumbered by the slow speed of speaking or writing.

Researchers at the University of Virginia have made significant advances in understanding how heat flows through thin metal films, a question critical to designing more efficient computer chips.

This study confirms that Matthiessen's rule holds at the nanoscale, informing heat management in the ultra-thin copper films used in next-generation devices and thereby improving performance and sustainability.
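Matthiessen's rule says that independent scattering mechanisms (phonons, impurities, film boundaries) contribute resistivities that simply add. A minimal sketch of the idea, using generic illustrative copper figures rather than results from the UVA study:

```python
# Matthiessen's rule: independent scattering mechanisms add as resistivities,
#   rho_total = rho_phonon + rho_impurity + rho_boundary
# All numbers below are illustrative, not values from the study.

def total_resistivity(rho_phonon, rho_impurity, rho_boundary):
    """Combine independent scattering contributions (ohm*m)."""
    return rho_phonon + rho_impurity + rho_boundary

rho_phonon = 1.7e-8    # bulk copper near room temperature, phonon-dominated
rho_impurity = 0.1e-8  # small defect/impurity contribution

def rho_boundary(thickness_nm, coeff=68e-8):
    """Surface-scattering term that grows as the film thins (~1/thickness).
    coeff (ohm*m*nm) is chosen so this term equals the bulk phonon term
    at ~40 nm, roughly copper's electron mean free path."""
    return coeff / thickness_nm

for t_nm in (100, 40, 10):
    rho = total_resistivity(rho_phonon, rho_impurity, rho_boundary(t_nm))
    print(f"{t_nm:4d} nm film: {rho:.2e} ohm*m")
```

The 1/thickness boundary term is a crude stand-in for more careful surface-scattering models; the point is only that, under Matthiessen's rule, each mechanism can be measured and reasoned about separately.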

Breakthrough in Chip Technology.

Caltech researchers have quantified the speed of human thought: a rate of 10 bits per second. However, our bodies' sensory systems gather data about our environments at a rate of a billion bits per second, 100 million times faster than our thought processes. The new study opens major avenues of exploration for neuroscientists, in particular: Why can we only think one thing at a time while our sensory systems process thousands of inputs at once?

The research was conducted in the laboratory of Markus Meister, the Anne P. and Benjamin F. Biaggini Professor of Biological Sciences, and it was led by graduate student Jieyu Zheng. A paper describing the study appears in the journal Neuron.

A bit is a basic unit of information in computing. A typical Wi-Fi connection, for example, can transmit 50 million bits per second. In the new study, Zheng applied techniques from the field of information theory to a vast amount of scientific literature on human behaviors, such as reading and writing, playing video games, and solving Rubik's Cubes, to calculate that humans think at a speed of 10 bits per second.
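Estimates of this kind work like a channel-capacity calculation: an information rate is (bits per symbol) times (symbols per second). A toy version for fast typing, using Shannon's classic estimate of roughly 1 bit of entropy per character of English text; the assumptions here are illustrative, not the paper's actual calculation:

```python
# Information rate = (entropy per symbol) x (symbols per second).
# Shannon estimated that English text carries roughly 1 bit per character
# once correlations between letters are accounted for.
BITS_PER_CHAR = 1.0   # Shannon's estimate for English (assumption)
WORDS_PER_MIN = 120   # a fast typist (assumption)
CHARS_PER_WORD = 5    # common convention, including the trailing space

chars_per_sec = WORDS_PER_MIN * CHARS_PER_WORD / 60
bits_per_sec = chars_per_sec * BITS_PER_CHAR
print(f"{bits_per_sec:.0f} bits/s")
```

With these round numbers the estimate lands at about 10 bits per second, the same order as the study's figure for behaviors like speedcubing and gaming.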

Leveraging the principles of quantum mechanics, quantum computers can perform calculations at lightning-fast speeds, enabling them to solve complex problems faster than conventional computers. In quantum technology applications such as quantum computing, light plays a central role in encoding and transmitting information.

NTU researchers have recently made breakthroughs in manipulating light that could potentially usher in the era of . Details of this research have been published in Nature Photonics, Physical Review Letters, and Nature Communications.

World renowned neurophysiologist and computational neuroscientist Christof Koch joins Brian Greene to discuss how decades of experimental and theoretical investigation have shaped his understanding of consciousness and the brain — and how recent psychedelic experiences have profoundly reshaped his perspective on life and death.

This program is part of the Big Ideas series, supported by the John Templeton Foundation.

Participant: Christof Koch.
Moderator: Brian Greene.

00:00 — Introduction.

Quantum computing and networking company IonQ has delivered a data center-ready trapped-ion quantum computer to the uptownBasel innovation campus in Arlesheim, Switzerland.

The IonQ Forte Enterprise quantum computer is the first of its kind to operate outside the United States and Switzerland’s first quantum computer designed for commercial use.

According to IonQ, Forte Enterprise is now online and servicing compute jobs, performing at a record algorithmic qubit count of #AQ 36. The algorithmic qubit count (#AQ) indicates how useful a quantum computer is at solving real problems for users by summarizing its ability to run benchmark quantum algorithms commonly used in applications.

A review of synthetic aperture radar image formation algorithms and implementations: a computational perspective.

✍️ Helena Cruz et al.


Designing synthetic aperture radar (SAR) image formation systems can be challenging given the many algorithms and devices to choose from. Common SAR image formation algorithms include backprojection, matched filter, polar format, Range-Doppler, and chirp scaling. Each algorithm has its own advantages and disadvantages in terms of efficiency and image quality; we therefore introduce the most common SAR image formation algorithms and compare them on these two criteria. Depending on the requirements of each individual system and implementation, there are many device options, for instance FPGAs, GPUs, CPUs, many-core CPUs, and microcontrollers. We present a review of the state of the art of SAR imaging system implementations.
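Of the algorithms listed, backprojection is the most direct: for each image pixel, sum every pulse's range-compressed echo at the round-trip delay to that pixel, with the carrier phase compensated so returns add coherently. A minimal sketch for a toy geometry; the array shapes, parameter names, and nearest-sample lookup are this sketch's own simplifying conventions, not any particular system's interface:

```python
import numpy as np

def backproject(echoes, platform_pos, pixel_pos, fs, fc, c=3e8):
    """Time-domain backprojection for SAR image formation (toy version).

    echoes       : (n_pulses, n_samples) complex range-compressed data
    platform_pos : (n_pulses, 3) antenna position at each pulse (m)
    pixel_pos    : (n_pixels, 3) scene pixel positions (m)
    fs           : range sample rate (Hz); fc : carrier frequency (Hz)
    """
    n_pulses, n_samples = echoes.shape
    image = np.zeros(len(pixel_pos), dtype=complex)
    for p in range(n_pulses):
        # Round-trip distance from this pulse's antenna position to every pixel.
        dist = 2.0 * np.linalg.norm(pixel_pos - platform_pos[p], axis=1)
        delay = dist / c
        # Nearest-sample lookup into the echo (real systems interpolate).
        idx = np.clip(np.round(delay * fs).astype(int), 0, n_samples - 1)
        # Undo the carrier phase so returns from each pixel add coherently.
        image += echoes[p, idx] * np.exp(2j * np.pi * fc * delay)
    return image
```

The per-pulse loop makes the trade-off in the review concrete: backprojection's cost scales with pulses times pixels, which is why the faster frequency-domain algorithms (Range-Doppler, chirp scaling) exist despite backprojection's generality.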

Researchers have developed a technique called “atomic spray painting” using molecular beam epitaxy to strain-tune potassium niobate, enhancing its ferroelectric properties.

This method allows precise manipulation of material properties, with potential applications in green technologies, quantum computing, and space exploration.

Material Strain Tuning