
We probably think we know gravity pretty well. After all, we have more conscious experience with this fundamental force than with any of the others (electromagnetism and the weak and strong nuclear forces). But even though physicists have been studying gravity for hundreds of years, it remains a source of mystery.

In our video Why Is Gravity Different?, we explore why this force is so perplexing and why it remains difficult to understand how Einstein’s general theory of relativity (which describes gravity) fits together with quantum mechanics.

Gravity is extraordinarily weak and nearly impossible to study directly at the quantum level. We cannot scrutinize it using particle accelerators like we can with the other forces, so we need other ways to get at quantum gravity.

Researchers at MIT and the University of Waterloo have developed a high-power, portable version of a device called a quantum cascade laser, which can generate terahertz radiation outside of a laboratory setting. The laser could potentially be used in applications such as pinpointing skin cancer and detecting hidden explosives.

Until now, generating terahertz radiation powerful enough to perform real-time imaging and fast spectral measurements required temperatures of 200 Kelvin (−100 degrees Fahrenheit) or lower. These temperatures could only be achieved with bulky equipment that limited the technology’s use to a laboratory setting. In a paper published in Nature Photonics, MIT Distinguished Professor of Electrical Engineering and Computer Science Qing Hu and his colleagues report that their terahertz quantum cascade laser can function at temperatures of up to 250 K (−10 degrees Fahrenheit), meaning that only a compact portable cooler is required.
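As a quick sanity check on those temperature figures, the Kelvin-to-Fahrenheit conversion takes only a couple of lines (a minimal illustration, not from the paper):

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """Convert a temperature from Kelvin to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

# The article's figures, rounded to the nearest ten degrees:
print(round(kelvin_to_fahrenheit(200)))  # roughly -100 °F
print(round(kelvin_to_fahrenheit(250)))  # roughly -10 °F
```

Both of the article’s rounded Fahrenheit values check out against the exact conversion.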

Terahertz quantum cascade lasers, tiny chip-embedded semiconductor laser devices, were first invented in 2002, but adapting them to operate far above 200 K proved to be so difficult that many people in the field speculated that there was a fundamental physical reason preventing it, Hu says.

Borrowing a page from high-energy physics and astronomy textbooks, a team of physicists and computer scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has successfully adapted and applied a common error-reduction technique to the field of quantum computing.

In the world of subatomic particles and giant particle detectors, and distant galaxies and giant telescopes, scientists have learned to live, and to work, with uncertainty. They are often trying to tease out ultra-rare particle interactions from a massive tangle of other particle interactions and background “noise” that can complicate their hunt, or trying to filter out the effects of atmospheric distortions and interstellar dust to improve the resolution of astronomical imaging.

Also, inherent problems with detectors, such as limits in their ability to record all particle interactions or to measure particles’ energies exactly, can result in data being misread by the electronics they are connected to. Scientists therefore need to design complex filters, in the form of computer algorithms, to reduce the margin of error and return the most accurate results.

For those who like really fast personal computers. 😃

Considering how powerful computers nowadays need to be, I think everyone will benefit overall.


These workloads consist of a fixed amount of work, so we can plot the task energy against the time required to finish the job (bottom axis), generating a really useful power chart. Bear in mind that faster compute times and lower task energy requirements are ideal.
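Since task energy is just average power multiplied by runtime, the chart’s two axes are tied together by simple arithmetic. Here is a hedged sketch; the chip figures below are hypothetical examples, not measurements from the review:

```python
def task_energy_wh(avg_power_w: float, runtime_s: float) -> float:
    """Energy used by a fixed workload: average power (watts)
    times runtime (seconds), converted to watt-hours."""
    return avg_power_w * runtime_s / 3600

# Hypothetical chips: the faster chip draws more power, but because
# it finishes much sooner, its total task energy is still lower.
slow_chip = task_energy_wh(avg_power_w=125, runtime_s=1200)  # ~41.7 Wh
fast_chip = task_energy_wh(avg_power_w=140, runtime_s=700)   # ~27.2 Wh
```

This is why the lower-left corner of the chart is the sweet spot: it combines a short runtime with a small energy total.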

This measure really separates the wheat from the chaff, and the best results fall to the lower left-hand corner of the chart. The Intel chips populate the less-desirable upper right-hand side. Although the Core i9-10980XE makes a valiant attempt to get down to Ryzen territory, it still can’t match the previous-gen Ryzen 3000 processors in terms of efficiency. Meanwhile, the Ryzen 5000 series leverages the Zen 3 architecture to great effect and falls further inside the performance-per-watt sweet spot, marking a new level of efficiency for a modern desktop chip.

The first quantum revolution brought about semiconductor electronics, the laser and, finally, the internet. The coming second quantum revolution promises spy-proof communication, extremely precise quantum sensors and quantum computers for previously unsolvable computing tasks. But this revolution is still in its infancy. A central object of research is the interface between local quantum devices and the light quanta that enable remote transmission of highly sensitive quantum information. The Otto-Hahn group “Quantum Networks” at the Max-Planck-Institute of Quantum Optics in Garching is researching such a “quantum modem”. The team has now achieved a first breakthrough with a relatively simple but highly efficient technology that can be integrated into existing fiber optic networks. The work is published this week in Physical Review X.

The coronavirus pandemic is a daily reminder of how important the internet has become. The World Wide Web, once a by-product of basic physics research, has radically changed our culture. Could a quantum internet become the next major innovation to come out of physics?

It is still too early to answer that question, but basic research is already working on the quantum internet. Many applications will be more specialized and less tangible than video conferencing, but the importance of absolutely spy-proof long-distance communication is clear to everyone. “In the future, a quantum internet could be used to connect quantum computers located in different places,” Andreas Reiserer says, “which would considerably increase their computing power!” The physicist heads the independent Otto-Hahn research group “Quantum Networks” at the Max-Planck-Institute of Quantum Optics in Garching.

SpaceX’s Starlink satellite cluster has been receiving much headline space recently as it continues adding satellites at a breathtaking pace. Much of this news coverage has focused on how it’s impacting amateur skygazers and how it could benefit people in far-flung regions. But technical details do matter, and over on Casey Handmer’s blog, there was a recent discussion of one of the most important aspects of how Starlink actually operates—what will it do with its data?

In networking lingo, data is quantized into “packets,” which are sets of ones and zeros that computers can understand. In the case of Starlink, these packets will bounce between ground stations and a series of satellites parked in nine separate low-Earth orbits. Each orbit will contain a number of satellites, and each satellite’s covered territory will overlap with that of the satellites to the north and south of it. When the constellation is complete, every spot on Earth will be covered by at least two Starlink satellites.

Future versions of the satellites will use lasers to communicate amongst themselves. But for now, they have to use ground stations to talk to other satellites. Therefore, there will be a great amount of packet passing between satellites, ground stations and end user terminals. The information that describes such a convoluted path for each packet must be stored somewhere. That somewhere is called the “metadata.”

For those who are excited about 6G. 😃


Electromagnetic waves are characterized by a wavelength and a frequency; the wavelength is the distance a cycle of the wave covers (peak to peak or trough to trough, for example), and the frequency is the number of waves that pass a given point in one second. Cellphones use miniature radios to pick up electromagnetic signals and convert those signals into the sights and sounds on your phone.
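The two quantities are linked by the speed of light, c = λf, so knowing one gives you the other. A quick sketch, assuming free-space propagation:

```python
C = 299_792_458  # speed of light in vacuum, meters per second

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength (meters) for a given frequency (hertz)."""
    return C / freq_hz

print(wavelength_m(1e9))    # ~0.30 m: 1 GHz, low/mid-band spectrum
print(wavelength_m(300e9))  # ~0.001 m: 300 GHz, i.e. millimeter waves
print(wavelength_m(1e12))   # ~0.0003 m: 1 THz, the range discussed below
```

Note how 300 GHz lands at a wavelength of about one millimeter, which is exactly why that part of the spectrum is called “millimeter wave.”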

4G wireless networks run on the low- and mid-band spectrum, defined as frequencies a little less than (low-band) and a little more than (mid-band) one gigahertz (one billion cycles per second). 5G kicked that up several notches by adding much higher-frequency millimeter waves of up to 300 gigahertz, or 300 billion cycles per second. Data transmitted at those higher frequencies tends to be information-dense, like video, because higher-frequency waves can carry far more data.

The 6G chip kicks 5G up several more notches. It can transmit waves at more than three times the frequency of 5G: one terahertz, or a trillion cycles per second. The team says this yields a data rate of 11 gigabits per second. While that’s faster than the fastest 5G will get, it’s only the beginning for 6G. One wireless communications expert even estimates 6G networks could handle rates up to 8,000 gigabits per second; they’ll also have much lower latency and higher bandwidth than 5G.
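To put those rates in perspective, some back-of-envelope arithmetic helps; the 50 GB file size below is an arbitrary example, not a figure from the article:

```python
def transfer_seconds(file_gigabytes: float, rate_gbps: float) -> float:
    """Seconds to move a file at a given link rate (1 byte = 8 bits)."""
    return file_gigabytes * 8 / rate_gbps

print(transfer_seconds(50, 11))    # ~36 s at the demonstrated 11 Gbps
print(transfer_seconds(50, 8000))  # ~0.05 s at a projected 8,000 Gbps
```

In other words, the projected 6G ceiling would turn a half-minute download into an eyeblink.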

In the distant past, there was a proverbial “digital divide” that bifurcated workers into those who knew how to use computers and those who didn’t.[1] Young Gen Xers and their later millennial companions grew up with Power Macs and Wintel boxes, and that experience made them natives at making these technologies do productive work. Older generations were going to be wiped out by younger workers who were more adaptable to the needs of the modern digital economy, upending our routine notion that professional experience equals value.

Of course, that was just a narrative. Facility with computers was measured by the ability to turn one on and log in, a bar so low that it can be shocking to the modern reader to think that a “divide” existed at all. Software engineering, computer science and statistics remained quite unpopular compared to other academic programs, even in universities, let alone in primary through secondary schools. Most Gen Xers and millennials never learned to code, or frankly, even to make a pivot table or calculate basic statistical averages.

There’s a sociological change underway though, and it’s going to make the first divide look quaint in hindsight.

Researchers at the University of Rochester and Cornell University have taken an important step toward developing a communications network that exchanges information across long distances by using photons, massless particles of light that are key elements of quantum computing and quantum communications systems.

The research team has designed a nanoscale node made out of magnetic and semiconducting materials that could interact with other nodes, using laser light to emit and accept photons.

The development of such a quantum network—designed to take advantage of the physical properties of light and matter characterized by quantum mechanics—promises faster, more efficient ways to communicate, compute, and detect objects and materials as compared to networks currently used for computing and communications.