
The formation of a black hole from light alone is permitted by general relativity, but a new study says quantum physics rules it out.

Black holes are known to form from large concentrations of mass, such as burned-out stars. But according to general relativity, they can also form from ultra-intense light. Theorists have speculated about this idea for decades. However, calculations by a team of researchers now suggest that light-induced black holes are not possible after all because quantum-mechanical effects cause too much leakage of energy for the collapse to proceed [1].

The extreme density of mass produced by a collapsed star can curve spacetime so severely that no light entering the region can escape. The formation of a black hole from light is possible according to general relativity because mass and energy are equivalent, so the energy in an electromagnetic field can also curve spacetime [2]. Putative electromagnetic black holes have become popularly known as kugelblitze, German for “ball lightning,” following the terminology used by Princeton University physicist John Wheeler in early studies of electromagnetically generated gravitational fields in the 1950s [3].
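To get a rough sense of why general relativity allows this, one can combine the usual Schwarzschild radius with mass-energy equivalence. The relation below is only a back-of-the-envelope illustration of the classical argument, not the quantum calculation reported in the new study:

```latex
% Illustrative classical estimate only:
% the Schwarzschild radius for a mass M, rewritten for a pure
% concentration of electromagnetic energy E using E = M c^2.
r_s = \frac{2GM}{c^2}
      \quad\longrightarrow\quad
      r_s = \frac{2GE}{c^4}
```

Classically, light whose energy E is squeezed into a region smaller than this radius would collapse into a kugelblitz; the new result argues that quantum effects drain energy from the field before that threshold can be reached.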

In most neutral-atom quantum computers, atoms are held in arrays of optical tweezers. Researchers typically populate the arrays stochastically, meaning that whether a given site receives an atom is down to chance. Atoms can later be rearranged individually, but the total number of atoms depends on the success of the initial loading.
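As a rough illustration of why stochastic loading rarely yields a full array, the sketch below fills each site independently with an assumed capture probability of 0.55, a placeholder value rather than a figure from the experiment, and tallies the resulting atom numbers:

```python
import random

def stochastic_load(num_sites: int, p_fill: float = 0.55) -> list[bool]:
    """Occupancy of a tweezer array after one stochastic loading attempt."""
    # Each site independently captures an atom with probability p_fill.
    return [random.random() < p_fill for _ in range(num_sites)]

if __name__ == "__main__":
    shots = [sum(stochastic_load(100)) for _ in range(1000)]
    print("mean atoms per 100-site shot:", sum(shots) / len(shots))
    print("shots that filled all 100 sites:", shots.count(100))
```

With these assumed odds, a 100-site array essentially never comes out completely full in a single shot, which is the shortfall the iterative scheme described next is designed to overcome.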

The Atom Computing team developed an iterative process to fill an array to capacity. Instead of filling the array directly, the researchers first stochastically populated a second “reservoir” array. They then transferred atoms one by one from this reservoir to the target array using an optical tweezer. Between each loading step, the researchers imaged both arrays to determine which sites in each array were occupied. This step required temporarily switching off the tweezers and holding the atoms in an optical lattice formed from interfering laser beams.
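The sketch below mimics that loop in highly simplified form: the reservoir is reloaded stochastically, both arrays are “imaged” (here, their occupancy lists simply stand in for camera images), and atoms are moved one by one into empty target sites until the target is full. The fill and per-cycle survival probabilities are assumptions for illustration, not parameters reported by the team:

```python
import random

P_FILL = 0.55       # assumed per-site probability of stochastic loading
P_SURVIVE = 0.999   # assumed per-atom survival per image-and-transfer cycle

def load_reservoir(num_sites: int) -> list[bool]:
    """Stochastically reload the reservoir array."""
    return [random.random() < P_FILL for _ in range(num_sites)]

def fill_target(target_size: int, reservoir_size: int, max_cycles: int = 50):
    """Iteratively move atoms from a stochastically loaded reservoir into a target array."""
    target = [False] * target_size
    for cycle in range(1, max_cycles + 1):
        reservoir = load_reservoir(reservoir_size)
        # "Image" both arrays to find empty target sites and occupied reservoir sites.
        empty_sites = [i for i, occupied in enumerate(target) if not occupied]
        loaded_sites = [j for j, occupied in enumerate(reservoir) if occupied]
        # Transfer atoms one by one into empty target sites while the reservoir lasts.
        for i, _ in zip(empty_sites, loaded_sites):
            target[i] = True
        # Imaging and handling can still lose a small fraction of trapped atoms.
        target = [occupied and random.random() < P_SURVIVE for occupied in target]
        if all(target):
            return target, cycle
    return target, max_cycles

if __name__ == "__main__":
    occupancy, cycles = fill_target(target_size=100, reservoir_size=100)
    print(f"filled {sum(occupancy)}/100 target sites after {cycles} cycles")
```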

The researchers showed that this sequence could be repeated as many times as necessary without losing atoms from the target array. They also showed that they could limit atom loss during the imaging step by enhancing the lattice strength using optical cavities. This enhancement allowed the atoms to be more strongly confined without increasing the optical lattice’s laser-power requirements.
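Why per-cycle loss matters so much can be seen from a simple estimate: if each image-and-transfer cycle keeps a trapped atom with probability p, the chance that it is still there after N cycles falls off as p^N. The numbers below are illustrative, not measurements from the study:

```latex
P_{\text{keep}}(N) = p^{N},
\qquad 0.99^{100} \approx 0.37,
\qquad 0.999^{100} \approx 0.90
```

Pushing the per-cycle survival toward unity, for instance by stiffening the lattice with optical cavities, is what makes many repetitions feasible.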

The idea of the brain as a computer is everywhere, so much so that we have forgotten it is a model and not the reality. It’s a metaphor that has led some to believe that in the future they’ll be uploaded to the digital ether and thereby achieve immortality. It’s also a metaphor that garners billions of dollars in research funding every year. Yet researchers argue that when we dig down into our grey matter, our biology is anything but algorithmic. And increasingly, critics contend that the model of the brain as computer is sending scientists (and their resources) nowhere fast. Is our attraction to the idea of the brain as computer an accident of current human technology? Can we find a better metaphor that might lead to a new paradigm?

If there’s one phrase the June 2024 U.S. presidential debate may entirely eliminate from the English vocabulary, it’s that age is a meaningless number. Often attributed to boxer Muhammad Ali, who grudgingly retired at age 39, this centuries-old idea has had far-reaching consequences in global politics, as life expectancy has more than doubled since the start of the 20th century and presidents’ ages have shifted upwards. We say “age is what we make of it” to ourselves and to policymakers, and think it’s a harmless way to dignify the aged. But how true is it? And if it isn’t true, why would we lie?

For centuries, we have confused our narrative of what aging should be with what its ruthless biology is. Yet pretending that biological age does not matter is at best myopic and at worst a dangerous story to tell our governments, families, and economies. In just 11 years, between 2018 and 2029, U.S. spending on Social Security and Medicare will more than double, from $1.3 trillion to $2.7 trillion per year. As we age, our odds of getting sick and dying of basically anything go up exponentially. If smoking increases our chances of getting cancer by a factor of 15, aging does so 100-fold. At age 65, fewer than 5% of people are diagnosed with Alzheimer’s. Beyond age 85, nearly half the population has some form of dementia. Biological aging is the biggest risk factor for most chronic diseases; it’s a neglected factor in global pandemics; and it even plays a role in rare diseases.

This explains why, in hospitals, if there’s one marker next to a patient’s name, it’s their age. How many birthday candles we have blown out is an archaic surrogate marker of biological aging. Yet it’s the best we have. Chronological age is so telling of overall health that physicians everywhere rely on it for life-or-death decisions, from evaluating the risks of cancer screening to rationing hospital beds.