Researchers in Japan have developed a diamond FET with high hole mobility.


In the 1970s, Stephen Hawking found that an isolated black hole would emit radiation, but only when quantum mechanics is taken into account. Because the emitted radiation carries away mass, the black hole shrinks, which is why the process is known as black hole evaporation. However, this discovery led to the black hole information paradox.
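For orientation, the standard textbook expressions (not given in the article itself) for the temperature of this Hawking radiation and for the evaporation time of a black hole of mass $M$ are

$$ T_{\mathrm{H}} = \frac{\hbar c^{3}}{8\pi G M k_{B}}, \qquad t_{\mathrm{evap}} \sim \frac{5120\,\pi\, G^{2} M^{3}}{\hbar c^{4}}, $$

so a lighter black hole is hotter, radiates faster, and disappears sooner.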

If the black hole evaporates entirely, any physical information that fell into it would permanently disappear. However, this violates a core precept of quantum physics: information cannot vanish from the Universe.
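The precept in question is unitarity: in quantum mechanics, a closed system evolves as

$$ |\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U^{\dagger}(t)\,U(t) = \mathbb{1}, $$

and because $U(t)$ is invertible, the initial state can in principle be reconstructed from the final one. A black hole that evaporates into featureless thermal radiation would appear to break exactly this rule.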

A new study by an international quartet of physicists suggests that black holes are more complex than originally understood. They have a gravitational field that, at the quantum level, encodes information about how they were formed.

The research team includes Professor Xavier Calmet from the University of Sussex School of Mathematical and Physical Sciences, Professor Roberto Casadio (INFN, University of Bologna), Professor Stephen Hsu (Michigan State University), along with Ph.D. student Folkert Kuipers (University of Sussex). Their study significantly improves understanding of black holes and resolves a problem that has confounded scientists for nearly half a century: the black hole information paradox.

Dude, what if everything around us was just … a hologram?

The thing is, it could be—and a University of Michigan physicist is using quantum computing and machine learning to better understand the idea, called holographic duality.

Holographic duality is a mathematical conjecture that connects theories of particles and their interactions with the theory of gravity. This conjecture suggests that the theory of gravity and the theory of particles are mathematically equivalent: what happens mathematically in the theory of gravity happens in the theory of particles, and vice versa.
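In its best-studied form, the AdS/CFT correspondence, the equivalence is often summarized (schematically, and not spelled out in the article) as an equality of partition functions: a gravitational theory in the bulk, with boundary values $\phi_{0}$ for its fields, is matched to a particle (conformal field) theory living on the boundary,

$$ Z_{\text{gravity}}\big[\phi \to \phi_{0}\big] = \Big\langle \exp\Big( \int_{\partial} \phi_{0}\, \mathcal{O} \Big) \Big\rangle_{\text{CFT}}, $$

so every computation on one side has a counterpart on the other.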

How can Einstein’s theory of gravity be unified with quantum mechanics? It is a challenge that could give us deep insights into phenomena such as black holes and the birth of the universe. Now, a new article in Nature Communications, written by researchers from Chalmers University of Technology, Sweden, and MIT, U.S., presents results that cast new light on important challenges in understanding quantum gravity.

A grand challenge in modern theoretical physics is to find a “unified theory” that can describe all the laws of nature within a single framework—connecting Einstein’s general theory of relativity, which describes the universe on a large scale, and quantum mechanics, which describes our world at the atomic level. Such a theory of “quantum gravity” would include both a macroscopic and microscopic description of nature.

“We strive to understand the laws of nature, and the language in which these are written is mathematics. When we seek answers to questions in physics, we are often led to new discoveries in mathematics too. This interaction is particularly prominent in the search for quantum gravity—where it is extremely difficult to perform experiments,” explains Daniel Persson, Professor at the Department of Mathematical Sciences at Chalmers University of Technology.

The field of machine learning on quantum computers got a boost from new research that removes a potential roadblock to the practical implementation of quantum neural networks. While theorists had previously believed an exponentially large training set would be required to train a quantum neural network, the quantum No-Free-Lunch theorem developed by Los Alamos National Laboratory shows that quantum entanglement eliminates this exponential overhead.

“Our work proves that both big data and big entanglement are valuable in quantum machine learning. Even better, entanglement leads to scalability, which solves the roadblock of exponentially increasing the size of the data in order to learn it,” said Andrew Sornborger, a computer scientist at Los Alamos and a coauthor of the paper published Feb. 18 in Physical Review Letters. “The theorem gives us hope that quantum neural networks are on track towards the goal of quantum speed-up, where eventually they will outperform their counterparts on classical computers.”

The classical No-Free-Lunch theorem states that any machine-learning algorithm is as good as, but no better than, any other when their performance is averaged over all possible functions connecting the data to their labels. A direct consequence of this theorem that showcases the power of data in classical machine learning is that the more data one has, the better the average performance. Thus, data is the currency in machine learning that ultimately limits performance.
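To make the classical statement concrete, here is a minimal Python sketch (an illustration of the classical theorem only, not of the quantum result in the paper): it averages off-training-set accuracy over every possible Boolean labeling of a small input space, and the average comes out to exactly chance (0.5) no matter which learning rule is plugged in, which is the sense in which only the amount of data, not the algorithm, controls average performance.

```python
import itertools

# Tiny input space: all 3-bit strings (8 points), so 2**8 = 256 possible labelings.
X = list(itertools.product([0, 1], repeat=3))
train_X = X[:3]   # t = 3 "training" inputs
test_X = X[3:]    # off-training-set inputs

def learner(train_pairs, x):
    """A deliberately simple learning rule: predict the majority training label.
    Any other deterministic rule gives the same result once we average below."""
    labels = [y for _, y in train_pairs]
    return int(2 * sum(labels) >= len(labels))

total, count = 0.0, 0
# Average off-training-set accuracy over ALL possible target functions f: X -> {0, 1}.
for labels in itertools.product([0, 1], repeat=len(X)):
    f = dict(zip(X, labels))
    train_pairs = [(x, f[x]) for x in train_X]
    acc = sum(learner(train_pairs, x) == f[x] for x in test_X) / len(test_X)
    total += acc
    count += 1

print(total / count)  # 0.5 -- chance level, independent of the learning rule
```

Because the labels of the unseen points are independent of the three training labels when averaged over all target functions, any deterministic learner scores exactly 0.5 off the training set on average; only seeing more of the input space as data can raise that average.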