
Before delving into the prospects of the Fifth Industrial Revolution, let’s reflect on the legacy of its predecessor. The Fourth Industrial Revolution, characterised by the fusion of digital, physical, and biological systems, has already transformed the way we live and work. It brought us AI, blockchain, the Internet of Things, and more. However, it also raised concerns about automation’s impact on employment and privacy, leaving us with a mixed legacy.

The promise of the Fifth Industrial Revolution

The Fifth Industrial Revolution represents a quantum leap forward. At its core, it combines AI, advanced biotechnology, nanotechnology, and quantum computing to usher in a new era of possibilities. One of its most compelling promises is the extension of human life. With breakthroughs in genetic engineering, regenerative medicine, and AI-driven healthcare, we are inching closer to not just treating diseases but preventing them altogether. It’s a vision where aging is not an inevitability, but a challenge to overcome.

A new study in Physical Review Letters illuminates the intricacies of energy exchanges within bipartite quantum systems, offering profound insights into quantum coherence, pure dephasing effects, and the potential impact on future quantum technologies.
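The pure dephasing mentioned in the summary can be illustrated in a few lines. This is a generic single-qubit sketch, not the paper's bipartite model: a pure-dephasing channel erases quantum coherence (the off-diagonal terms of the density matrix) while leaving the populations, and hence the energy, untouched.

```python
import numpy as np

def dephase(rho, gamma, t):
    """Apply a pure-dephasing channel with rate gamma for time t:
    populations (diagonal) are preserved, coherences decay."""
    decay = np.exp(-gamma * t)
    out = rho.copy()
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

# Start in the superposition |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(plus, plus)

rho_t = dephase(rho0, gamma=1.0, t=3.0)

print(np.diag(rho_t))     # populations unchanged: [0.5, 0.5]
print(abs(rho_t[0, 1]))   # coherence decayed: 0.5 * exp(-3)
```

The populations stay at 0.5 each, so no energy flows; only the phase information between the two levels is lost.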

In quantum systems, the behavior of particles is governed by probability distributions and wave functions, adding layers of complexity to the understanding of energy exchanges.
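The connection between wave functions and probability distributions is the Born rule: measurement outcomes occur with probability given by the squared magnitude of the corresponding amplitude. A minimal sketch, with arbitrary illustrative amplitudes:

```python
import numpy as np

# A wave function, here a finite vector of complex amplitudes
# (the numbers are arbitrary, purely for illustration).
psi = np.array([1 + 1j, 2, 0, 1j])
psi = psi / np.linalg.norm(psi)   # normalize so probabilities sum to 1

# Born rule: p_i = |psi_i|^2 gives the probability distribution
# over measurement outcomes in this basis.
probs = np.abs(psi) ** 2

print(probs)         # probability of each basis outcome
print(probs.sum())   # -> 1.0
```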

The exploration of energy exchanges in quantum systems inherently involves tackling the complexities arising from the scales at which quantum systems operate, which introduce extreme sensitivity to disturbance.

A study led by the University of Oxford has used the power of machine learning to overcome a key challenge affecting quantum devices. For the first time, the findings reveal a way to close the “reality gap”: the difference between predicted and observed behavior from quantum devices. The results have been published in Physical Review X.

Quantum computing could supercharge a wealth of applications, from climate modeling and financial forecasting to drug discovery and artificial intelligence. But this will require effective ways to scale up and combine individual quantum bits (also called qubits). A major barrier against this is inherent variability, where even apparently identical units exhibit different behaviors.

Functional variability is presumed to be caused by nanoscale imperfections in the materials from which quantum devices are made. Since there is no way to measure these directly, this internal disorder cannot be captured in simulations, leading to the gap between predicted and observed outcomes.
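The "reality gap" idea can be sketched as an inference problem. The following toy example is illustrative only, not the Oxford group's actual method: a device has a hidden disorder parameter we cannot measure, so a naive zero-disorder simulation disagrees with the data; the learning step is reduced to its simplest possible form, a grid search for the disorder value that best explains the observed response.

```python
import numpy as np

def device_response(voltage, disorder):
    """Stand-in model: hidden disorder shifts the response curve."""
    return np.tanh(voltage - disorder)

true_disorder = 0.7                      # hidden from the modeller
voltages = np.linspace(-2, 2, 41)
observed = device_response(voltages, true_disorder)

# A naive simulation assumes no disorder -> a visible "reality gap".
naive = device_response(voltages, 0.0)
gap = np.mean((observed - naive) ** 2)

# Infer the disorder by minimizing the mismatch over candidate values.
candidates = np.linspace(-1, 1, 201)
errors = [np.mean((observed - device_response(voltages, d)) ** 2)
          for d in candidates]
best = candidates[int(np.argmin(errors))]

print(f"reality gap (no-disorder model): {gap:.3f}")
print(f"inferred disorder: {best:.2f}")   # close to the true 0.7
```

Once the hidden disorder is inferred, the simulation can be corrected and its predictions brought back in line with the device's actual behavior.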

When we look at scientific progress, especially in physics, it can seem like all the great discoveries lie behind us. Since the revolutions of Einstein’s theory of relativity and quantum mechanics, physicists have been struggling to find a way to make them fit together with little to no success. Tim Palmer argues that the answer to this stalemate lies in chaos theory.

Revisiting a book by John Horgan, science communicator and theoretical physicist Sabine Hossenfelder recently asked on her YouTube channel whether we are facing the end of science. It might seem like a rhetorical question — it’s not possible for science to really end — but she concludes that we are in dire need of some new paradigms in physics, and seemingly unable to arrive at them. We are yet to solve the deep ongoing mysteries of the dark universe and still haven’t convincingly synthesised quantum and gravitational physics. She suggests that ideas from chaos theory might hold some of the answers, and therefore the ability to rejuvenate science. I think she’s right.

The first functional semiconductor made from graphene has been created at the Georgia Institute of Technology. This could enable smaller and faster electronic devices and may have applications for quantum computing.

Credit: Georgia Institute of Technology.

Semiconductors, which are materials that conduct electricity under specific conditions, are foundational components of electronic devices like the chips in your computer, laptop, and smartphone. For many decades, their architecture has been getting smaller and more compact – a trend known as Moore’s Law. This has enabled gigantic leaps in a vast range of technologies, from general computing speeds and video game graphics, to the resolution of medical scans and the sensitivity of astronomical observatories.
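The scale of those leaps is easy to underestimate. A back-of-the-envelope sketch of Moore's Law as commonly stated, transistor counts doubling roughly every two years (the starting figure below is a round illustrative number, not a specific chip):

```python
# Moore's Law as a simple exponential: doubling every ~2 years.
start_year, start_count = 2000, 40_000_000   # assumed round figure
doubling_period_years = 2

def projected_count(year):
    doublings = (year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

# Twenty years is ten doublings: a factor of 2**10 = 1024.
print(f"{projected_count(2020):,.0f}")
```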

For the first time, researchers have demonstrated the remarkable ability to perturb pairs of spatially separated yet interconnected quantum entangled particles without altering their shared properties.

The team includes researchers from the Structured Light Laboratory (School of Physics) at the University of the Witwatersrand in South Africa, led by Professor Andrew Forbes, in collaboration with string theorist Robert de Mello Koch from Huzhou University in China (previously from Wits University).

“We achieved this experimental milestone by entangling two identical photons and customizing their shared wave-function in such a way that their topology or structure becomes apparent only when the photons are treated as a unified entity,” explains lead author, Pedro Ornelas, an MSc student in the structured light laboratory.
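The idea that structure is visible only in the unified pair has a textbook analogue. The following is a generic illustration with a standard Bell state, not the experiment's actual customized wave-function: for a maximally entangled pair, either photon viewed alone is completely featureless (its reduced density matrix is maximally mixed), while the joint state carries all the structure.

```python
import numpy as np

# Bell state |psi> = (|00> + |11>) / sqrt(2),
# written in the two-photon basis {|00>, |01>, |10>, |11>}.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)

rho = np.outer(psi, psi)           # joint (pure) state of the pair
rho = rho.reshape(2, 2, 2, 2)      # index order: (a, b, a', b')

# Trace out photon B to get photon A's reduced density matrix.
rho_A = np.trace(rho, axis1=1, axis2=3)

print(rho_A)   # -> 0.5 * identity: photon A alone carries no structure
```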

In quantum physics, the enigmatic dance between interactions and disorder unfolds in the intricate phenomenon known as many-body localization.


Quantum many-body systems may fail to thermalize due to the phenomenon of many-body localization. Its theoretical underpinning is given by quasi-local conserved observables, the l-bits, which until now could not be probed experimentally. The authors define experimentally accessible quantities that retrieve spatially resolved entanglement information, making it possible to probe the l-bits.
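The l-bit picture the summary alludes to is usually written as an effective Hamiltonian built entirely from the conserved operators; this is the standard phenomenological form from the many-body-localization literature, not an equation taken from the paper itself:

```latex
% Effective Hamiltonian of a fully many-body-localized system,
% expressed in the quasi-local conserved "l-bit" operators \tau_i^z:
H = \sum_i h_i \,\tau_i^z
  + \sum_{i<j} J_{ij}\, \tau_i^z \tau_j^z
  + \sum_{i<j<k} K_{ijk}\, \tau_i^z \tau_j^z \tau_k^z + \cdots
% with couplings J_{ij}, K_{ijk} decaying exponentially in the
% distance between the l-bits involved.
```

Because every term commutes with every \(\tau_i^z\), the l-bits never relax, which is why the system fails to thermalize and why probing them experimentally is so informative.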