For over a century, physicists have grappled with one of the most profound questions in science: How do the rules of quantum mechanics, which govern the smallest particles, fit with the laws of general relativity, which describe the universe on the largest scales?
The optical lattice clock, one of the most precise timekeeping devices ever built, is emerging as a powerful tool for tackling this challenge. Within an optical lattice clock, atoms are trapped in a “lattice” potential formed by interfering laser beams and are manipulated with precise control over quantum coherence and interactions, as governed by quantum mechanics.
Simultaneously, according to Einstein’s laws of general relativity, time moves slower in stronger gravitational fields. This effect, known as gravitational redshift, leads to a tiny shift of atoms’ internal energy levels depending on their position in gravitational fields, causing their “ticking”—the oscillations that define time in optical lattice clocks—to change.
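In the weak-field limit, this fractional frequency shift takes a simple, standard form (included here for context; the numbers are textbook values, not taken from the article):

```latex
\frac{\Delta\nu}{\nu} \;\approx\; \frac{g\,\Delta h}{c^{2}}
\;\approx\; \frac{(9.8\ \mathrm{m/s^2})(0.01\ \mathrm{m})}{(3\times 10^{8}\ \mathrm{m/s})^{2}}
\;\approx\; 1\times 10^{-18}
```

A height difference of just one centimeter on Earth therefore changes a clock’s rate at the 10⁻¹⁸ level, comparable to the fractional precision of today’s leading optical lattice clocks.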
Many physicists and engineers have recently been trying to demonstrate the potential of quantum computers for tackling some problems that are particularly demanding and are difficult to solve for classical computers. A task that has been found to be challenging for both quantum and classical computers is finding the ground state (i.e., lowest possible energy state) of systems with multiple interacting quantum particles, called quantum many-body systems.
When such a system is placed in a thermal bath (i.e., an environment at a fixed temperature that interacts with the system), it cools down but does not always reach its ground state. In some instances, the system gets trapped in a so-called local minimum: a state whose energy is lower than that of all neighboring states but not the lowest possible.
Researchers at the California Institute of Technology and the AWS Center for Quantum Computing recently showed that while finding a local minimum of such a system can be hard for classical computers, it could be far easier for quantum computers.
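The trapping effect itself has a purely classical analogue. The toy sketch below (an invented landscape for illustration, not the Caltech/AWS method) runs a greedy energy-lowering walk on a double-well potential; the walk halts in whichever well it first enters, even when a deeper well exists:

```python
# Toy 1D "energy landscape" with two wells: a shallow local minimum
# near x ~ +0.96 and a deeper global minimum near x ~ -1.04.
# (Illustrative only; not taken from the paper.)
def energy(x):
    return x**4 - 2 * x**2 + 0.3 * x

def greedy_descent(x, step=0.01, max_iters=10_000):
    """Repeatedly move to a lower-energy neighboring state; this mimics
    cooling that stops as soon as every neighbor has higher energy."""
    for _ in range(max_iters):
        if energy(x - step) < energy(x):
            x -= step
        elif energy(x + step) < energy(x):
            x += step
        else:
            break  # a local minimum: both neighbors lie higher in energy
    return x

trapped = greedy_descent(2.0)    # starts near the shallow well, gets stuck there
ground = greedy_descent(-2.0)    # starts near the deep well, reaches the global minimum
```

Starting at x = 2.0, the walk halts in the shallow well even though a lower-energy state exists elsewhere, which is precisely the sense in which a cooled quantum many-body system can fail to reach its ground state.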
A small international team of nanotechnologists, engineers and physicists has developed a way to force laser light into becoming a supersolid. Their paper is published in the journal Nature. The editors at Nature have published a Research Briefing in the same issue summarizing the work.
Supersolids are entities that exist only in the quantum world, and until now they have all been made using atoms. Prior research has shown that they have zero viscosity and form crystal-like structures, similar to the way atoms are arranged in salt crystals.
Because of their nature, supersolids have so far been created only in extremely cold environments, where quantum effects become observable. Notably, one of the team members on this new effort was part of the team that demonstrated more than a decade ago that light could become a fluid under the right set of circumstances.
“Nanobrain: The Making of an Artificial Brain from a Time Crystal” by Anirban Bandyopadhyay Book Link: https://amzn.to/3QQ4s44
The hosts discuss the book “Nanobrain” by Anirban Bandyopadhyay, an in-depth exploration of a “nanobrain”: an artificial brain built on the principles of fractal geometry and prime numbers that claims to mimic human thought. The book introduces the Prime Phase Metric (PPM), which expands time crystals from natural events, and the Geometric Musical Language (GML), which assigns fifteen primes to letters much like an alphabet, and argues that their combination is consciousness-centric. It contrasts this artificial brain’s functioning with traditional Turing machines and quantum computing, emphasizing its distinctive approach to data processing, decision-making, and self-learning, and discusses its potential limitations and advantages. It also investigates time crystals, dynamic geometric shapes, and fractal mechanics, and the roles they may play in memory and learning in this new kind of computer. Finally, the work explores the philosophical implications of such a machine, including its potential for achieving consciousness and the idea of recreating nature through a non-computational paradigm, such as modeling a new type of robot.
A new understanding of how an observer can change the disorder, or entropy, of a quantum object could help us probe how gravity interacts with the quantum realm.
In a new Physical Review Letters study, researchers propose an experimental approach that could finally determine whether gravity is fundamentally classical or quantum in nature.
The nature of gravity has puzzled physicists for decades. Gravity is one of the four fundamental forces, but unlike the electromagnetic, strong, and weak nuclear forces, it has resisted integration into the quantum framework.
Rather than directly tackling the challenging problem of constructing a complete quantum theory of gravity or trying to detect individual gravitons—the hypothetical mediator of gravity—the researchers take a different approach.
In a new development that could help redefine the future of technology, a team of physicists has uncovered a fundamental insight into the upper limit of superconducting temperature.
This research, accepted for publication in the Journal of Physics: Condensed Matter, suggests that room-temperature superconductivity, long considered the “holy grail” of condensed matter physics, may indeed be possible within the laws of our universe.
Superconductors, materials that can conduct electricity without resistance, have the potential to revolutionize energy transmission, medical imaging, and quantum computing. However, until now, they have only functioned at extremely low temperatures, making them impractical for widespread use. The race to find a superconductor that works at ambient conditions has been one of the most intense and elusive pursuits in modern science.
UC Santa Barbara researchers are working to move cold atom quantum experiments and applications from the laboratory tabletop to chip-based systems, opening new possibilities for sensing, precision timekeeping, quantum computing and fundamental science measurements.
“We’re at the tipping point,” said electrical and computer engineering professor Daniel Blumenthal.
In an invited article that was also selected for the cover of Optica Quantum, Blumenthal, along with graduate student researcher Andrei Isichenko and postdoctoral researcher Nitesh Chauhan, lays out the latest developments and future directions for trapping and cooling the atoms that are fundamental to these experiments—and that will bring them to devices that fit in the palm of your hand.
Scientists are tackling one of the biggest hurdles in quantum computing: errors caused by noise and interference. Their solution? A new chip called Ocelot that uses “cat qubits” — a special type of qubit that dramatically reduces errors. Traditional quantum systems require thousands of extra qubits for error correction, but this breakthrough could slash that number by 90%, bringing us closer to practical, powerful quantum computers.
The AI revolution is happening faster than experts ever predicted — and we’ve hit the turning point.
The long-debated arrival of artificial general intelligence (AGI) may be closer than we think, with some experts suggesting we could reach the technological singularity within the next year.
A new analysis of nearly 8,600 expert predictions reveals shifting timelines, particularly since the rise of large language models (LLMs) like ChatGPT. While previous estimates placed AGI’s emergence around 2060, recent advancements have led many to revise their forecasts to as early as 2030.
Some industry leaders, however, believe AGI’s arrival is imminent, and with the rapid progression of computing power and potential breakthroughs in quantum computing, we may soon see machines capable of surpassing human intelligence.