
Quantum Simulations of Curved Space

A heptagonal-lattice superconducting circuit, and the mathematics that describe it, provide tools for studying quantum mechanics in curved space.

According to John Wheeler’s summary of general relativity, “space-time tells matter how to move; matter tells space-time how to curve.” How this relationship plays out at the quantum scale is not known, because extending quantum experiments to curved space poses a challenge. In 2019, Alicia Kollár and colleagues at Princeton University met that challenge with a photonic circuit that represents the negatively curved space of an expanding universe [1]. Now, Igor Boettcher and colleagues at the University of Maryland, College Park, describe those experiments with a new theoretical framework [2]. Together, the studies offer a toolkit for studying quantum mechanics in curved space that could help answer fundamental questions about cosmology.

In a universe that expands at an accelerating rate, space curves away from itself at every point, producing a saddle-like, hyperbolic geometry. To project hyperbolic space onto a plane, Kollár’s team etched a centimeter-sized chip with superconducting resonators arranged in a lattice of heptagonal tiles. By decreasing the tile size toward the edge of the chip, the researchers reproduced a perplexing property of hyperbolic space: most of its points exist on its boundary. As a result, photons moving through the circuit behave like particles moving in negatively curved space.
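The shrinking tiles mirror the Poincaré disk model of hyperbolic space, in which the geodesic distance from the center to Euclidean radius r is 2·artanh(r), so equal hyperbolic steps occupy ever smaller Euclidean distances as r approaches the boundary. A minimal illustration of that distortion (not the authors' code):

```python
import math

def hyperbolic_distance_from_origin(r: float) -> float:
    """Geodesic distance from the disk center to Euclidean radius r
    (0 <= r < 1) in the Poincare disk model of hyperbolic space."""
    return 2.0 * math.atanh(r)

# Tiles of equal hyperbolic size look smaller and smaller in Euclidean
# terms near the boundary: the same hyperbolic step covers less radius.
for r in (0.0, 0.5, 0.9, 0.99):
    d = hyperbolic_distance_from_origin(r)
    print(f"r = {r:4.2f}  ->  hyperbolic distance = {d:6.3f}")
```

The distance diverges as r approaches 1, which is why "most" of hyperbolic space crowds toward the projected boundary, just as the circuit's tiles do.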

Technique prevents errors in quantum computers

Even quantum computers make mistakes. Their computing power is extraordinary, far exceeding that of classical computers, because their circuits are based on qubits that can represent not only zeros and ones but also superpositions of both states, using the principles of quantum mechanics. Despite this great potential, qubits are extremely fragile and prone to errors caused by interactions with the external environment.
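The superposition idea can be sketched with plain complex amplitudes (an illustrative example of a single qubit state, not the paper's error-correction protocol):

```python
import math

# A qubit state |psi> = a|0> + b|1> is a pair of complex amplitudes
# normalized so that |a|^2 + |b|^2 = 1. Measuring the qubit yields
# 0 with probability |a|^2 and 1 with probability |b|^2.
a = 1 / math.sqrt(2)
b = 1 / math.sqrt(2)

p0, p1 = abs(a) ** 2, abs(b) ** 2
assert math.isclose(p0 + p1, 1.0)   # normalization check
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # equal superposition
```

The fragility mentioned above is the fact that any stray interaction with the environment disturbs these amplitudes, or loses the qubit entirely, which is the error type the protocol in the study addresses.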

To solve this crucial issue, an international research group developed and implemented a new protocol that protects fragile quantum information and corrects errors due to loss. This research group published the results of their study in Nature.

“Developing a fully functioning quantum processor still represents a great challenge for scientists across the world,” explains Davide Vodola who is one of the authors of the study as well as a researcher at the University of Bologna. “This research allowed us, for the first time, to implement a protocol that can detect and, at the same time, correct errors due to qubit loss. This ability could prove to be essential for the future development of large-scale quantum computers.”

Transistor-integrated cooling for a more powerful chip

Managing the heat generated in electronics is a huge problem, especially given the constant push to shrink devices and pack as many transistors as possible onto the same chip. The core difficulty is managing such high heat fluxes efficiently. Usually, electronic technologies, designed by electrical engineers, and cooling systems, designed by mechanical engineers, are developed independently and separately. But now, EPFL researchers have quietly revolutionized the process by combining these two design steps into one: they have developed an integrated microfluidic cooling technology, built together with the electronics, that can efficiently manage the large heat fluxes generated by transistors. Their research, which has been published in Nature, will lead to even more compact electronic devices and enable the integration of power converters, with several high-voltage devices, into a single chip.

The best of both worlds

In this ERC-funded project, Professor Elison Matioli, his doctoral student Remco Van Erp, and their team from the School of Engineering’s Power and Wide-band-gap Electronics Research Laboratory (POWERLAB) set out to bring about a real change in design practice by conceiving the electronics and the cooling together, right from the beginning. The group sought to extract heat from very near the regions of the chip that heat up the most. “We wanted to combine skills in electrical and mechanical engineering in order to create a new kind of device,” says Van Erp.
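For a sense of scale, the coolant flow needed to carry away a given heat load follows from the heat-balance relation Q = ṁ·c_p·ΔT. The numbers below are assumed for illustration and are not from the EPFL study:

```python
# Back-of-the-envelope liquid-cooling estimate: a coolant stream carries
# away heat Q = m_dot * c_p * delta_t. All values here are illustrative.
c_p_water = 4186.0   # J/(kg*K), specific heat of water
q_chip = 100.0       # W of heat to remove (assumed chip load)
delta_t = 20.0       # K of allowed coolant temperature rise (assumed)

m_dot = q_chip / (c_p_water * delta_t)   # required mass flow, kg/s
print(f"required coolant flow: {m_dot * 1000:.2f} g/s")
```

Even this crude estimate shows why microfluidic channels are attractive: only about a gram per second of water can absorb a hundred watts, provided the channels bring the liquid close enough to the hot spots.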

An accurate simulation of high-pressure plasma for an economical helical fusion reactor

The research team of Assistant Professor Masahiko Sato and Professor Yasushi Todo of the National Institutes of Natural Sciences (NINS) National Institute for Fusion Science (NIFS) has succeeded, using computer simulation, in reproducing the high-pressure plasma confinement observed in the Large Helical Device (LHD). This result enables highly accurate predictions of plasma behavior, aimed at realizing an economical helical fusion reactor.

To realize fusion energy, high-pressure plasma must be confined by a magnetic field for a long duration. Although higher-pressure plasma can be confined by a stronger magnetic field, generating a stronger field with electromagnetic coils costs more. Therefore, at a given field strength, a device that can confine higher-pressure plasma is economically desirable. Because the LHD has succeeded in maintaining high-pressure plasma, there is great expectation that a helical fusion reactor can be realized.
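This trade-off between plasma pressure and field strength is commonly quantified by the plasma beta, the ratio of plasma pressure to magnetic pressure B²/(2μ₀); a higher beta at fixed field means more economical confinement. A sketch with illustrative numbers (not LHD data):

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, T*m/A

def plasma_beta(pressure_pa: float, b_tesla: float) -> float:
    """Plasma beta: plasma pressure divided by magnetic pressure."""
    magnetic_pressure = b_tesla ** 2 / (2 * MU0)
    return pressure_pa / magnetic_pressure

# Example: 1e5 Pa of plasma pressure in a 3 T field (assumed numbers).
print(f"beta = {plasma_beta(1e5, 3.0):.3%}")
```

Because the magnetic pressure grows as B², doubling the field quarters the beta at fixed plasma pressure, which is why squeezing more pressure out of the same coils matters so much for reactor economics.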

Design research for a future fusion reactor is based on computer simulations that predict the behavior of magnetically confined plasma, so highly accurate simulations are required. To confirm their accuracy, the simulations must reproduce experimental results obtained with existing devices. However, simulations had not reproduced the LHD results showing that high-pressure plasma can be maintained, which has been a serious problem for the design of an economical helical fusion reactor.

Math Riddle From Decades Ago Finally Solved After Being Lost And Found

A pair of Danish computer scientists have solved a longstanding mathematics puzzle that had lain dormant since the 1990s, when researchers last made substantial progress on it.

The abstract problem in question is part of what’s called graph theory, and specifically concerns the challenge of finding an algorithm to resolve the planarity of a dynamic graph. That might sound a bit daunting, so if your graph theory is a little rusty, there’s a much more fun and accessible way of thinking about the same inherent ideas.

Going as far back as 1913 – although the mathematical concepts can probably be traced back much further – a puzzle called the three utilities problem was published.
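In graph-theory terms, the three utilities problem asks whether three houses can each be connected to three utilities with no crossing lines, i.e., whether the complete bipartite graph K₃,₃ is planar. A quick impossibility check uses the bipartite planar edge bound e ≤ 2v − 4, which follows from Euler's formula (an illustrative check, not the researchers' dynamic-planarity algorithm):

```python
# The three utilities puzzle is the question of whether K_{3,3} is planar.
# For a connected simple bipartite planar graph, Euler's formula
# (v - e + f = 2) implies e <= 2v - 4; K_{3,3} violates this bound,
# so the puzzle has no solution in the plane.
v = 3 + 3      # vertices: 3 houses + 3 utilities
e = 3 * 3      # edges: every house joined to every utility
print(f"edges = {e}, bipartite planar bound = {2 * v - 4}")  # 9 > 8
assert e > 2 * v - 4   # bound violated: K_{3,3} is not planar
```

The dynamic version solved by the Danish researchers is much harder: it asks for an efficient algorithm that maintains the planar/non-planar verdict as edges are inserted and deleted, rather than checking a fixed graph once.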

DARPA teams begin work on tiny brain implant to treat PTSD

Circa 2014.


The Defense Advanced Research Projects Agency, or DARPA, has announced the start of a five-year, $26 million effort to develop brain implants that can treat mental disease with deep-brain stimulation.

The hope is to implant electrodes in different regions of the brain along with a tiny chip placed between the brain and the skull. The chip would monitor electrical signals in the brain and send data wirelessly back to scientists in order to gain a better understanding of psychological diseases like Post-Traumatic Stress Disorder (PTSD). The implant would also be used to trigger electrical impulses in order to relieve symptoms.

DARPA has chosen two teams that will pursue different approaches. A team from the University of California San Francisco will use direct recording, stimulation, and therapy to take advantage of the brain’s plasticity. Circuits that appear to drive pathology would be rewired, and eventually the patient could remove the implants.
