
Physicists can exploit tailored physical systems to rapidly solve challenging computational tasks, as demonstrated by spin simulators, combinatorial optimization and focusing light through scattering media. In a new report in Science Advances, C. Tradonsky and a group of researchers in physics departments in Israel and India addressed the phase retrieval problem: reconstructing an object from its scattered intensity distribution. The work tackles a long-standing problem in disciplines ranging from X-ray imaging to astrophysics, where scientists lack direct techniques to reconstruct an object of interest and typically rely on indirect iterative algorithms that are inherently slow.

In the new optical approach, Tradonsky et al. instead used a digital degenerate cavity laser (DDCL) to rapidly and efficiently reconstruct the object of interest. The experimental results suggested that the gain competition between the many lasing modes acted as a highly parallel computer to rapidly solve the phase retrieval problem. The approach applies to two-dimensional (2-D) objects with known compact support and to complex-valued objects, and can be generalized to imaging through scattering media and other challenging computational tasks.

The intensity distribution of light scattered far from an unknown object is relatively easy to calculate: it is the squared magnitude (absolute value squared) of the object’s Fourier transform. The reconstruction of an object from its scattered intensity distribution is, however, ill-posed, since the phase information is lost and different phase distributions result in different reconstructions. Scientists must therefore rely on prior information about the object, such as its shape, positivity, spatial symmetry or sparsity, to obtain more precise reconstructions. Such problems arise in astronomy, short-pulse characterization studies, X-ray diffraction, radar detection, speech recognition and imaging through turbid media. For objects of finite extent (compact support), the phase retrieval problem has a unique solution, provided the scattered intensity is sampled at a sufficiently high resolution.
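The indirect iterative algorithms mentioned above alternate between enforcing the measured Fourier magnitude and the known support. The following is a minimal sketch of one classic scheme of this kind, error reduction in the Gerchberg–Saxton/Fienup family, not the paper's laser-based method; the toy object, support mask and iteration count are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy object with compact support: nonzero only in a small block.
n = 32
obj = np.zeros((n, n))
obj[4:12, 6:14] = rng.random((8, 8))

support = obj > 0                        # prior knowledge: the support mask
measured_mag = np.abs(np.fft.fft2(obj))  # the measurement gives |F|; phase is lost

# Error-reduction iteration: project between the two constraint sets.
g = rng.random((n, n)) * support         # random start inside the support
for _ in range(500):
    G = np.fft.fft2(g)
    G = measured_mag * np.exp(1j * np.angle(G))  # enforce measured magnitude
    g = np.fft.ifft2(G).real
    g = np.where(support & (g > 0), g, 0.0)      # enforce support and positivity

err = np.linalg.norm(np.abs(np.fft.fft2(g)) - measured_mag) / np.linalg.norm(measured_mag)
print(f"relative Fourier-magnitude error: {err:.3f}")
```

Each pass through the loop costs two FFTs, and convergence can stagnate in local minima, which is why such iterative reconstructions are slow compared with the parallel optical solver described in the article.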

The universe is kind of an impossible object. It has an inside but no outside; it’s a one-sided coin. This Möbius architecture presents a unique challenge for cosmologists, who find themselves in the awkward position of being stuck inside the very system they’re trying to comprehend.

It’s a situation that Lee Smolin has been thinking about for most of his career. A physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada, Smolin works at the knotty intersection of quantum mechanics, relativity and cosmology. Don’t let his soft voice and quiet demeanor fool you — he’s known as a rebellious thinker and has always followed his own path. In the 1970s Smolin dropped out of high school, played in a rock band called Ideoplastos, and published an underground newspaper. Wanting to build geodesic domes like R. Buckminster Fuller, Smolin taught himself advanced mathematics — the same kind of math, it turned out, that you need to play with Einstein’s equations of general relativity. The moment he realized this was the moment he became a physicist. He studied at Harvard University and took a position at the Institute for Advanced Study in Princeton, New Jersey, eventually becoming a founding faculty member at the Perimeter Institute.

“Perimeter,” in fact, is the perfect word to describe Smolin’s place near the boundary of mainstream physics. When most physicists dived headfirst into string theory, Smolin played a key role in working out the competing theory of loop quantum gravity. When most physicists said that the laws of physics are immutable, he said they evolve according to a kind of cosmic Darwinism. When most physicists said that time is an illusion, Smolin insisted that it’s real.

A new paper from researchers at the University of Chicago introduces a technique for compiling highly optimized quantum instructions that can be executed on near-term hardware. This technique is particularly well suited to a new class of variational quantum algorithms, which are promising candidates for demonstrating useful quantum speedups. The new work was enabled by uniting ideas across the stack, spanning quantum algorithms, machine learning, compilers, and device physics. The interdisciplinary research was carried out by members of the EPiQC (Enabling Practical-scale Quantum Computation) collaboration, an NSF Expedition in Computing.

Adapting to a New Paradigm for Quantum Algorithms

The original vision for quantum computing dates to the early 1980s, when physicist Richard Feynman proposed performing molecular simulations using just thousands of noiseless qubits (quantum bits), a task practically impossible for traditional computers. Other algorithms developed in the 1990s and 2000s demonstrated that thousands of noiseless qubits would also offer dramatic speedups for problems such as database search, integer factoring, and matrix algebra. However, despite recent advances in quantum hardware, these algorithms are still decades away from scalable realizations, because current hardware features noisy qubits.

Something called the fast Fourier transform is running on your cell phone right now. The FFT, as it is known, is a signal-processing algorithm that you use more than you realize. It is, according to the title of one research paper, “an algorithm the whole family can use.”

Alexander Stoytchev – an associate professor of electrical and computer engineering at Iowa State University who’s also affiliated with the university’s Virtual Reality Applications Center, its Human Computer Interaction graduate program and the department of computer science – says the FFT algorithm and its inverse (known as the IFFT) are at the heart of signal processing.

And, as such, “These are algorithms that made the digital revolution possible,” he said.
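The FFT/IFFT pair described above can be demonstrated in a few lines of NumPy. The two-tone signal, sample rate and frequencies below are made up for illustration; the point is that the FFT moves a signal into the frequency domain, where its component tones become visible, and the IFFT brings it back essentially unchanged:

```python
import numpy as np

# A hypothetical two-tone signal sampled at 8 kHz, the kind of data a
# phone's signal-processing chain handles constantly.
fs = 8000
t = np.arange(256) / fs
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)

spectrum = np.fft.fft(signal)       # FFT: time domain -> frequency domain
recovered = np.fft.ifft(spectrum)   # IFFT: frequency domain -> time domain

# The 440 Hz tone dominates the spectrum, near bin 440 * 256 / 8000 ≈ 14.
dominant_bin = int(np.argmax(np.abs(spectrum[:128])))
print(dominant_bin)

# The round trip reconstructs the signal to floating-point precision.
print(np.allclose(signal, recovered.real))
```

The FFT computes this transform in O(N log N) operations instead of the O(N²) of the naive discrete Fourier transform, which is why it is fast enough to run continuously on a phone.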

Sensitive synthetic skin enables robots to sense their own bodies and surroundings—a crucial capability if they are to be in close contact with people. Inspired by human skin, a team at the Technical University of Munich (TUM) has developed a system combining artificial skin with control algorithms and used it to create the first autonomous humanoid robot with full-body artificial skin.

The skin developed by Prof. Gordon Cheng and his team consists of hexagonal cells about the size of a two-euro coin (i.e. about one inch in diameter). Each cell is equipped with a microprocessor and sensors to detect contact, acceleration, proximity and temperature. Such artificial skin enables robots to perceive their surroundings in much greater detail and with more sensitivity. This not only helps them to move safely. It also makes them safer when operating near people and gives them the ability to anticipate and actively avoid accidents.

The skin cells themselves were developed around 10 years ago by Gordon Cheng, Professor of Cognitive Systems at TUM. But this invention only revealed its full potential when integrated into a sophisticated system, as described in the latest issue of the journal Proceedings of the IEEE.

Two University of Hawaiʻi at Mānoa researchers have identified and corrected a subtle error that was made when applying Einstein’s equations to model the growth of the universe.

Physicists usually assume that a cosmologically large system, such as the universe, is insensitive to details of the small systems contained within it. Kevin Croker, a postdoctoral research fellow in the Department of Physics and Astronomy, and Joel Weiner, a faculty member in the Department of Mathematics, have shown that this assumption can fail for the compact objects that remain after the collapse and explosion of very large stars.

“For 80 years, we’ve generally operated under the assumption that the universe, in broad strokes, was not affected by the particular details of any small region,” said Croker. “It is now clear that general relativity can observably connect collapsed stars—regions the size of Honolulu—to the behavior of the universe as a whole, over a thousand billion billion times larger.”