
Researchers used deep reinforcement learning to steer atoms into a lattice shape, with a view to building new materials or nanodevices.

In a very cold vacuum chamber, single atoms of silver form a star-like lattice. The precise formation is not accidental, and it wasn’t constructed directly by hand either. Researchers used a kind of artificial intelligence called deep reinforcement learning to steer the atoms, each a fraction of a nanometer in size, into the lattice shape. The process is similar to moving marbles around a Chinese checkers board, but with very tiny tweezers grabbing and dragging each atom into place.

The main application for deep reinforcement learning is in robotics, says postdoctoral researcher I-Ju Chen. “We’re also building robotic arms with deep learning, but for moving atoms,” she explains. “Reinforcement learning is successful in things like playing chess or video games, but we’ve applied it to solve problems at the nanoscale.”
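
As a rough sketch of the idea (not the researchers’ actual deep-learning setup), the Python example below uses simple tabular Q-learning to steer an “atom” across a toy grid toward a target lattice site; the grid size, rewards, and moves are invented purely to illustrate the reinforcement-learning loop.

```python
# A toy sketch only: tabular Q-learning steering an "atom" on a 5x5 grid
# toward a target lattice site. The actual work uses deep reinforcement
# learning and a scanning probe tip; the grid, rewards, and moves here are
# invented purely to illustrate the reinforcement-learning loop.
import random

GRID = 5
TARGET = (4, 4)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # drag directions

q = {}  # Q-values keyed by (state, action)

def step(state, action):
    """Drag the atom one site; small penalty per move, reward at the target."""
    x = min(max(state[0] + action[0], 0), GRID - 1)
    y = min(max(state[1] + action[1], 0), GRID - 1)
    new_state = (x, y)
    reward = 1.0 if new_state == TARGET else -0.01
    return new_state, reward, new_state == TARGET

def choose(state, eps=0.1):
    """Epsilon-greedy action selection."""
    if random.random() < eps:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q.get((state, a), 0.0))

for episode in range(500):
    state, done = (0, 0), False
    while not done:
        action = choose(state)
        new_state, reward, done = step(state, action)
        best_next = max(q.get((new_state, a), 0.0) for a in ACTIONS)
        old = q.get((state, action), 0.0)
        q[(state, action)] = old + 0.1 * (reward + 0.9 * best_next - old)
        state = new_state

# Follow the learned greedy policy from the start site to the target.
state, path = (0, 0), [(0, 0)]
while state != TARGET and len(path) < 50:
    state, _, _ = step(state, choose(state, eps=0.0))
    path.append(state)
print("Greedy path:", path)
```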

This research could potentially lead to a better understanding of the universe and its many mysteries.

It’s a cosmic riddle: How can galaxies remain together when all the matter we observe isn’t enough to keep them intact? Scientists believe an invisible force must be at play, something so mysterious that they named it “dark matter” because it cannot be seen.

This unseen component accounts for roughly 27% of the universe’s total mass and energy, far more than all the matter we can observe directly. Yet dark matter remains a profound mystery to scientists: despite making up more than a quarter of the cosmos, it has so far evaded every attempt at detection.



After recombining the superposed photons by sending them through another crystal, the team measured the photon polarization across a number of repeated experiments. They found a quantum interference pattern, alternating light and dark stripes that could exist only if the photon had been split and was moving in both directions of time.

“The superposition of processes we realized is more akin to an object spinning clockwise and counter-clockwise at the same time,” Strömberg said. The researchers created their time-flipped photon out of intellectual curiosity, but follow-up experiments showed that time flips can be paired with reversible logic gates to enable computation in both directions at once, potentially opening the way to quantum processors with greatly enhanced processing power.

Theoretical possibilities also sprout from the work. A future theory of quantum gravity, which would unite general relativity and quantum mechanics, should include particles of mixed time orientations like the one in this experiment, and could enable the researchers to peer into some of the universe’s most mysterious phenomena.

The concept of ‘anti-realism’ is seen as a fact of life by many physicists studying the mysterious effects of quantum mechanics. However, it also seems to contradict the assumptions of many other fields of research. In his research, Dr William Sulis at McMaster University in Canada explores the issue from a new perspective, using a novel mathematical toolset named the ‘process algebra model’. In suggesting that reality itself is generated by interacting processes more fundamental than quantum particles, his theories could improve researchers’ understanding of fundamental processes in a wide variety of fields.

The concept of ‘locality’ states that objects and processes can only be influenced by other objects and processes in their immediate surroundings. It is a fundamental aspect of many fields of research and underpins all of the most complex systems we observe in nature, including living organisms. “Biologists and psychologists have known for centuries that the physical world is dominated by processes which are characterized by factors including transformation, interdependence, and information”, Dr Sulis explains. “Organisms are born, develop, continually exchange physical components and information with their environment, and eventually die.”

Beyond biology, the principle of locality also extends to Einstein’s theory of special relativity. Since the speed of light sets a fundamental speed limit on all processes in the universe, the theory states that no process can occur if it has not been triggered by another event in its past, at a close enough distance for light to travel between them within the time separating them. In general, these theories are unified by a concept which physicists call ‘realism’. Yet despite this seemingly intuitive rule, physicists have increasingly come to accept the idea that it doesn’t present a full description of how all processes unfold.
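
The light-speed constraint described above can be written as a simple check: two events can be causally related only if light could cross the distance between them within the time that separates them. The short Python sketch below illustrates this with invented numbers.

```python
# Invented numbers, real constraint: two events can be causally connected
# only if a light-speed signal could cover the distance between them within
# the time that separates them.
C = 299_792_458.0  # speed of light, m/s

def causally_connectable(dx_m, dt_s):
    """True if something at or below light speed can cross dx_m in dt_s."""
    return dt_s > 0 and abs(dx_m) <= C * dt_s

# An event about one light-second away (~3e8 m) cannot influence us
# half a second later, but it can two seconds later.
print(causally_connectable(3.0e8, 0.5))  # False
print(causally_connectable(3.0e8, 2.0))  # True
```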

Researchers at Tohoku University, the University of Messina, and the University of California, Santa Barbara (UCSB) have developed a scaled-up version of a probabilistic computer (p-computer) with stochastic spintronic devices that is suitable for hard computational problems like combinatorial optimization and machine learning.

Moore’s law predicts that computers get faster every two years because of the evolution of semiconductor chips. While this is what has historically happened, the continued evolution is starting to lag. The revolutions in machine learning and artificial intelligence mean that much higher computational ability is required. Quantum computing is one way of meeting these challenges, but significant hurdles to the practical realization of scalable quantum computers remain.

A p-computer harnesses naturally stochastic building blocks called probabilistic bits (p-bits). Unlike bits in traditional computers, p-bits fluctuate between states. A p-computer can operate at room temperature and acts as a domain-specific computer for a wide variety of applications in machine learning and artificial intelligence. Just as quantum computers try to solve inherently quantum problems, p-computers attempt to tackle probabilistic algorithms, widely used for complicated computational problems in combinatorial optimization and sampling.
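
A p-bit can be imitated in software. The sketch below assumes the update rule commonly used in the p-computing literature, in which each p-bit randomly takes the value +1 or −1 with a bias set by the input from its neighbours; the two-bit coupling is an invented toy problem, not the spintronic hardware described above.

```python
# A software imitation of two coupled p-bits, assuming the update rule
# commonly used in the p-computing literature: each p-bit is set to +1 or -1
# at random, biased by tanh of the input it receives from its neighbours.
# The coupling below is an invented toy problem (it rewards the two bits
# agreeing), not the spintronic hardware described above.
import math
import random

J = [[0.0, 1.0],
     [1.0, 0.0]]   # positive coupling favours equal states
h = [0.0, 0.0]     # no local bias
m = [1, -1]        # current p-bit states

counts = {}
for sweep in range(10_000):
    for i in range(len(m)):
        I = h[i] + sum(J[i][j] * m[j] for j in range(len(m)))     # net input
        m[i] = 1 if math.tanh(I) > random.uniform(-1, 1) else -1  # stochastic flip
    counts[tuple(m)] = counts.get(tuple(m), 0) + 1

# The aligned states (1, 1) and (-1, -1) should dominate the samples.
print(counts)
```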

In 1916, Einstein finished his Theory of General Relativity, which describes how gravitational forces alter the curvature of spacetime. Among other things, this theory predicted that the Universe is expanding, which was confirmed by the observations of Edwin Hubble in 1929. Since then, astronomers have looked farther into space (and hence, back in time) to measure how fast the Universe is expanding, also known as the Hubble Constant. These measurements have become increasingly accurate thanks to the discovery of the Cosmic Microwave Background (CMB) and observatories like the Hubble Space Telescope.

Astronomers have traditionally done this in two ways: directly measuring it locally (using variable stars and supernovae) and indirectly based on redshift measurements of the CMB and cosmological models. Unfortunately, these two methods have produced different values over the past decade. As a result, astronomers have been looking for a possible solution to this problem, known as the “Hubble Tension.” According to a new paper by a team of astrophysicists, the existence of “Early Dark Energy” may be the solution cosmologists have been looking for.
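
To see where the tension comes from, recall Hubble’s law, v = H0 × d. The Python sketch below plugs in the commonly quoted local and CMB-inferred values (about 73 and 67.4 km/s/Mpc); the individual galaxy’s numbers are invented for illustration.

```python
# Hubble's law: v = H0 * d. The two H0 values below are the commonly quoted
# local (distance-ladder) and CMB-inferred numbers; the single galaxy's
# velocity and distance are invented for illustration.
H0_LOCAL = 73.0   # km/s/Mpc, Cepheids + Type Ia supernovae
H0_CMB = 67.4     # km/s/Mpc, inferred from the CMB with the standard model

v_kms = 7300.0    # recession velocity of a made-up galaxy, km/s (from redshift)
d_mpc = 100.0     # its distance in Mpc (from the local distance ladder)

h0_estimate = v_kms / d_mpc
print(f"Local estimate: {h0_estimate:.1f} km/s/Mpc")
print(f"Offset from the CMB-inferred value: {h0_estimate - H0_CMB:.1f} km/s/Mpc")
```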

The study was conducted by Marc Kamionkowski, the William R. Kenan Jr. Professor of Physics and Astronomy at Johns Hopkins University (JHU), and Adam G. Riess, an astrophysicist and Bloomberg Distinguished Professor at JHU and the Space Telescope Science Institute (STScI). Their paper, titled “The Hubble Tension and Early Dark Energy,” is being reviewed for publication in the Annual Review of Nuclear and Particle Science (ARNP). As they explain in their paper, there are two methods for measuring cosmic expansion.

Establishing a moon base will be critical for the U.S. in the new space race, and building safe and cost-effective landing pads for spacecraft to touch down there will be key.

These pads will have to stop dust and rock particles from sandblasting everything around them at more than 10,000 miles per hour as a rocket takes off or lands, since there is no air to slow the rocket plume down.

However, how to build these landing pads is not so clear, as hauling materials and heavy equipment more than 230,000 miles into space quickly becomes cost prohibitive.

Dark matter makes up about 27% of the matter and energy budget in the universe, but scientists do not know much about it. They do know that it is cold, meaning that the particles that make up dark matter are slow-moving. It is also difficult to detect dark matter directly because it does not interact with light. However, scientists at the U.S. Department of Energy’s Fermi National Accelerator Laboratory (Fermilab) have discovered a way to use quantum computers to look for dark matter.

Aaron Chou, a senior scientist at Fermilab, works on detecting dark matter through quantum science. As part of DOE’s Office of High Energy Physics QuantISED program, he has developed a way to use qubits, the main component of quantum computing systems, to search for dark matter.


Stephen Wolfram is at his jovial peak in this technical interview regarding the Wolfram Physics project (theory of everything).
Sponsors: https://brilliant.org/TOE for 20% off. http://algo.com for supply chain AI.

Link to the Wolfram project: https://www.wolframphysics.org/

Patreon: https://patreon.com/curtjaimungal.
Crypto: https://tinyurl.com/cryptoTOE
PayPal: https://tinyurl.com/paypalTOE
Twitter: https://twitter.com/TOEwithCurt.
Discord Invite: https://discord.com/invite/kBcnfNVwqs.
iTunes: https://podcasts.apple.com/ca/podcast/better-left-unsaid-wit…1521758802
Pandora: https://pdora.co/33b9lfP
Spotify: https://open.spotify.com/show/4gL14b92xAErofYQA7bU4e.
Subreddit r/TheoriesOfEverything: https://reddit.com/r/theoriesofeverything.
Merch: https://tinyurl.com/TOEmerch.


All proton-proton data collected by the CMS experiment during LHC Run-1 (2010−2012) are now available through the CERN Open Data Portal. Today’s release of 491 TB of collision data collected during 2012 culminates the process that started in 2014 with the very first release of research-grade open data in experimental particle physics. Completing the delivery of Run-1 data within 10 years after data taking reaffirms the CMS collaboration’s commitment to its open data policy.

The newly released data consist of 42 collision datasets from CMS data taken in early and late 2012 and amount to an additional 8.2 fb⁻¹ of integrated luminosity for anyone to study. Related data assets, such as luminosity information and validated data filters, have been updated to cover the newly released data.
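
For a sense of what that luminosity means, the expected yield of a given process is N = σ × L. The following back-of-the-envelope Python sketch uses a hypothetical cross section, not a CMS measurement.

```python
# Back-of-the-envelope only: the expected number of events for a process is
# N = sigma * L. The cross section here is a hypothetical example value,
# not a CMS measurement.
lumi_fb = 8.2              # newly released integrated luminosity, fb^-1
sigma_pb = 50.0            # hypothetical cross section, pb

lumi_pb = lumi_fb * 1000.0          # 1 fb^-1 = 1000 pb^-1
expected_events = sigma_pb * lumi_pb
print(f"Expected events: {expected_events:,.0f}")  # about 410,000
```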

To foster reusability, physics analysis code examples that extract physics objects from these data are now included as CERN Open Data Portal records. This software has been successfully used to demonstrate the intricacies of experimental particle physics data in the CMS Open Data workshops held over the last three years. In addition, the CMS Open Data guide covers details of accessing physics objects using this software, giving open data users the possibility to expand on this example code for studies of their own interest.
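
For readers who prefer a lightweight first look outside the official CMS software stack, a ROOT file from the portal can also be opened with the community Python tool uproot; the file path and branch name below are placeholders, and the actual Open Data Portal records describe the real files and formats.

```python
# A lightweight way to peek at a ROOT file outside the official CMS software,
# using the community tool uproot. The file path and branch name are
# placeholders; the Open Data Portal records list the real files and formats.
import uproot

with uproot.open("cms_open_data_2012_example.root") as f:   # placeholder path
    events = f["Events"]                                     # event tree (name may differ)
    muon_pt = events.arrays(["Muon_pt"], library="np")["Muon_pt"]
    print("Events read:", len(muon_pt))
```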