
Autonomous robotic nanofabrication with reinforcement learning

The ability to handle single molecules as effectively as macroscopic building blocks would enable the construction of complex supramolecular structures inaccessible to self-assembly. The fundamental challenges obstructing this goal are the uncontrolled variability and poor observability of atomic-scale conformations. Here, we present a strategy to work around both obstacles and demonstrate autonomous robotic nanofabrication by manipulating single molecules. Our approach uses reinforcement learning (RL), which finds solution strategies even in the face of large uncertainty and sparse feedback. We demonstrate the potential of our RL approach by autonomously removing single molecules from a supramolecular structure with a scanning probe microscope. Our RL agent achieves excellent performance, enabling us to automate a task that previously had to be performed by a human. We anticipate that our work opens the way toward autonomous agents for the robotic construction of functional supramolecular structures with speed, precision, and perseverance beyond our current capabilities.

The swift development of quantum technologies could be further advanced if we managed to free ourselves from the imperatives of crystal growth and self-assembly and learned to fabricate custom-built metastable structures on atomic and molecular length scales routinely (1–7). Metastable structures, apart from being more abundant than stable ones, tend to offer attractive functionalities, because their constituent building blocks can be arranged more freely and in particular in desired functional relationships (7).

It is well established that single molecules can be manipulated and arranged using mechanical, optical, or magnetic actuators (8), such as the tips of scanning probe microscopes (SPMs) (9–12) or optical tweezers (13, 14). With all these types of actuators, a sequence of manipulation steps can be carried out to bring a system of molecular building blocks into a desired target state. The problem of creating custom-built structures from single molecules can therefore be cast as a challenge in robotics.
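The combination the abstract describes, RL finding a manipulation strategy under sparse feedback, can be illustrated with a minimal tabular Q-learning sketch. Everything below (the discretized tip heights, the single success reward) is a hypothetical toy stand-in, not the authors' actual setup, which manipulates real molecules with an SPM:

```python
import random

# Toy illustration: tabular Q-learning with a sparse reward. The agent
# must retract a hypothetical SPM tip step by step, and is rewarded
# only when the tip is fully retracted (molecule removed) -- the kind
# of sparse success/failure signal the article describes.
N_STATES = 10        # discretized tip heights: 0 = on surface, 9 = retracted
ACTIONS = [-1, 1]    # move the tip down or up by one step
ALPHA, GAMMA, EPS = 0.5, 0.95, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Deterministic toy environment with a sparse success reward."""
    nxt = max(0, min(N_STATES - 1, state + action))
    done = nxt == N_STATES - 1          # molecule removed
    return nxt, (1.0 if done else 0.0), done

def greedy(state):
    """Greedy action with random tie-breaking."""
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

random.seed(0)
for _ in range(500):                     # training episodes
    s, done = 0, False
    while not done:
        a = random.choice(ACTIONS) if random.random() < EPS else greedy(s)
        nxt, r, done = step(s, a)
        # Standard Q-learning update toward the bootstrapped target.
        target = r + GAMMA * max(Q[(nxt, x)] for x in ACTIONS)
        Q[(s, a)] += ALPHA * (target - Q[(s, a)])
        s = nxt

# The learned greedy policy retracts the tip (+1) from every state.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
```

Despite the reward arriving only at the very end of an episode, the bootstrapped Q-values propagate that single success signal back to every earlier state, which is why RL suits tasks where intermediate feedback is unavailable.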

New Website Lets You Help NASA Find Alien Worlds

NASA just launched a new citizen science project — it wants the public’s help to find and identify brand new exoplanets.


Human Touch

This is the sort of work that technically could be automated with an algorithm trained to spot new worlds, Space.com reports. But it turns out that in this case, there’s no substitute for human judgment.

“Automated methods of processing TESS data sometimes fail to catch imposters that look like exoplanets,” Veselin Kostov, the NASA researcher leading the Planet Patrol project, said in a press release. “The human eye is extremely good at spotting such imposters, and we need citizen scientists to help us distinguish between the lookalikes and genuine planets.”

Congress Wants a ‘Manhattan Project’ for Military Artificial Intelligence

A new bipartisan congressional report calls for the Defense Department to get a lot more serious about the race to acquire artificial intelligence and autonomous capabilities, modeling efforts to become dominant in these spheres after the “Manhattan Project” initiative to test and develop nuclear weapons in the 1940s.

On Tuesday, the House Armed Services Committee released the results of a yearlong review, co-led by Reps. Seth Moulton, D-Mass., and Jim Banks, R-Ind., aimed at assessing U.S. military capabilities and preparedness to meet current threats. The 87-page Future of Defense Task Force Report contains some expected findings — China and Russia are identified as the top security threats to the U.S. and modernization is described as an urgent need — but there are surprising points of emphasis.



This AI Generates Photos Using Only Text Captions as a Guide

Researchers at the Allen Institute for Artificial Intelligence (AI2) have created a machine learning algorithm that can produce images using only text captions as its guide. The results are somewhat terrifying… but if you can look past the nightmare fuel, this creation represents an important step forward in the study of AI and imaging.

Unlike some of the genuinely mind-blowing machine learning algorithms we’ve shared in the past — see here, here, and here — this creation is more of a proof-of-concept experiment. The idea was to take a well-established computer vision model that can caption photos based on what it “sees” in the image, and reverse it: producing an AI that can generate images from captions, instead of the other way around.

This is a fascinating area of study and, as MIT Technology Review points out, it shows in real terms how limited these computer vision algorithms really are. While even a small child can do both of these things readily—describe an image in words, or conjure a mental picture of an image based on those words—when the Allen Institute researchers tried to generate a photo from a text caption using a model called LXMERT, it generated nonsense in return.

NATO’s Autonomous Drone Delivery Experiment Works


DroneUp and NATO Allied Command Transformation ran an experiment to test a new and innovative way of resupplying soldiers on the battlefield. The experiment proved that autonomous drone delivery works.

“DroneUp recently partnered with North Atlantic Treaty Organization Allied Command Transformation, Joint Force Development Directorate, Operational Experimentation branch in an experiment designed to determine if autonomous delivery of a specified payload to identified recipients under field conditions could be proven viable,” says a press release.

The experiment took place on September 21, 2020, in Lawrenceville, VA and included Pale Horse Weapons Institute, Daniel Defense, Ultimate Training Munitions (UTM), and WeaponLogic. In summary, here’s how the autonomous drone delivery system test worked: soldiers running out of ammunition hit a button (which can be attached to their hat or clothing). That button signals a drone to fly to that individual soldier and drop a payload, which can be unique to that individual. Then the drone returns home for the next mission.
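The request-to-delivery workflow described above can be sketched as a simple dispatch function. This is purely illustrative: the article describes the workflow, not an API, so every name, identifier, and payload mapping here is a hypothetical assumption.

```python
from dataclasses import dataclass

@dataclass
class ResupplyRequest:
    """One button press: which soldier, and where they are."""
    soldier_id: str
    position: tuple  # (lat, lon) hypothetically reported by the wearable button

# Hypothetical mapping: the article notes the payload can be unique
# to each individual soldier.
PAYLOADS = {
    "alpha-1": "5.56mm magazines",
    "alpha-2": "7.62mm magazines",
}

def dispatch(request: ResupplyRequest) -> list:
    """Return the mission steps a drone would execute for one request:
    fly to the soldier, drop their payload, return home."""
    payload = PAYLOADS.get(request.soldier_id, "standard resupply")
    return [
        f"fly to {request.position}",
        f"drop {payload} for {request.soldier_id}",
        "return to base for next mission",
    ]

steps = dispatch(ResupplyRequest("alpha-1", (36.75, -77.85)))
```

Keeping the payload lookup separate from the flight steps mirrors the article's point that the delivery itself is autonomous while the cargo is tailored per soldier.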