
Today's robots can vacuum a floor or perform a preset dance, but they remain far from their full potential. The main obstacle is that robots cannot recognize what is in their environment at any deep level, so they cannot function properly unless humans tell them every detail. For instance, a robot may be programmed to back up after bumping into an object, which helps prevent repeat collisions, but this behaviour reflects no real understanding: the robot has no idea what a chair actually is!

Facebook's AI team has just released Droidlet, a new platform that makes it easier for anyone to build their own smart robot. It's an open-source project explicitly designed with hobbyists and researchers in mind, so you can quickly prototype your AI algorithms without spending countless hours coding everything from scratch.

Droidlet is a platform for building embodied agents that can recognize, react to, and navigate the world. It simplifies the integration of state-of-the-art machine learning algorithms into these systems, so users can prototype new ideas faster than ever before!

Demis Hassabis, DeepMind's CEO and co-founder, said: “We believe this work represents the most significant contribution AI has made to advancing the state of scientific knowledge to date. And I think it’s a great illustration and example of the kind of benefits AI can bring to society. We’re just so excited to see what the community is going to do with this.”


AlphaFold is an artificial intelligence (AI) program that uses deep learning to predict the 3D structure of proteins. Developed by DeepMind, a London-based subsidiary of Google, it made headlines in November 2020 when it competed in the Critical Assessment of protein Structure Prediction (CASP). This worldwide challenge is held every two years by the scientific community and is the most well-known protein modelling benchmark. Participants must “blindly” predict the 3D structures of different proteins, and their computational methods are subsequently compared with real-world laboratory results.

The CASP challenge has been held since 1994 and uses a metric known as the Global Distance Test (GDT), ranging from 0 to 100. Winners in previous years had tended to hover around the 30 to 40 mark, with a score of 90 considered to be equivalent to an experimentally determined result. In 2018, however, the team at DeepMind achieved a median of 58.9 for the GDT and an overall score of 68.5 across all targets, by far the highest of any algorithm.
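To make the scores above concrete: GDT (in its common GDT_TS form) averages, over four distance cutoffs, the percentage of a model's residues that land within that cutoff of the experimentally determined structure. The sketch below is an illustrative simplification, not CASP's official implementation, and assumes the per-residue deviations have already been computed after optimal superposition:

```python
def gdt_ts(deviations, cutoffs=(1.0, 2.0, 4.0, 8.0)):
    """Illustrative GDT_TS: for each cutoff (in angstroms), take the
    fraction of residues whose deviation from the reference structure
    is within the cutoff, then average the four fractions (as a %)."""
    n = len(deviations)
    fractions = [sum(d <= c for d in deviations) / n for c in cutoffs]
    return 100.0 * sum(fractions) / len(fractions)

# Toy example: 4 residues deviating by 0.5, 1.5, 3.0 and 9.0 angstroms.
score = gdt_ts([0.5, 1.5, 3.0, 9.0])  # fractions 1/4, 2/4, 3/4, 3/4
```

Under this metric a perfect prediction scores 100, which is why a GDT around 90 is treated as indistinguishable from experiment.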

Then in 2020, version 2.0 of the AlphaFold program competed in CASP, winning once again – this time with even greater accuracy. AlphaFold 2.0 achieved a median of 92.4 across all targets, with an average margin of error comparable to the width of an atom (0.16 nanometres). Andrei Lupas, a biologist at the Max Planck Institute in Germany who assessed the performance of each team in CASP, said of AlphaFold: “This will change medicine. It will change research. It will change bioengineering. It will change everything.”

The patent applications claimed that Dabus, an AI system made up of artificial neural networks, invented an emergency warning light and a type of food container, among other inventions.

Several countries, including Australia, had rejected the applications, stating that a human must be named as the inventor. The decision by the Australian deputy commissioner of patents in February this year found that although “inventor” was not defined in the Patents Act when it was written in 1991, it would have been understood to mean a natural person – with machines being tools that could be used by inventors.

But in a Federal Court judgment on Friday, Justice Jonathan Beach overturned the decision and sent the matter back to the commissioner for reconsideration.

Every dad should do this. 😃


French dad and robotics engineer Jean-Louis Constanza has built a robotic suit for his 16-year-old son Oscar that allows him to walk.

Oscar, a wheelchair user, activates the suit by saying “Robot, stand up” and it then walks for him.

Jean-Louis co-founded the company that builds the suit, which can allow users to move upright for a few hours a day.

It is used in several hospitals, but it isn’t yet available for everyday use by individuals and has a price tag of around €150,000 (about £127,700).

Granted, it’s a little different for a robot, since they don’t have lungs or a heart. But they do have a “brain” (software), “muscles” (hardware), and “fuel” (a battery), and these all had to work together for Cassie to be able to run.

The brunt of the work fell to the brain – in this case, a machine learning algorithm developed by students at Oregon State University’s Dynamic Robotics Laboratory. Specifically, they used deep reinforcement learning, a method that mimics the way humans learn from experience: a trial-and-error process guided by feedback and rewards. Over many repetitions, the algorithm uses this process to learn how to accomplish a set task. In this case, since it was trying to learn to run, it may have tried moving the robot’s legs varying distances or at different angles while keeping the robot upright.

Once Cassie got a good gait down, completing the 5K was as much a matter of battery life as running prowess. The robot covered the whole distance (a course circling around the university campus) on a single battery charge in just over 53 minutes, but that did include six and a half minutes of troubleshooting; the computer had to be reset after it overheated, as well as after Cassie fell during a high-speed turn. But hey, an overheated computer getting reset isn’t so different from a human runner pausing to douse their head and face with a cup of water to cool off, or chug some water to rehydrate.

Summary: Researchers have compiled a new, highly detailed 3D brain map that captures the shapes and activity of neurons in the visual neocortex of mice. The map is freely available for neuroscience researchers and artificial intelligence specialists to utilize.

Source: Allen Institute


Researchers from the University of Reading, in the UK, are using drones to give clouds an electrical charge, which could help increase rainfall in water-stressed regions.