A recent column in The Atlantic magazine, “The Most Overhyped Planet in the Galaxy,” implored the research community to expand its robotic exploration horizons beyond Mars. But despite arguments for rethinking what the article’s author, Marina Koren, terms the U.S.’ scientific “Mars monopoly” in favor of some of our solar system’s less-visited outer moons, a full scientific understanding of the Red Planet is arguably still decades away.


Rest assured, Mars should and will remain a scientific Valhalla for decades to come.

In the future, a soldier and a game controller may be all that’s needed to teach robots how to outdrive humans.

At the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory and the University of Texas at Austin, researchers designed an algorithm that allows an autonomous ground vehicle to improve its existing navigation systems by watching a human drive. The team tested its approach, called adaptive planner parameter learning from demonstration, or APPLD, on one of the Army’s experimental autonomous ground vehicles.

“Using approaches like APPLD, current soldiers in existing training facilities will be able to contribute to improvements in autonomous systems simply by operating their vehicles as normal,” said Army researcher Dr. Garrett Warnell. “Techniques like these will be an important contribution to the Army’s plans to design and field next-generation combat vehicles that are equipped to navigate autonomously in off-road deployment environments.”

The researchers fused machine learning from demonstration algorithms with more classical autonomous navigation systems. Rather than replacing a classical system altogether, APPLD learns how to tune the existing system to behave more like the human demonstration. This paradigm allows the deployed system to retain all the benefits of classical navigation systems, such as optimality, explainability and safety, while also remaining flexible and adaptable to new environments, Warnell said.
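The team’s code is not reproduced here, but the core idea, leaving the classical planner untouched and searching for the parameter values that make it best imitate a human demonstration, can be sketched in a few lines. Everything below (the toy planner, its two parameters, the loss, and the random-search tuner) is an illustrative stand-in of this article’s, not the team’s implementation.

```python
import numpy as np

def toy_planner(params, start, goal, n_steps=50):
    """Stand-in for a classical navigation planner whose behavior is
    governed by tunable parameters (here: step size and goal gain).
    Returns the trajectory it would drive as an (n_steps + 1, 2) array."""
    step_size, goal_gain = params
    pos = np.asarray(start, dtype=float)
    traj = [pos.copy()]
    for _ in range(n_steps):
        # Saturating attraction toward the goal; both parameters shape it.
        pos = pos + step_size * np.tanh(goal_gain * (np.asarray(goal) - pos))
        traj.append(pos.copy())
    return np.array(traj)

def imitation_loss(params, demo, start, goal):
    """Mean pointwise distance between the planner's path and the demo."""
    traj = toy_planner(params, start, goal, n_steps=len(demo) - 1)
    return float(np.mean(np.linalg.norm(traj - demo, axis=1)))

def tune_from_demonstration(demo, start, goal, n_samples=2000, seed=0):
    """Black-box random search over parameter space: keep whichever
    setting makes the unchanged planner best match the human demo."""
    rng = np.random.default_rng(seed)
    best_params, best_loss = None, np.inf
    for _ in range(n_samples):
        candidate = rng.uniform([0.01, 0.1], [0.5, 2.0])  # (step, gain) bounds
        loss = imitation_loss(candidate, demo, start, goal)
        if loss < best_loss:
            best_params, best_loss = candidate, loss
    return best_params, best_loss

# Synthetic "human demo": the same planner driven with hidden parameters.
start, goal = (0.0, 0.0), (5.0, 5.0)
demo = toy_planner((0.2, 1.0), start, goal)
print(tune_from_demonstration(demo, start, goal))  # best parameters and loss
```

Note that the property the article highlights carries over even in this toy version: the deployed planner is never replaced, only its knobs are retuned, so the classical system’s optimality, explainability and safety arguments remain intact.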

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an Artificial Intelligence (AI) system that recognizes hand gestures by combining skin-like electronics with computer vision.

The recognition of human hand gestures by AI systems has been a valuable development over the last decade and has been adopted in high-precision surgical robots, health monitoring equipment and gaming systems.

AI gesture-recognition systems that were initially visual-only have been improved upon by integrating inputs from wearable sensors, an approach known as ‘data fusion’. The wearable sensors recreate one of the skin’s sensing abilities, known as somatosensation.
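The excerpt doesn’t describe NTU’s actual architecture, but the ‘data fusion’ idea, combining a camera view with readings from skin-like stretchable sensors in a single classifier, can be illustrated with a toy model. The network below, including its name, layer sizes and channel counts, is an assumption for demonstration only.

```python
import torch
import torch.nn as nn

class GestureFusionNet(nn.Module):
    """Toy bimodal classifier: a vision branch and a wearable-sensor
    branch are fused by feature concatenation before classification.
    All sizes are illustrative; this is not NTU's model."""

    def __init__(self, n_gestures=10, sensor_channels=5):
        super().__init__()
        self.vision = nn.Sequential(            # 32x32 grayscale hand crop
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),                       # -> 16 * 8 * 8 = 1024 features
        )
        self.somatosensory = nn.Sequential(     # strain-sensor readings
            nn.Linear(sensor_channels, 32), nn.ReLU(),
        )
        self.head = nn.Linear(1024 + 32, n_gestures)

    def forward(self, image, sensors):
        fused = torch.cat([self.vision(image), self.somatosensory(sensors)], dim=1)
        return self.head(fused)                 # gesture logits

# Smoke test on random stand-in data: a batch of 4 image/sensor pairs.
model = GestureFusionNet()
logits = model(torch.randn(4, 1, 32, 32), torch.randn(4, 5))
print(logits.shape)  # torch.Size([4, 10])
```

Fusing at the feature level like this lets either modality compensate when the other is degraded, for example when the hand is partially occluded from the camera, which is a common motivation for data fusion.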

EPFL spin-off SEED Biosciences has developed a pipetting robot that can dispense individual cells one by one. Their innovation allows for enhanced reliability and traceability, and can save life-science researchers time and money.

The engineers at SEED Biosciences, an EPFL spin-off, have come up with a unique pipetting robot that can isolate single cells with the push of a button, without damaging the cells. Their device also records each cell’s electrical signature so that it can be traced. While this innovation may seem trivial, it can save researchers several weeks of precious time and speed up development work in pharmaceuticals, cancer treatments and personalized medicine. The company began marketing its device this year.
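The article doesn’t say how the device decides that exactly one cell was dispensed; purely to illustrate the general idea of using an electrical signature, here is a generic sketch that flags a passing cell as a brief excursion of an impedance trace from its baseline. The window size, threshold and synthetic signal are invented for this example.

```python
import numpy as np

def detect_cell_events(impedance, window=51, k=8.0):
    """Flag brief excursions of an impedance trace away from its rolling
    median baseline; a cell passing the tip briefly perturbs the signal.
    Returns the start index of each above-threshold run (one run = one event)."""
    pad = window // 2
    padded = np.pad(impedance, pad, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(impedance))])
    deviation = np.abs(impedance - baseline)
    above = deviation > k * np.median(deviation)           # robust threshold
    starts = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # rising edges
    if above.size and above[0]:
        starts = np.insert(starts, 0, 0)
    return starts

def is_single_cell_dispense(impedance):
    """Accept the droplet only when exactly one cell-like event is seen."""
    return len(detect_cell_events(impedance)) == 1

# Synthetic trace: noisy baseline with one cell-like blip at sample 300.
rng = np.random.default_rng(1)
trace = 100.0 + 0.05 * rng.standard_normal(1000)
trace[300:305] += 2.0
print(is_single_cell_dispense(trace))  # True for this synthetic trace
```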

Biometrics may be the best way to protect society against the threat of deepfakes, but new solutions are also being proposed by the Content Authenticity Initiative and the AI Foundation.

Deepfakes are the most serious criminal threat posed by artificial intelligence, topping a list of the 20 biggest worries for criminal facilitation over the next 15 years, according to a new report funded by the Dawes Centre for Future Crime at University College London (UCL).

The study, published in the journal Crime Science, ranks the 20 AI-enabled crimes based on the harm they could cause.