Oak Ridge National Laboratory researchers have developed artificial intelligence software for powder bed 3D printers that assesses the quality of parts in real time, without the need for expensive characterization equipment.

The software, named Peregrine, supports the “digital thread” being developed at ORNL that collects and analyzes data through every step of the manufacturing process, from design to feedstock selection to the print build.

“Capturing that information creates a digital ‘clone’ for each part, providing a trove of data from the raw material to the operational component,” said Vincent Paquit, who leads advanced manufacturing data analytics research as part of ORNL’s Imaging, Signals and Machine Learning group. “We then use that data to qualify the part and to inform future builds across multiple part geometries and with multiple materials, achieving new levels of automation and manufacturing quality assurance.”
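The “digital clone” idea can be pictured as a per-part record that accumulates data at every step of the process. The structure below is a minimal illustrative sketch, not ORNL’s actual data model; all field and method names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PartDigitalClone:
    """Hypothetical per-part 'digital thread' record (illustrative only)."""
    part_id: str
    design_file: str        # CAD/geometry reference
    feedstock_lot: str      # raw-material batch identifier
    build_events: list = field(default_factory=list)  # per-layer readings

    def log_layer(self, layer: int, reading: dict) -> None:
        # Append one layer's worth of in-situ sensor data.
        self.build_events.append({"layer": layer, **reading})

clone = PartDigitalClone("P-001", "bracket_v3.step", "LOT-42")
clone.log_layer(1, {"melt_pool_temp_C": 1650.0, "anomaly_score": 0.02})
print(len(clone.build_events))  # 1
```

A record like this is what would let analytics software relate a defect seen in the finished part back to the exact layer, material lot, and process conditions that produced it.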

A new space tug, only the second vehicle ever built to extend the life of aging satellites, launched into orbit Saturday on a European rocket after weeks of delays due to weather and rocket checks.

An Arianespace Ariane 5 rocket lofted the Mission Extension Vehicle-2 (MEV-2) into space Saturday (July 31), putting the vehicle en route to an Intelsat satellite waiting for a boost into a higher orbit. Riding along on the rocket were two satellites for broadband communications.

A recent column in The Atlantic magazine, “The Most Overhyped Planet in the Galaxy,” implored the research community to expand its robotic exploration horizons beyond Mars. But despite arguments for rethinking what the article’s author, Marina Koren, terms the U.S.’ scientific “Mars monopoly” in favor of some of our solar system’s less-visited outer moons, a complete scientific understanding of the Red Planet is arguably still decades away.


Rest assured, Mars should and will remain a scientific Valhalla for decades to come.

In the future, a soldier and a game controller may be all that’s needed to teach robots how to outdrive humans.

At the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory and the University of Texas at Austin, researchers designed an algorithm that allows an autonomous ground vehicle to improve its existing navigation systems by watching a human drive. The team tested its approach, called adaptive planner parameter learning from demonstration, or APPLD, on one of the Army’s experimental autonomous ground vehicles.

“Using approaches like APPLD, current soldiers in existing training facilities will be able to contribute to improvements simply by operating their vehicles as normal,” said Army researcher Dr. Garrett Warnell. “Techniques like these will be an important contribution to the Army’s plans to design and field next-generation combat vehicles that are equipped to navigate autonomously in off-road deployment environments.”

The researchers fused machine learning from demonstration algorithms with more classical autonomous navigation systems. Rather than replacing a classical system altogether, APPLD learns how to tune the existing system to behave more like the human demonstration. This paradigm allows the deployed system to retain all the benefits of classical navigation systems, such as optimality, explainability and safety, while also remaining flexible and adaptable to new environments, Warnell said.
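The core APPLD idea, keeping a classical planner but searching for the parameter values that make its output match a human demonstration, can be sketched in a toy 1-D example. This is not the authors’ code; the planner, the demonstration data, and the exhaustive parameter search are all simplified illustrations.

```python
# Toy sketch of parameter learning from demonstration: keep a classical
# planner, but tune its parameter so its trajectory matches a human demo.

def classical_planner(speed_limit, n_steps=10):
    # Trivial "planner": position advances by speed_limit each step.
    pos, traj = 0.0, []
    for _ in range(n_steps):
        pos += speed_limit
        traj.append(pos)
    return traj

def deviation(traj_a, traj_b):
    # Total pointwise distance between two trajectories.
    return sum(abs(a - b) for a, b in zip(traj_a, traj_b))

# Hypothetical human demonstration: the driver averages ~0.8 m per step.
demo = [0.8 * (i + 1) for i in range(10)]

# Search candidate speed limits 0.1 .. 2.0 for the best match to the demo.
best = min((deviation(classical_planner(s / 10), demo), s / 10)
           for s in range(1, 21))
print(best[1])  # 0.8
```

The real system tunes many planner parameters at once in high-dimensional off-road settings, but the principle is the same: the classical planner and its safety guarantees stay in place, and only its tuning knobs are learned from the demonstration.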

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an Artificial Intelligence (AI) system that recognizes hand gestures by combining skin-like electronics with computer vision.

The recognition of human gestures by AI systems has been a valuable development over the last decade and has been adopted in high-precision surgical robots and health monitoring equipment.

AI recognition systems that were initially visual-only have been improved upon by integrating inputs from wearable sensors, an approach known as ‘data fusion’. The wearable sensors recreate the skin’s sensing ability, which is known as ‘somatosensation’.
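At its simplest, data fusion means combining feature vectors from the two modalities, camera and skin-like sensor, into one input for a classifier. The sketch below is purely illustrative; the feature values and the normalize-then-concatenate scheme are assumptions, not NTU’s actual pipeline.

```python
# Minimal 'data fusion' sketch: merge normalized features from a camera
# and a skin-like stretch sensor into a single feature vector.

def normalize(values):
    # Rescale each feature vector to [0, 1] so neither modality dominates.
    lo, hi = min(values), max(values)
    return [(x - lo) / (hi - lo) if hi > lo else 0.0 for x in values]

vision_feat = [0.2, 0.9, 0.4]   # e.g., hand-keypoint descriptors (hypothetical)
sensor_feat = [12.0, 3.5]       # e.g., strain-gauge readings (hypothetical)

fused = normalize(vision_feat) + normalize(sensor_feat)
print(len(fused))  # 5
```

A downstream gesture classifier then sees both modalities at once, so it can fall back on the stretch-sensor signal when lighting or occlusion degrades the camera view.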