The Netherlands’ MAVLab (TU Delft) autonomous drone finished the race faster than the human-controlled drone.
An autonomous drone carrying water to help extinguish a wildfire in the Sierra Nevada might encounter swirling Santa Ana winds that threaten to push it off course. Rapidly adapting to these unknown disturbances in flight presents an enormous challenge for the drone’s flight control system.
To help such a drone stay on target, MIT researchers developed a new, machine learning-based adaptive control algorithm that could minimize its deviation from its intended trajectory in the face of unpredictable forces like gusty winds.
The study is published on the arXiv preprint server.
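The core idea of adaptive disturbance rejection can be sketched in a few lines. The following is a hypothetical 1-D toy, not the MIT algorithm: a PD controller tracks a reference while an online estimate `d_hat` of an unknown, constant wind force is learned from the unexplained acceleration and fed forward into the control input. All names, gains, and the constant-wind assumption are illustrative; real gusts vary in time and the paper’s method is learning-based.

```python
import numpy as np

# Illustrative 1-D adaptive disturbance rejection (hypothetical sketch,
# not the MIT algorithm). A PD controller tracks x_ref = 1.0 while an
# online estimate d_hat of the unknown wind force is low-pass filtered
# from the unexplained acceleration (a - u) and cancelled in the input.

dt, kp, kd, lr = 0.01, 25.0, 10.0, 5.0
x, v, d_hat = 0.0, 0.0, 0.0   # position, velocity, disturbance estimate
wind = 2.0                    # unknown constant disturbance (unit mass)

for _ in range(2000):         # 20 seconds of simulated flight
    # PD tracking plus feedforward cancellation of the estimated wind
    u = kp * (1.0 - x) + kd * (0.0 - v) - d_hat
    a = u + wind              # true dynamics: unit mass, additive wind
    # Adaptation: drive d_hat toward the observed residual acceleration
    d_hat += lr * ((a - u) - d_hat) * dt
    v += a * dt
    x += v * dt

print(round(x, 3), round(d_hat, 3))   # position settles at 1.0, d_hat at wind
```

Once `d_hat` converges to the true wind force, the feedforward term cancels the disturbance and the drone settles on its reference with no steady-state offset, which is the behavior a fixed-gain controller alone cannot guarantee.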
A new machine learning model shows that star-shaped brain cells may be responsible for the brain’s memory capacity, and someday, it could inspire advances in AI and Alzheimer’s research.
At Phobio, well-implemented AI hasn’t just made us faster—it’s made us sharper, more creative and more strategic. When routine tasks are streamlined, people have time to think deeply about customers, competition and innovation.
Closing Thoughts
AI isn’t coming to take your job. But someone who knows how to use it might.
The current conventional wisdom on deep neural networks (DNNs) is that, in most cases, simply scaling up a model’s parameters and adopting computationally intensive architectures will result in large performance improvements. Although this scaling strategy has proven successful in research labs, real-world industrial deployments introduce a number of complications, as developers often need to repeatedly train a DNN, transmit it to different devices, and ensure it can perform under various hardware constraints with minimal accuracy loss.
The research community has thus become increasingly interested in reducing such models’ storage size on devices while also improving their run-time. Explorations in this area have tended to follow one of two avenues: reducing model size via compression techniques, or using model pruning to reduce computation burdens.
In the new paper LilNetX: Lightweight Networks with EXtreme Model Compression and Structured Sparsification, a team from the University of Maryland and Google Research proposes a way to “bridge the gap” between the two approaches with LilNetX, an end-to-end trainable technique for neural networks that jointly optimizes model parameters for accuracy, model size on disk, and computation on any given task.
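The flavor of such a joint objective can be sketched as a task loss plus two penalties: a rate term (a proxy for size on disk, here the empirical entropy of integer-quantized weights) and a structured-sparsity term (a proxy for computation, here a group norm that encourages whole output channels to zero out). This is a hypothetical simplification for illustration, not the paper’s exact formulation; the quantization step, placeholder task loss, and weighting coefficients are assumptions.

```python
import numpy as np

# Hypothetical sketch of a jointly optimized objective in the spirit of
# LilNetX (not the paper's exact formulation):
#   total = task_loss + lambda_rate * rate + lambda_sparse * sparsity

rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.5, size=(8, 16))    # toy weight matrix, 8 output channels

# Rate proxy: empirical entropy (in bits) of integer-quantized weights,
# an estimate of how many bits an entropy coder would need to store W.
q = np.round(W / 0.1).astype(int)
_, counts = np.unique(q, return_counts=True)
p = counts / counts.sum()
rate = -(p * np.log2(p)).sum() * q.size

# Structured sparsity: group L2 norm per output channel (row of W);
# penalizing it pushes entire channels toward zero, which prunes compute.
group_norms = np.linalg.norm(W, axis=1)
sparsity = group_norms.sum()

task_loss = 0.42                          # placeholder cross-entropy value
total = task_loss + 1e-3 * rate + 1e-2 * sparsity
print(round(total, 3))
```

Because all three terms are differentiable (or have standard surrogates), they can be minimized together end to end rather than compressing and pruning as separate post-training steps.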
A team of astronomers led by Michael Janssen (Radboud University, The Netherlands) has trained a neural network with millions of synthetic black hole data sets. Based on the network and data from the Event Horizon Telescope, they now predict, among other things, that the black hole at the center of our Milky Way is spinning at near top speed.
The astronomers have published their results and methodology in three papers in the journal Astronomy & Astrophysics.
In 2019, the Event Horizon Telescope Collaboration released the first image of a supermassive black hole at the center of the galaxy M87. In 2022, they presented an image of the black hole in our Milky Way, Sagittarius A*. However, the data behind the images still contained a wealth of hard-to-crack information. An international team of researchers trained a neural network to extract as much information as possible from the data.
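The underlying recipe, training on synthetic data with known parameters and then inverting real observations, can be sketched in miniature. Everything below is a toy illustration, not the EHT pipeline: the “forward model” is a made-up stand-in for a black hole simulation, and a linear least-squares fit stands in for the neural network.

```python
import numpy as np

# Toy sketch of simulation-based inference (illustrative only): generate
# synthetic observations from a forward model with known "spin" labels,
# then fit a regressor that maps observations back to spin.

rng = np.random.default_rng(42)
spin = rng.uniform(0.0, 1.0, size=5000)   # known labels for synthetic data

def forward_model(a, rng):
    # Made-up stand-in for a ray-traced black hole simulation: a few
    # image summary statistics depending nonlinearly on spin a, plus noise.
    base = np.stack([a, a**2, np.sin(3.0 * a)], axis=-1)
    return base + rng.normal(0.0, 0.05, size=base.shape)

X = forward_model(spin, rng)

# Simplest possible learned inverse: linear least squares with a bias
# column. A real pipeline would train a deep network on millions of sets.
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, spin, rcond=None)
pred = A @ w
err = float(np.abs(pred - spin).mean())
print(round(err, 3))                      # mean absolute error on the labels
```

With the inverse model fitted on simulations, the same mapping can then be applied to real telescope data, which is how parameters like near-maximal spin are inferred without ever observing a labeled black hole.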
NVIDIA/physicsnemo: Open-source deep-learning framework for building, training, and fine-tuning deep learning models using state-of-the-art Physics-ML methods
A team of researchers at the University of California, Los Angeles (UCLA) has introduced a novel framework for designing and creating universal diffractive waveguides that can control the flow of light in highly specific and complex ways.
This new technology uses artificial intelligence (AI), specifically deep learning, to design a series of structured surfaces that guide light with high efficiency and can perform a wide range of functions that are challenging for conventional waveguides.
The work is published in the journal Nature Communications.