Originally published on Towards AI, the World’s Leading AI and Technology News and Media Company.
For proper operation, drones usually use accelerometers to determine the direction of gravity. In a new study published in Nature on October 19, 2022, a team of scientists from Delft University of Technology, the CNRS and Aix-Marseille University has shown that drones can estimate the direction of gravity by combining visual detection of movement with a model of how they move. These results may explain how flying insects determine the direction of gravity and are a major step toward the creation of tiny autonomous drones.
How flying insects manage this feat has been shrouded in mystery until now, as they have no dedicated sense of acceleration. In this study, a European team of scientists led by the Delft University of Technology in the Netherlands, and involving a CNRS researcher, demonstrated that drones can estimate gravity by combining visual motion detection with a model of their own motion.
To develop this new principle, the scientists investigated optical flow, that is, how an individual perceives movement relative to its environment. It is the visual motion that sweeps across our retina when we move: on a train, for example, trees beside the tracks appear to pass by faster than distant mountains. Optical flow alone, however, is not enough for an insect to determine the direction of gravity.
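The train example reflects a simple geometric fact: for an observer moving at speed v, a point at distance d perpendicular to the direction of travel sweeps across the retina at an angular rate of roughly v/d. The toy sketch below (ours, not from the paper) illustrates why nearby trees produce much stronger optical flow than distant mountains.

```python
# Toy illustration of translational optical flow: a point at distance d,
# perpendicular to an observer moving at speed v, sweeps across the
# retina at an angular rate of v / d (radians per second).

def optic_flow_rate(speed_mps: float, distance_m: float) -> float:
    """Angular rate (rad/s) of a point perpendicular to the motion."""
    return speed_mps / distance_m

train_speed = 30.0  # m/s, about 108 km/h
tree = optic_flow_rate(train_speed, 10.0)        # tree 10 m from the track
mountain = optic_flow_rate(train_speed, 5000.0)  # mountain 5 km away

print(tree)      # 3.0 rad/s: the tree streaks past
print(mountain)  # 0.006 rad/s: the mountain barely moves
```

The same flow field is ambiguous on its own, which is why the study combines it with a model of the drone's own motion.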
A robot fish that filters microplastics has been brought to life after it won the University of Surrey’s public competition—The Natural Robotics Contest.
The robot fish, designed by a student named Eleanor Mackintosh, was selected by an international panel of judges because it could be part of a solution to reducing plastic pollution in our waterways.
The competition, which ran in the summer of 2022, was open to anyone who had an idea for a bio-inspired robot, with the promise that the winner would be turned into a working prototype.
MELBOURNE, Australia — The Japanese Coast Guard has started operations with a newly delivered MQ-9B SeaGuardian drone, while more airborne early warning aircraft have arrived in the country by ship.
The UAV’s manufacturer, General Atomics Aeronautical Systems, said in a news release that the Coast Guard commenced flight operations with a SeaGuardian from the Japan Maritime Self-Defense Force Air Station Hachinohe on Oct. 19.
The American company said the medium-altitude, long-endurance unmanned aircraft “will primarily perform Maritime Wide Area Search (MWAS) over the Sea of Japan and the Pacific Ocean. Other missions will include search and rescue, disaster response, and maritime law enforcement.”
Prof. Aleks Farseev is an entrepreneur, research professor, keynote speaker, and the CEO of SoMin.ai, a long-tail ad optimization platform.
Not too long ago, I was asked to present a tool to some of my clients. It was a simple prototype, where a person would type in a few things (e.g., advertising channel, product, and occasion), and in turn, the machine would give a number of sample ads. When I clicked the button, in just a few seconds, the machine spat out several ads complete with images and text. The first comment was, “Wow, that was really fast.” What would take a person a few hours to do, this machine did in a fraction of the time. There were a lot of other interesting comments, some even pointing out that this machine was really creative. Then one person spoke up with a comment that put the room into an uncomfortable silence: “This thing is going to take my job.”
We are in a time of uncertainty. As AI applications become more visible and popular, many will start wondering how they will impact our society. There are the “doomsayers” who think AI will take over the world. Then there are the more “sane” people who think that AI will never be able to replicate humans. After all, how can a machine copy something so intricate and complex? But then again, day by day, the advancements in AI continue to surprise us, as if to challenge our very humanity.
Advancing Space For Humanity — Dr. Ezinne Uzo-Okoro, Assistant Director for Space Policy, Office of Science and Technology Policy, The White House.
Dr. Ezinne Uzo-Okoro is Assistant Director for Space Policy at the White House Office of Science and Technology Policy (https://www.whitehouse.gov/ostp/), where she focuses on determining civil and commercial space priorities for the President’s science advisor. Her portfolio spans a wide range of disciplines, including Orbital Debris; On-orbit Servicing, Assembly, and Manufacturing (OSAM); Earth Observations; Space Weather; and Planetary Protection.
Previously, Dr. Uzo-Okoro built and managed over 60 spacecraft missions and programs in 17 years at NASA, in roles as an engineer, technical expert, manager and executive, in earth observations, planetary science, heliophysics, astrophysics, human exploration, and space communications, which represented $9.2B in total program value. Her last role was as a NASA Heliophysics program executive.
While automated manufacturing is ubiquitous today, it was once a nascent field birthed by inventors such as Oliver Evans, who is credited with creating the first fully automated industrial process, in a flour mill he built and gradually automated in the late 1700s. The processes for creating automated structures or machines are still very top-down, requiring humans, factories, or robots to do the assembling and making.
However, the way nature does assembly is ubiquitously bottom-up; animals and plants are self-assembled at a cellular level, relying on proteins that self-fold into target geometries encoding all the different functions that keep us ticking. For a more bio-inspired, bottom-up approach to assembly, then, human-architected materials need to do more on their own. Making them scalable, selective, and reprogrammable in a way that mimics nature’s versatility, though, has posed stubborn challenges.
Now, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have attempted to get over these growing pains with a new method: magnetically reprogrammable materials that can be coated onto parts, such as robotic cubes, to let them self-assemble. Key to their process is a way to make these magnetic programs highly selective about what they connect with, enabling robust self-assembly into specific shapes and chosen configurations.
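To see why selectivity matters, consider a hypothetical encoding (ours, not CSAIL's actual scheme) in which each cube face carries a pattern of north/south pole polarities. Two faces attract only when every pole on one meets the opposite pole on the other, so parts with mismatched programs are rejected during assembly:

```python
# Hypothetical sketch of selective magnetic binding: each face carries a
# "magnetic program", a sequence of N/S pole polarities. Two faces mate
# only when the patterns are exact complements (every N meets an S),
# so mismatched parts repel somewhere and fail to lock together.

def complements(face_a: str, face_b: str) -> bool:
    """True if the two pole patterns are exact complements (N <-> S)."""
    flip = {"N": "S", "S": "N"}
    return len(face_a) == len(face_b) and all(
        flip[a] == b for a, b in zip(face_a, face_b)
    )

cube1_face = "NNSN"
cube2_face = "SSNS"  # exact complement: these parts mate
cube3_face = "SNNS"  # one pole mismatched: rejected

print(complements(cube1_face, cube2_face))  # True
print(complements(cube1_face, cube3_face))  # False
```

With more poles per face, the number of distinct mutually exclusive programs grows, which is what allows many parts to find only their intended partners.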
Ask a smart home device for the weather forecast, and it takes several seconds for the device to respond. One reason this latency occurs is because connected devices don’t have enough memory or power to store and run the enormous machine-learning models needed for the device to understand what a user is asking of it. The model is stored in a data center that may be hundreds of miles away, where the answer is computed and sent to the device.
MIT researchers have created a new method for computing directly on these devices, which drastically reduces this latency. Their technique shifts the memory-intensive steps of running a machine-learning model to a central server where components of the model are encoded onto light waves.
The waves are transmitted to a connected device using fiber optics, which enables tons of data to be sent lightning-fast through a network. The receiver then employs a simple optical device that rapidly performs computations using the parts of a model carried by those light waves.
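The division of labor can be sketched in a toy form (our framing, not MIT's implementation): the server holds and streams the model's weights, while the device performs only lightweight multiply-accumulate operations on each chunk of weights as it arrives, never storing the full model.

```python
# Conceptual sketch of weight streaming: the memory-heavy side (storing
# the weight matrix) lives on a server; the device consumes one row at a
# time and does only cheap multiply-accumulates, using O(1) weight memory.

def stream_weights(weight_rows):
    """Server side: yield one row of the weight matrix at a time."""
    for row in weight_rows:
        yield row

def device_forward(x, weight_stream):
    """Device side: one dot product per streamed row, nothing stored."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weight_stream]

weights = [[1.0, 2.0], [3.0, 4.0]]  # toy 2x2 layer, held on the server
x = [1.0, 1.0]                      # input measured on the device

print(device_forward(x, stream_weights(weights)))  # [3.0, 7.0]
```

In the researchers' optical setting, the "stream" is carried on light waves over fiber, and the per-row computation is performed by a simple optical device rather than a CPU.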
Researchers from the Hefei Institutes of Physical Science (HFIPS) of the Chinese Academy of Sciences (CAS) have proposed a new artificial intelligence framework for target detection that provides a new solution for fast and high-precision real-time online target detection.
Relevant results were published in Expert Systems with Applications.
In recent years, deep learning theory has driven the rapid development of artificial intelligence technology, and object detection based on deep learning has succeeded in many industrial applications. Current research, however, tends to improve either the speed or the accuracy of target detection, rarely both at once. Achieving detection that is both fast and accurate has therefore become an important challenge in the field of artificial intelligence.
The artificial intelligence speech translation system can translate Hokkien, a primarily spoken language.
Meta has created a new speech translator for Hokkien, a predominantly oral language spoken widely within the Chinese diaspora and one of the national languages of Taiwan.