Nvidia announced today that Isaac, its developer toolbox for AI-powered robotics, will deepen its support for the Robot Operating System (ROS). The announcement came this morning at ROS World 2021, a conference for developers, engineers, and hobbyists who work with ROS, a popular open-source framework that helps developers build and reuse code for robotics applications.
Nvidia, which is trying to cement its lead as a supplier of processors for AI applications, announced a host of “performance perception” technologies that will be part of what it now calls Isaac ROS. These add computer vision and AI/ML functionality to ROS-based applications, supporting things like autonomous robots.
A new control system, demonstrated using MIT’s robotic mini cheetah, enables four-legged robots to jump across uneven terrain in real time. A loping cheetah dashes across a rolling field, bounding over sudden gaps in the rugged terrain. The movement may look effortless, but getting a robot to move this way is an altogether different prospect.
In recent years, four-legged robots inspired by the movement of cheetahs and other animals have made great leaps forward, yet they still lag behind their mammalian counterparts when it comes to traveling across a landscape with rapid elevation changes.
“In those settings, you need to use vision in order to avoid failure. For example, stepping in a gap is difficult to avoid if you can’t see it. Although there are some existing methods for incorporating vision into legged locomotion, most of them aren’t really suitable for use with emerging agile robotic systems,” says Gabriel Margolis, a PhD student in the lab of Pulkit Agrawal, professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT.
We have achieved strong, fast, power-dense, high-efficiency, biomimetic, soft, safe, clean, organic, and affordable robotic technology. The dumbbell weighs 7 kg (15.6 lbs); the forearm with hand weighs only 1 kg (2.2 lbs).
This artificial-muscle robotic arm is operated by water and consumes 200 W at peak. We invented and produce our own portable power supply and electro-hydraulic mini valves, giving us complete control over contraction speed and letting us compress the whole powering system (for a full body) inside a humanlike robot torso.
At this moment our robotic arm is driven by only half the artificial muscles a human arm would have. The strongest finger-bending muscle is still missing. The fingers will eventually move from side to side, but they don’t have those muscles yet. Metacarpal and side-to-side wrist movement are also blocked. This version has a position sensor in each joint, but they have yet to be implemented in software. We are going to add everything mentioned above in the next prototype.
The movement sequence was written and sent to the hand as simple commands. We wish to develop a platform for reinforcement learning, prosthetic arms, and ultimately full humanoid robots to serve people for fun, as butlers, cleaners, chauffeurs, and construction workers (including in space), and even to achieve human immortality by transplanting the brain into the machine.
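Driving a hand with a scripted sequence of simple commands typically means stepping through timed actuator targets. Here is a minimal sketch of that pattern; the command protocol, actuator names, and targets are all hypothetical, invented for illustration, not Automaton Robotics’ actual interface:

```python
import time

# Hypothetical protocol: each step sets a named actuator to a target
# opening (0.0 = relaxed, 1.0 = fully contracted) and holds briefly.
SEQUENCE = [
    ("thumb_flexor", 1.0, 0.5),   # (actuator, target, hold seconds)
    ("index_flexor", 0.8, 0.5),
    ("index_flexor", 0.0, 0.3),
    ("thumb_flexor", 0.0, 0.3),
]

def encode(actuator: str, target: float) -> str:
    """Encode one step as a plain-text command line."""
    return f"SET {actuator} {target:.2f}"

def run_sequence(sequence, send, sleep=time.sleep):
    """Send each encoded command, then hold for the step's duration."""
    sent = []
    for actuator, target, hold in sequence:
        cmd = encode(actuator, target)
        send(cmd)                 # e.g. write to a serial port
        sent.append(cmd)
        sleep(hold)               # hold the pose before the next step
    return sent
```

In a real setup `send` would write to the arm’s control link (serial, CAN, etc.); passing `print` and a no-op `sleep` lets the sequence be previewed offline.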
Turn to agile mobile robots like Spot to collect better data and optimize asset performance in industrial environments across manufacturing, power and utilities, oil and gas, mining, construction, and more.
It seems inevitable that sooner or later, the performance of autonomous drones will surpass the performance of even the best human pilots. Usually things in robotics that seem inevitable happen later as opposed to sooner, but drone technology seems to be the exception to this. We’ve seen an astonishing amount of progress over the past few years, even to the extent of sophisticated autonomy making it into the hands of consumers at an affordable price.
The cutting edge of drone research right now is putting drones with relatively simple onboard sensing and computing in situations that require fast and highly aggressive maneuvers. In a paper published yesterday in Science Robotics, roboticists from Davide Scaramuzza’s Robotics and Perception Group at the University of Zurich, along with partners at Intel, demonstrate a small, self-contained, fully autonomous drone that can aggressively fly through complex environments at speeds of up to 40 km/h.
When’s the last time you chirped, “Hey Google” (or Siri, for that matter), and asked your phone for a good sushi recommendation in the area, or perhaps asked what time sunset would be? Most folks regularly perform these tasks on their phones, but you may not have realized that multiple AI (artificial intelligence) engines were involved in quickly delivering the results for your request.
In these examples, AI neural network models handled the natural language recognition and then inferred what you were looking for, delivering relevant search results from internet databases around the globe while targeting the most appropriate results based on your location and a number of other factors. These are just a couple of examples but, in short, AI or machine learning processing is a big part of smartphone experiences these days, from recommendation engines to translation, computational photography, and more.
As such, benchmarking tools are becoming more prevalent in an effort to measure mobile platform performance. MLPerf is one such tool that nicely covers the gamut of AI workloads, and today Qualcomm is highlighting some fairly impressive results in a recent major update to the MLCommons database. MLCommons is an open consortium of chip manufacturers and OEMs, with founding members including Intel, NVIDIA, Arm, AMD, Google, Qualcomm, and many others. The consortium’s MLPerf benchmark measures AI workloads like image classification, natural language processing, and object detection. And today Qualcomm has tabulated benchmark results from its Snapdragon 888+ Mobile Platform (a slightly goosed-up version of its Snapdragon 888) against a myriad of competitive mobile chipsets from Samsung and MediaTek, and even Intel’s 11th Gen Core series laptop chips.
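MLPerf-style results ultimately boil down to the latency and throughput of fixed workloads. Below is a minimal sketch of that measurement pattern, warm up, then time repeated runs; the workload here is a small stand-in matrix operation, not an actual MLPerf test or its harness:

```python
import time

def benchmark(workload, iterations=100, warmup=10):
    """Time a workload benchmark-style: warm up, then report
    mean latency and throughput over timed iterations."""
    for _ in range(warmup):          # warmup runs are excluded from timing
        workload()
    start = time.perf_counter()
    for _ in range(iterations):
        workload()
    elapsed = time.perf_counter() - start
    return {
        "mean_latency_ms": 1000 * elapsed / iterations,
        "throughput_per_s": iterations / elapsed,
    }

# Stand-in "inference" workload: a pure-Python matrix-vector product.
def fake_inference(n=64):
    m = [[(i + j) % 7 for j in range(n)] for i in range(n)]
    v = list(range(n))
    return [sum(a * b for a, b in zip(row, v)) for row in m]
```

Real MLPerf runs add strict rules on accuracy targets, batch sizes, and reporting, which is what makes cross-vendor numbers like Qualcomm’s comparable in the first place.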
The effects of AI on society are not just limited to the workplace. Recently, there has been a lot of talk about how AI will affect our social interactions and how we create and experience social spaces.
AI-driven architecture for social spaces presents us with new opportunities as well as challenges. In his work, Babar Kasam Cazir explores the implications of how artificial intelligence could change socioeconomic dynamics — specifically in event spaces — through its ability to analyze patterns at scale.
In a world first, US scientists on Thursday piloted a camera-equipped ocean drone that looks like a robotic surfboard into a Category 4 hurricane barreling across the Atlantic Ocean.
Dramatic footage released by the National Oceanic and Atmospheric Administration showed the small craft battling 50-foot (15-meter) waves and winds of over 120 mph (190 km/h) inside Hurricane Sam.
The autonomous vehicle, called a “Saildrone,” was developed by a company of the same name.