
After the program was first revealed in 2019, the Air Force's then-Assistant Secretary for Acquisition, Technology and Logistics, Will Roper, said he wanted to see operational demonstrations within two years. The latest test flight of the Skyborg-equipped Avenger shows the service has clearly hit that benchmark.

The General Atomics Avenger was also used in experiments with another autonomy system in 2020, developed under the Defense Advanced Research Projects Agency's (DARPA) Collaborative Operations in Denied Environment (CODE) program, which sought to build drones capable of "collaborative autonomy," or the ability to work cooperatively.

A team of researchers working at Johannes Kepler University has developed an autonomous drone with a new type of technology to improve search-and-rescue efforts. In their paper published in the journal Science Robotics, the group describes their drone modifications. Andreas Birk with Jacobs University Bremen has published a Focus piece in the same journal issue outlining the work by the team in Austria.

Finding people lost (or hiding) in the forest is difficult because of the tree cover. People in planes and helicopters have difficulty seeing through the canopy to the ground below, where people might be walking or even lying down. The same problem exists for thermal applications—heat sensors cannot pick up readings adequately through the canopy. Efforts have been made to add drones to search-and-rescue operations, but they suffer from the same problems because they are remotely controlled by pilots using them to search the ground below. In this new effort, the researchers have added new technology that both helps to see through the tree canopy and highlights people that might be under it.

The new technology is based on what the researchers describe as an airborne optical sectioning (AOS) algorithm—it uses the power of a computer to defocus occluding objects such as the tops of trees. The second part of the new device uses thermal imaging to highlight the heat emitted from a warm body. A machine-learning application then determines whether the heat signals come from humans, animals or other sources. The new hardware was then affixed to a standard autonomous drone. The computer in the drone uses both locational positioning to determine where to search and cues from the AOS and thermal sensors. If a possible match is made, the drone automatically moves closer to the target to get a better look. If its sensors indicate a match, it signals the research team with the coordinates. In testing the newly outfitted drone over 17 field experiments, the researchers found it was able to locate 38 of 42 people hidden below tree canopies.
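The core idea behind airborne optical sectioning is synthetic-aperture integration: images captured from many drone positions are registered to the ground plane and averaged, so occluders above that plane (leaves, branches) land at different pixel positions in each frame and blur out, while ground-level heat sources reinforce. The sketch below illustrates that integration step in NumPy; it is a simplified illustration under assumed whole-pixel ground-plane offsets, not the team's actual algorithm, and the function name `integrate_aos` is hypothetical.

```python
import numpy as np

def integrate_aos(frames, offsets):
    """Average co-registered frames to 'focus' on the ground plane.

    frames:  list of 2-D arrays (e.g., thermal images) of equal shape.
    offsets: per-frame (dy, dx) whole-pixel shifts that align each
             frame's ground plane with the reference frame.

    Targets on the ground plane align across frames and stay sharp;
    occluders above it are misaligned frame-to-frame and average away.
    """
    acc = np.zeros_like(frames[0], dtype=float)
    for frame, (dy, dx) in zip(frames, offsets):
        # Shift the frame so its ground plane lines up with the reference.
        acc += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return acc / len(frames)
```

In a real system the per-frame shifts would come from the drone's GPS/IMU pose and the camera model rather than being known exactly, and the registration would be sub-pixel, but the averaging principle is the same.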

Chipmaker patches nine high-severity bugs in its Jetson SoC framework tied to the way it handles low-level cryptographic algorithms.

Flaws impacting millions of internet of things (IoT) devices running NVIDIA’s Jetson chips open the door for a variety of hacks, including denial-of-service (DoS) attacks or the siphoning of data.

NVIDIA released patches addressing nine high-severity vulnerabilities, along with eight additional bugs of lower severity. The patches cover a wide swath of NVIDIA's chipsets typically used for embedded computing systems, machine-learning applications and autonomous devices such as robots and drones.
Impacted products include the Jetson chipset series: AGX Xavier, Xavier NX/TX1, Jetson TX2 (including Jetson TX2 NX) and Jetson Nano devices (including Jetson Nano 2GB) found in the NVIDIA JetPack software development kit. The patches were delivered as part of NVIDIA's June security bulletin, released Friday.