
An autonomous drone for search and rescue in forests using optical sectioning algorithm

A team of researchers working at Johannes Kepler University has developed an autonomous drone with a new type of technology to improve search-and-rescue efforts. In their paper published in the journal Science Robotics, the group describes their drone modifications. Andreas Birk with Jacobs University Bremen has published a Focus piece in the same journal issue outlining the work by the team in Austria.

Finding people lost (or hiding) in a forest is difficult because of the tree cover. Observers in planes and helicopters have difficulty seeing through the canopy to the ground below, where people might be walking or even lying down. The same problem exists for thermal imaging—heat sensors cannot pick up readings adequately through the canopy. Efforts have been made to add drones to search-and-rescue operations, but they suffer from the same problems because they are remotely controlled by pilots using them to search the ground below. In this new effort, the researchers added technology that both helps to see through the tree canopy and highlights people who might be under it.

The new technology is based on what the researchers describe as an airborne optical sectioning (AOS) algorithm—it uses the power of a computer to defocus occluding objects such as the tops of trees. The second part of the new device uses thermal imaging to highlight the heat emitted by a warm body. A machine-learning application then determines whether the heat signals come from humans, animals or other sources. The new hardware was then affixed to a standard autonomous drone. The computer in the drone uses both locational positioning to determine where to search and cues from the AOS and thermal sensors. If a possible match is made, the drone automatically moves closer to the target to get a better look. If its sensors indicate a match, it signals the research team with the coordinates. In testing their newly outfitted drone over 17 field experiments, the researchers found it was able to locate 38 of 42 people hidden below tree canopies.
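The integration step behind optical sectioning can be sketched in a few lines: average many images captured along the flight path after registering them to the ground plane, so that occluders lying off that plane (leaves, branches) land on different pixels in each view and blur away. This is a minimal illustration assuming pre-computed integer pixel shifts derived from the drone's pose data; the function and its inputs are hypothetical, not the team's implementation.

```python
import numpy as np

def integrate_synthetic_aperture(images, shifts):
    """Average pre-registered aerial images to suppress occluders.

    images: list of HxW grayscale arrays taken along the flight path.
    shifts: per-image (dy, dx) integer pixel offsets that register the
            ground plane across views (assumed known from pose data).

    Points on the ground plane stay aligned across the shifted views and
    remain sharp; occluders above the plane average out into a blur.
    """
    h, w = images[0].shape
    acc = np.zeros((h, w), dtype=np.float64)
    for img, (dy, dx) in zip(images, shifts):
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(images)
```

In the actual system, a thermal classifier would then run on this integral image rather than on any single occluded frame.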

Dreams Come True: First-Ever Luxury Space Hotel Nears Launch

The Orbital Assembly Corporation, a space construction firm run by NASA veterans, announced in a press statement today, June 24, that it has successfully demonstrated its technology for developing the world’s first space hotel.

The company carried out the demonstration during the official opening of its Fontana, California, facility, which will serve as its main headquarters as it aims to make luxury space holidays a reality before 2030.

Large-scale space constructions built by semi-autonomous robots

Researchers create an artificial tactile skin that mimics human tactile recognition processes

Over the past few decades, roboticists and computer scientists have developed artificial systems that replicate biological functions and human abilities in increasingly realistic ways. This includes artificial intelligence systems, as well as sensors that can capture various types of sensory data.

When trying to understand properties of objects and how to grasp them or handle them, humans often rely on their sense of touch. Artificial sensing systems that replicate human touch can thus be of great value, as they could enable the development of better performing and more responsive robots or prosthetic limbs.

Researchers at Sungkyunkwan University and Hanyang University in South Korea have recently created an artificial tactile sensing system that mimics the way in which humans recognize objects in their surroundings via their sense of touch. This system, presented in a paper published in Nature Electronics, uses sensors to capture data associated with the tactile properties of objects.
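The paper's sensor design is not detailed here, but the recognize-by-touch pipeline—extract features from a tactile signal, then classify—can be sketched generically. The sketch below assumes a 1-D voltage trace (e.g., from a hypothetical piezoelectric skin sliding over a surface) and uses a nearest-centroid classifier; all names and features are illustrative, not from the Nature Electronics paper.

```python
import numpy as np

def tactile_features(signal):
    """Summarize a 1-D tactile trace with simple statistics: signal
    energy, dominant vibration-frequency bin, and a roughness proxy
    (mean absolute first difference)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    dominant = np.argmax(spectrum[1:]) + 1  # skip the DC bin
    return np.array([
        float(np.mean(signal ** 2)),               # energy
        float(dominant),                           # dominant frequency bin
        float(np.mean(np.abs(np.diff(signal)))),   # roughness proxy
    ])

def nearest_centroid(train_feats, train_labels, query):
    """Classify a query feature vector by its closest class centroid."""
    labels = sorted(set(train_labels))
    centroids = {
        c: np.mean([f for f, l in zip(train_feats, train_labels) if l == c], axis=0)
        for c in labels
    }
    return min(labels, key=lambda c: np.linalg.norm(query - centroids[c]))
```

A real system would learn far richer features (the human analogy is the combination of fast- and slow-adapting mechanoreceptors), but the structure—transduce, featurize, classify—is the same.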

New algorithm helps autonomous vehicles find themselves, summer or winter

Without GPS, autonomous systems get lost easily. Now a new algorithm developed at Caltech allows autonomous systems to recognize where they are simply by looking at the terrain around them—and for the first time, the technology works regardless of seasonal changes to that terrain.

Details about the process were published on June 23 in the journal Science Robotics.

The general process, known as visual terrain-relative navigation (VTRN), was first developed in the 1960s. By comparing nearby terrain to high-resolution satellite images, autonomous systems can locate themselves.
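The classical core of VTRN is template matching: slide the vehicle's downward-looking view over a satellite map and pick the offset that correlates best. The sketch below shows that baseline using zero-mean normalized cross-correlation on raw pixel intensities; the Caltech advance is precisely to replace raw pixels with season-invariant learned features, which this toy deliberately omits.

```python
import numpy as np

def locate_patch(sat_map, patch):
    """Brute-force VTRN baseline: return the (row, col) offset in
    sat_map where the vehicle's terrain patch has the highest zero-mean
    normalized cross-correlation. Raw intensities fail across seasons
    (snow, foliage); learned features are what make it robust."""
    ph, pw = patch.shape
    p = patch - patch.mean()
    pn = np.linalg.norm(p)
    best, best_score = (0, 0), -np.inf
    for r in range(sat_map.shape[0] - ph + 1):
        for c in range(sat_map.shape[1] - pw + 1):
            w = sat_map[r:r + ph, c:c + pw]
            wz = w - w.mean()
            denom = np.linalg.norm(wz) * pn
            score = float((wz * p).sum() / denom) if denom > 0 else -1.0
            if score > best_score:
                best, best_score = (r, c), score
    return best
```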

Introducing the NVIDIA Canvas App | NVIDIA Studio

Harness the power of AI to quickly turn simple brushstrokes into realistic landscape images for backgrounds, concept exploration, or creative inspiration. 🖌️

The NVIDIA Canvas app lets you create as quickly as you can imagine.

NVIDIA GPUs accelerate your work with incredible boosts in performance. Less time staring at pinwheels of death means bigger workloads, more features, and creating your work faster than ever. Welcome to NVIDIA Studio—and your new, more creative process. RTX Studio laptops and desktops are purpose-built for creators, providing the best performance for video editing, 3D animation, graphic design, and photography.

For more information about NVIDIA Studio, visit: https://www.nvidia.com/studio.

CONNECT WITH US ON SOCIAL
Instagram: https://www.instagram.com/NVIDIACreators
Twitter: https://twitter.com/NVIDIACreators
Facebook: https://www.facebook.com/NVIDIACreators

The Four Stages of Intelligent Matter That Will Bring Us Iron Man’s ‘Endgame’ Nanosuit

Imagine clothing that can warm or cool you, depending on how you’re feeling. Or artificial skin that responds to touch, temperature, and wicks away moisture automatically. Or cyborg hands controlled with DNA motors that can adjust based on signals from the outside world.

Welcome to the era of intelligent matter—an unconventional AI computing idea directly woven into the fabric of synthetic matter. Powered by brain-based computing, these materials can weave the skins of soft robots or form microswarms of drug-delivering nanobots, all while reserving power as they learn and adapt.

Sound like sci-fi? It gets weirder. The crux that will guide us toward intelligent matter, said Dr. W.H.P. Pernice at the University of Münster and colleagues, is a distributed "brain" across the material's "body"—far more alien than the structure of our own minds.

Deep reinforcement learning will transform manufacturing as we know it

If you walk down the street shouting out the names of every object you see — garbage truck! bicyclist! sycamore tree! — most people would not conclude you are smart. But if you go through an obstacle course, and you show them how to navigate a series of challenges to get to the end unscathed, they would.

Most machine learning algorithms are shouting names in the street. They perform perceptive tasks that a person can do in under a second. But another kind of AI — deep reinforcement learning — is strategic. It learns how to take a series of actions in order to reach a goal. That’s powerful and smart — and it’s going to change a lot of industries.
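The "series of actions toward a goal" idea can be made concrete with tabular Q-learning, the textbook ancestor of deep RL (deep RL replaces the lookup table with a neural network so it scales to real sensor inputs). Below is a toy corridor environment; all parameters and names are chosen for illustration only.

```python
import random

def train_q_corridor(n=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a 1-D corridor of n cells: start at cell 0,
    actions move left (-1) or right (+1), and reaching cell n-1 pays
    reward +1. The agent learns a value for each (state, action) pair
    by repeatedly acting and updating toward the observed return."""
    q = {(s, a): 0.0 for s in range(n) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        while s != n - 1:
            # Epsilon-greedy: explore occasionally (or on ties), else
            # take the action with the higher learned value.
            if random.random() < eps or q[(s, -1)] == q[(s, 1)]:
                a = random.choice((-1, 1))
            else:
                a = -1 if q[(s, -1)] > q[(s, 1)] else 1
            s2 = min(max(s + a, 0), n - 1)
            r = 1.0 if s2 == n - 1 else 0.0
            # Bellman update; no bootstrapping from the terminal state.
            target = r if s2 == n - 1 else r + gamma * max(q[(s2, -1)], q[(s2, 1)])
            q[(s, a)] += alpha * (target - q[(s, a)])
            s = s2
    return q
```

After training, the greedy policy at every cell is "move right"—a learned multi-step strategy rather than a one-shot perception, which is the distinction the paragraph above draws.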

Two industries on the cusp of AI transformations are manufacturing and supply chain. The ways we make and ship stuff are heavily dependent on groups of machines working together, and the efficiency and resiliency of those machines are the foundation of our economy and society. Without them, we can’t buy the basics we need to live and work.

Self-Propelling Targeted Magneto-Nanobots for Deep Tumor Penetration and pH-Responsive Intracellular Drug Delivery

Circa 2020


Self-propelling magnetic nanorobots capable of intrinsic navigation in biological fluids, with enhanced pharmacokinetics and deeper tissue penetration, represent a promising strategy for targeted cancer therapy. Here, a multi-component magnetic nanobot designed by chemically conjugating magnetic Fe3O4 nanoparticles (NPs) and an anti-epithelial cell adhesion molecule antibody (anti-EpCAM mAb) to multi-walled carbon nanotubes (CNTs) loaded with an anticancer drug, doxorubicin hydrochloride (DOX), is reported. Autonomous propulsion of the nanobots and their external magnetic guidance are enabled by enriching the Fe3O4 NPs with dual catalytic-magnetic functionality. The nanobots propel at high velocities even in complex biological fluids. In addition, the nanobots preferentially release DOX in the intracellular lysosomal compartment of human colorectal carcinoma (HCT116) cells through the opening of the Fe3O4 NP gate.
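The pH-responsive behavior described—faster DOX release in the acidic lysosome (pH ~5) than at physiological pH 7.4—can be illustrated with a toy first-order release model whose rate constant switches sigmoidally with pH. Every constant below is invented for illustration; none are the paper's measured values.

```python
import math

def released_fraction(t_hours, ph, k_acidic=0.30, k_neutral=0.03,
                      ph_mid=6.0, steep=2.0):
    """Toy first-order drug-release model. The rate constant (per hour)
    interpolates sigmoidally from a slow neutral-pH value (k_neutral)
    to a fast acidic value (k_acidic) around ph_mid, mimicking a gate
    that opens in acidic compartments. Illustrative parameters only."""
    w = 1.0 / (1.0 + math.exp(steep * (ph - ph_mid)))  # ~1 in acid, ~0 at neutral pH
    k = k_neutral + (k_acidic - k_neutral) * w
    return 1.0 - math.exp(-k * t_hours)
```

Under these made-up constants, a 24-hour incubation releases most of the payload at lysosomal pH while leaking comparatively little at pH 7.4, which is the qualitative selectivity the abstract reports.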

Space Development Agency to launch five satellites aboard SpaceX rideshare

WASHINGTON — The U.S. Space Development Agency has five satellites riding on SpaceX’s Transporter-2 rideshare mission scheduled to launch June 25.

“There’s nothing in the space business that gets your blood pumping like the idea of a launch, especially if you’ve got multiple satellites,” a senior Space Development Agency (SDA) official told reporters June 22. “We’re really excited about what’s going to happen.”

Transporter-2 is expected to carry as many as 88 small satellites from commercial and government customers to a sun-synchronous polar orbit. SDA's five payloads include two pairs of satellites to demonstrate laser communications links, and one to demonstrate how data can be processed and analyzed autonomously aboard a satellite.