
It’s hard to see more than a handful of stars from Princeton University because the lights from New York City, Princeton, and Philadelphia keep our sky from ever getting pitch black. But stargazers who get into more rural areas can see hundreds of naked-eye stars — and a few smudgy objects, too.

The biggest smudge is the Milky Way itself, the billions of stars that make up our spiral galaxy, which we see edge-on. The smaller smudges don’t mean that you need glasses, but that you’re seeing tightly packed groups of stars. One of the best-known of these “clouds” or “clusters” — groups of stars that travel together — is the Pleiades, also known as the Seven Sisters. Clusters are stellar nurseries where thousands of stars are born from clouds of gas and dust and then disperse across the Milky Way.

For centuries, scientists have speculated about whether these clusters always form tight clumps like the Pleiades, spread over only a few dozen light-years.

A team of researchers working at Johannes Kepler University has developed an autonomous drone with a new type of technology to improve search-and-rescue efforts. In their paper published in the journal Science Robotics, the group describes their drone modifications. Andreas Birk of Jacobs University Bremen has published a Focus piece in the same journal issue outlining the work by the team in Austria.

Finding people lost (or hiding) in the forest is difficult because of the tree cover. People in planes and helicopters have difficulty seeing through the canopy to the ground below, where people might be walking or even lying down. The same problem exists for thermal applications—heat sensors cannot pick up readings adequately through the canopy. Efforts have been made to add drones to search-and-rescue operations, but they suffer from the same problems, because the pilots who remotely control them must also peer down at the ground through the canopy. In this new effort, the researchers have added new technology that both helps to see through the tree canopy and highlights people who might be under it.

The new technology is based on what the researchers describe as an airborne optical sectioning (AOS) algorithm—it uses the power of a computer to defocus occluding objects such as the tops of trees. The second part of the new device uses thermal imaging to highlight the heat emitted from a warm body. A machine-learning application then determines whether the heat signals come from humans, animals or other sources. The new hardware was then affixed to a standard autonomous drone. The computer in the drone uses both locational positioning to determine where to search and cues from the AOS and thermal sensors. If a possible match is made, the drone automatically moves closer to a target to get a better look. If its sensors indicate a match, it signals the research team, giving them the coordinates. In testing their newly outfitted drones over 17 field experiments, the researchers found they were able to locate 38 of 42 people hidden below tree canopies.
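To make the pipeline concrete, here is a minimal sketch of the three stages described above: integrating registered frames to defocus the canopy, thresholding the thermal image, and vetting hot spots with a classifier. This is not the Linz team's code; the function names, the 305 K threshold, and the crude stand-in classifier are all illustrative assumptions.

```python
import numpy as np

def aos_integrate(aligned_frames: np.ndarray) -> np.ndarray:
    """Airborne optical sectioning, in essence: average many frames that were
    registered to the same ground plane. Treetops land on different pixels in
    each frame and blur away, while the ground stays coherent."""
    return aligned_frames.mean(axis=0)

def hot_spots(thermal: np.ndarray, threshold_k: float = 305.0):
    """Pixel coordinates warmer than threshold_k (kelvin)."""
    ys, xs = np.where(thermal > threshold_k)
    return list(zip(ys.tolist(), xs.tolist()))

def looks_human(patch: np.ndarray) -> bool:
    """Stand-in for the machine-learning classifier that separates people from
    animals and other warm sources; here, a crude size-and-intensity check."""
    return patch.size >= 9 and float(patch.mean()) > 303.0

def scan(aligned_frames: np.ndarray, thermal: np.ndarray):
    """One scan: build the see-through-canopy image, then vet thermal hot spots."""
    integral_image = aos_integrate(aligned_frames)
    matches = []
    for y, x in hot_spots(thermal):
        patch = thermal[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
        if looks_human(patch):
            matches.append((y, x))   # coordinates reported to the team
    return integral_image, matches
```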



The Orbital Assembly Corporation, a space construction firm run by NASA veterans, announced in a press statement today, June 24, that it has successfully demonstrated its technology for developing the world’s first space hotel.

The company carried out the demonstration during the official opening of its Fontana, California, facility, which will serve as its main headquarters as it aims to make luxury space holidays a reality before 2030.

Large-scale space constructions built by semi-autonomous robots

Over the past few decades, roboticists and computer scientists have developed artificial systems that replicate biological functions and human abilities in increasingly realistic ways. This includes artificial intelligence systems, as well as sensors that can capture various types of sensory data.

When trying to understand properties of objects and how to grasp them or handle them, humans often rely on their sense of touch. Artificial sensing systems that replicate human touch can thus be of great value, as they could enable the development of better performing and more responsive robots or prosthetic limbs.

Researchers at Sungkyunkwan University and Hanyang University in South Korea have recently created an artificial tactile sensing system that mimics the way in which humans recognize objects in their surroundings via their sense of touch. This system, presented in a paper published in Nature Electronics, uses sensors to capture data associated with the tactile properties of objects.
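As a rough illustration of the idea (not the paper's actual sensor stack or model), the sketch below treats each touch as a small feature vector, say pressure, thermal conductivity, and surface roughness, and recognizes the object by nearest-neighbor comparison against calibration touches. All materials and values here are hypothetical.

```python
import numpy as np

# Hypothetical calibration touches: [pressure, thermal conductivity, roughness]
REFERENCE = {
    "glass":  np.array([0.80, 0.95, 0.10]),
    "fabric": np.array([0.30, 0.20, 0.70]),
    "wood":   np.array([0.55, 0.40, 0.45]),
}

def recognize(touch: np.ndarray) -> str:
    """Return the reference material whose tactile signature is closest."""
    return min(REFERENCE, key=lambda name: np.linalg.norm(REFERENCE[name] - touch))

print(recognize(np.array([0.50, 0.42, 0.50])))   # -> "wood"
```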

Without GPS, autonomous systems get lost easily. Now a new algorithm developed at Caltech allows autonomous systems to recognize where they are simply by looking at the terrain around them—and for the first time, the technology works regardless of seasonal changes to that terrain.

Details about the process were published on June 23 in the journal Science Robotics.

The general process, known as visual terrain-relative navigation (VTRN), was first developed in the 1960s. By comparing nearby terrain to high-resolution satellite images, autonomous systems can locate themselves.
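The core matching step can be sketched in a few lines: slide the vehicle's downward-looking image across a georeferenced satellite map and keep the offset with the highest normalized cross-correlation. The Caltech contribution is a learned, season-invariant representation layered on top of this idea; that model is not reproduced here, and the brute-force search below is purely illustrative.

```python
import numpy as np

def locate(observation: np.ndarray, sat_map: np.ndarray) -> tuple[int, int]:
    """Return the (row, col) offset in sat_map that best matches observation."""
    oh, ow = observation.shape
    obs = (observation - observation.mean()) / (observation.std() + 1e-9)
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(sat_map.shape[0] - oh + 1):
        for c in range(sat_map.shape[1] - ow + 1):
            win = sat_map[r:r + oh, c:c + ow]
            win = (win - win.mean()) / (win.std() + 1e-9)
            score = float((obs * win).mean())   # normalized cross-correlation
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# Sanity check on synthetic terrain: a patch cut from the map
# should be located exactly where it was taken from.
world = np.random.default_rng(1).random((64, 64))
assert locate(world[20:28, 30:38], world) == (20, 30)
```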

Harness the power of AI to quickly turn simple brushstrokes into realistic landscape images for backgrounds, concept exploration, or creative inspiration. 🖌️

The NVIDIA Canvas app lets you create as quickly as you can imagine.

NVIDIA GPUs accelerate your work with incredible boosts in performance. Less time staring at pinwheels of death means bigger workloads, more features, and finishing your work faster than ever. Welcome to NVIDIA Studio—and your new, more creative process. RTX Studio laptops and desktops are purpose-built for creators, providing the best performance for video editing, 3D animation, graphic design, and photography.

For more information about NVIDIA Studio, visit: https://www.nvidia.com/studio.


Imagine clothing that can warm or cool you, depending on how you’re feeling. Or artificial skin that responds to touch and temperature, and wicks away moisture automatically. Or cyborg hands controlled with DNA motors that can adjust based on signals from the outside world.

Welcome to the era of intelligent matter—an unconventional AI computing idea directly woven into the fabric of synthetic matter. Powered by brain-based computing, these materials can weave the skins of soft robots or form microswarms of drug-delivering nanobots, all while conserving power as they learn and adapt.

Sound like sci-fi? It gets weirder. The crux that’ll guide us towards intelligent matter, said Dr. W.H.P. Pernice at the University of Münster and colleagues, is a distributed “brain” across the material’s “body”—far more alien than the structure of our own minds.

If you walk down the street shouting out the names of every object you see — garbage truck! bicyclist! sycamore tree! — most people would not conclude you are smart. But if you go through an obstacle course and show them how to navigate a series of challenges to get to the end unscathed, they would.

Most machine learning algorithms are shouting names in the street. They perform perceptive tasks that a person can do in under a second. But another kind of AI — deep reinforcement learning — is strategic. It learns how to take a series of actions in order to reach a goal. That’s powerful and smart — and it’s going to change a lot of industries.
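A toy example makes the distinction concrete. The sketch below uses tabular Q-learning, the simple ancestor of the deep reinforcement learning described above, to learn a sequence of actions that walks a six-cell corridor to a goal. Everything here (the environment, rewards, and hyperparameters) is illustrative.

```python
import numpy as np

N_STATES, ACTIONS = 6, (-1, +1)          # cells 0..5, goal at cell 5
q = np.zeros((N_STATES, len(ACTIONS)))   # learned action values
alpha, gamma, eps = 0.5, 0.9, 0.1        # learning rate, discount, exploration

rng = np.random.default_rng(0)
for _ in range(500):                      # training episodes
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy: mostly exploit the best known action, sometimes explore
        a = rng.integers(2) if rng.random() < eps else int(q[s].argmax())
        s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0   # reward only at the goal
        q[s, a] += alpha * (r + gamma * q[s2].max() - q[s, a])
        s = s2

print(q.argmax(axis=1))   # learned policy: step right (1) in every non-terminal state
```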

Two industries on the cusp of AI transformations are manufacturing and supply chain. The ways we make and ship stuff are heavily dependent on groups of machines working together, and the efficiency and resiliency of those machines are the foundation of our economy and society. Without them, we can’t buy the basics we need to live and work.

Circa 2020


Self-propelling magnetic nanorobots capable of intrinsic navigation in biological fluids, with enhanced pharmacokinetics and deeper tissue penetration, represent a promising strategy for targeted cancer therapy. Here, a multi-component magnetic nanobot designed by chemically conjugating magnetic Fe3O4 nanoparticles (NPs) and an anti-epithelial cell adhesion molecule antibody (anti-EpCAM mAb) to multi-walled carbon nanotubes (CNT) loaded with an anticancer drug, doxorubicin hydrochloride (DOX), is reported. Autonomous propulsion of the nanobots and their external magnetic guidance are enabled by enriching the Fe3O4 NPs with dual catalytic-magnetic functionality. The nanobots propel at high velocities even in complex biological fluids. In addition, the nanobots preferentially release DOX in the intracellular lysosomal compartment of human colorectal carcinoma (HCT116) cells via the opening of the Fe3O4 NP gate.