
It’s not a stretch to say that stretchable sensors could change the way soft robots function and feel. In fact, they will be able to feel quite a lot.

Cornell researchers have created a fiber-optic sensor that combines low-cost LEDs and dyes, resulting in a stretchable “skin” that detects deformations such as pressure, bending and strain. This sensor could give soft robotic systems – and anyone using augmented reality technology – the ability to feel the same rich, tactile sensations that mammals depend on to navigate the natural world.
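
As a purely illustrative sketch of how readings from such a stretchable optical skin might be interpreted (this is not Cornell's actual method), assume the sensor reports light intensity on a few optical channels; a deformation can then be labeled by comparing the pattern of intensity change against reference signatures for pressing, bending and stretching. All channel counts and numbers below are invented.

```python
import numpy as np

# Hypothetical reference signatures: fractional intensity change per optical
# channel for each deformation type. These values are invented for
# illustration and are not measurements from the Cornell sensor.
SIGNATURES = {
    "press":   np.array([-0.30, -0.05, -0.05]),
    "bend":    np.array([-0.05, -0.25, +0.10]),
    "stretch": np.array([-0.15, -0.15, -0.15]),
}

def classify_deformation(baseline, reading):
    """Label a deformation by nearest-signature match of the fractional intensity change."""
    baseline = np.asarray(baseline, dtype=float)
    change = (np.asarray(reading, dtype=float) - baseline) / baseline
    return min(SIGNATURES, key=lambda name: np.linalg.norm(change - SIGNATURES[name]))

# Example: channel 1 dims sharply while the others barely change -> "press".
print(classify_deformation([1.0, 1.0, 1.0], [0.72, 0.96, 0.97]))
```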

Charles Darwin’s landmark opus “On the Origin of Species” ends with a beautiful summary of his theory of evolution: “There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.” In fact, scientists now know that most species that have ever existed are extinct.

Species extinction has, on the whole, been roughly balanced by the origination of new species over Earth’s history, with a few major temporary imbalances that scientists call extinction events. Scientists have long believed that mass extinctions create productive periods of evolution, or “radiations,” a model called “creative destruction.” A new study led by scientists affiliated with the Earth-Life Science Institute (ELSI) at Tokyo Institute of Technology used machine learning to examine the co-occurrence of fossil species and found that radiations and extinctions are rarely connected, and thus that mass extinctions likely rarely cause radiations of a comparable scale.
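
As a rough illustration of the kind of data such an analysis draws on (not the ELSI team’s actual pipeline), fossil occurrences can be tabulated as a presence/absence table of species across time intervals; from that table one can count how often pairs of species co-occur and read off originations and extinctions per interval. The table below is invented.

```python
import numpy as np

# Invented presence/absence table: rows are time bins, columns are species.
# A 1 means the species is found in that bin's fossil record.
presence = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
])

# Co-occurrence counts: how many time bins each pair of species shares.
cooccurrence = presence.T @ presence
print(cooccurrence)

# Originations and extinctions per bin: a species "originates" in the first
# bin where it appears and "goes extinct" after the last bin where it appears.
first_bin = presence.argmax(axis=0)
last_bin = presence.shape[0] - 1 - presence[::-1].argmax(axis=0)
originations = np.bincount(first_bin, minlength=presence.shape[0])
extinctions = np.bincount(last_bin, minlength=presence.shape[0])
print(originations, extinctions)
```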

Creative destruction is central to classic concepts of evolution. It seems clear that there are periods in which many species suddenly disappear, and many new species suddenly appear. However, radiations of a scale comparable to the mass extinctions, which this study therefore calls “mass radiations,” have received far less analysis than extinction events. This study compared the impacts of both extinction and radiation across the period for which fossils are available, the so-called Phanerozoic Eon. The Phanerozoic (from the Greek meaning “apparent life”) represents the most recent ~550-million-year period of Earth’s total ~4.5-billion-year history, and is significant to palaeontologists: before this period, most of the organisms that existed were microbes that didn’t easily form fossils, so the prior evolutionary record is hard to observe.

Researchers at Hokkaido University and Amoeba Energy in Japan have, inspired by the efficient foraging behavior of a single-celled amoeba, developed an analog computer for finding a reliable and swift solution to the traveling salesman problem—a representative combinatorial optimization problem.

Many real-world tasks, such as planning and scheduling in logistics and automation, are mathematically formulated as combinatorial optimization problems. Conventional digital computers, including supercomputers, cannot solve these problems in a practically acceptable time, because the number of candidate solutions they need to evaluate increases exponentially with the problem size—also known as combinatorial explosion. Thus, new computers called Ising machines, including quantum annealers, have been actively developed in recent years. These machines, however, require complicated pre-processing to convert each task into a form they can handle, and they risk presenting illegal solutions that do not meet some constraints and requests, both of which are major obstacles to practical application.
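
To see why exhaustive search breaks down, here is a minimal brute-force sketch of the traveling salesman problem in Python (the city coordinates are invented for illustration): with the start city fixed, the number of candidate tours grows factorially with the number of cities, which is the combinatorial explosion described above.

```python
from itertools import permutations
from math import dist, factorial

# Toy city coordinates, invented for illustration.
cities = [(0, 0), (2, 1), (5, 0), (3, 4), (1, 3)]

def route_length(order):
    """Total length of a closed tour visiting the cities in the given order."""
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + order[:1]))

# Exhaustive search: fix city 0 as the start and try every ordering of the rest.
best = min(permutations(range(1, len(cities))),
           key=lambda rest: route_length((0,) + rest))
print("best tour:", (0,) + best, "length:", round(route_length((0,) + best), 2))

# The number of candidate tours grows factorially with the number of cities.
for n in (10, 20, 30):
    print(n, "cities ->", factorial(n - 1), "tours to check")
```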

These obstacles can be avoided using the newly developed “electronic amoeba,” an analog computer inspired by a single-celled amoeboid organism. The amoeba is known to maximize nutrient acquisition efficiently by deforming its body. It has been shown to find an approximate solution to the traveling salesman problem (TSP), i.e., given a map of a certain number of cities, the problem of finding the shortest route that visits each city exactly once and returns to the starting city. This finding inspired Professor Seiya Kasai at Hokkaido University to mimic the dynamics of the amoeba electronically using an analog circuit, as described in the journal Scientific Reports. “The amoeba core searches for a solution under the electronic environment where resistance values at intersections of crossbars represent constraints and requests of the TSP,” says Kasai. Using the crossbars, the city layout can be easily altered by updating the resistance values without complicated pre-processing.
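
The circuit details are beyond this summary, but a common way to express the TSP constraints that such crossbars encode is a binary city-by-visit-order matrix: every city must be visited exactly once, and every step of the tour must hold exactly one city. The sketch below, with an invented distance matrix, shows that encoding and the feasibility checks it implies; it illustrates the problem formulation, not Kasai’s circuit.

```python
import numpy as np

# Invented symmetric distance matrix for 4 cities.
D = np.array([
    [0, 2, 9, 4],
    [2, 0, 6, 3],
    [9, 6, 0, 5],
    [4, 3, 5, 0],
], dtype=float)

def is_feasible(X):
    """X[c, t] == 1 means city c is visited at step t.
    A feasible tour visits every city once and fills every step with one city."""
    return (X.sum(axis=0) == 1).all() and (X.sum(axis=1) == 1).all()

def tour_length(X):
    """Length of the closed tour encoded by a feasible visit matrix."""
    order = X.argmax(axis=0)  # city visited at each step
    return sum(D[order[t], order[(t + 1) % len(order)]] for t in range(len(order)))

# Example: visit cities in the order 0 -> 1 -> 2 -> 3 -> back to 0.
X = np.eye(4)
print(is_feasible(X), tour_length(X))  # True 17.0
```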

“As sunlight hits the houses, bedroom windows adjust their opacity to allow the natural light to wake sleepy residents,” Terminus said on its website, which also highlights tranquil green spaces like rooftop gardens. “Once the light has filled the room, an AI virtual housekeeper named Titan selects your breakfast, matches your outfit with the weather, and presents a full schedule of your day.” The city, which includes offices, homes, public spaces and self-driving cars that move around under the ever watchful eye of AI, is due for completion in about three years, according to Terminus.


Web Summit conference hears plans by Danish architecture firm BIG and Chinese tech company Terminus to build an AI-run city in Chongqing.

Today we’re joined by Melodie Yashar — Designer, Researcher, Technologist, co-founder of the firm Space Exploration Architecture (SEArch+), Senior Research Associate with San Jose State University Research Foundation at NASA Ames Research Center, and an Associate Researcher within the UC Davis Center for Human/Robotics/Vehicle Integration and Performance (HRVIP). She also teaches undergraduate and graduate design at Art Center College of Design and is a 2019–2020 Future Space Leaders Fellow.

Melodie’s current work focuses on the relationship between advanced software and hardware systems for spaceflight, and she maintains ongoing research interests in the design of augmented environments, human-machine interaction, human performance studies, and space technology development.

As an undergraduate Melodie studied at UC Berkeley and at Art Center, and she holds graduate degrees in architecture and human-computer interaction with an emphasis in robotics from Columbia University and Carnegie Mellon University, respectively.

She has also served as a Visiting Professor at Pratt Institute, as a researcher within Carnegie Mellon’s Morphing Matter Lab, and as Design Director of Sonic Platforms.

A new robot created by researchers at Northwestern University looks and behaves like a tiny aquatic animal, and could serve a variety of functions, including moving things from place to place, catalyzing chemical reactions, delivering therapeutics and much more. This new soft robot honestly looks a heck of a lot like a lemon peel, but it’s actually a material made up of 90% water for the soft exterior, with a nickel skeleton inside that can change its shape in response to outside magnetic fields.

These robots are very small — only around the size of a dime — but they’re able to perform a range of tasks, including walking at the same speed as an average human, and picking up and carrying things. They work by either taking in or expelling water through their soft components, and can respond to light and magnetic fields thanks to their precise molecular design. Essentially, their molecular structure is crafted such that when they’re hit by light, the molecules that make them up expel water, causing the robot’s “legs” to stiffen like muscles.

Northwestern University researchers have developed a first-of-its-kind life-like material that acts as a soft robot. It can walk at human speed, pick up and transport cargo to a new location, climb up hills and even break-dance to release a particle.

Nearly 90% water by weight, the centimeter-sized robot moves without complex hardware, hydraulics or electricity. Instead, it is activated by light and walks in the direction of an external rotating magnetic field.

Resembling a four-legged octopus, the robot functions inside a water-filled tank, making it ideal for use in aquatic environments. The researchers imagine customizing the movements of miniature robots to help catalyze different chemical reactions and then pump out the valuable products. The robots also could be molecularly designed to recognize and actively remove unwanted particles in specific environments, or to use their mechanical movements and locomotion to precisely deliver bio-therapeutics or cells to specific tissues.

Gollum in “The Lord of the Rings,” Thanos in the “Avengers,” Snoke in “Star Wars,” the Na’vi in “Avatar”—we have all experienced the wonders of motion-capture, a cinema technique that tracks an actor’s movements and translates them into computer animation to create a moving, emoting—and maybe one day Oscar-winning—digital character.

But what many might not realize is that motion capture isn’t limited to the big screen; it extends into science. Behavioral scientists have been developing and using similar tools to study and analyze the posture and movement of animals under a variety of conditions. Traditional motion-capture approaches, however, require that the subject wear a complex suit with markers that let the computer “know” where each part of the body is in three-dimensional space. That might be okay for a professional actor, but animals tend to resist dressing up.

To solve the problem, scientists have begun combining motion capture with deep learning, a method that lets a computer essentially teach itself how to perform a task optimally, e.g., recognizing specific “key points” in videos. The idea is to teach the computer to track and even predict the movements or posture of an animal without the need for markers.
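
To make the idea concrete, here is a minimal sketch (in Python with PyTorch) of the markerless approach: a small convolutional network learns to turn a video frame into one heatmap per key point, and the peak of each heatmap gives the predicted location. The network layout, tensor shapes and dummy data below are assumptions for illustration, not the code of any particular published tool.

```python
import torch
import torch.nn as nn

NUM_KEYPOINTS = 4

# Small fully convolutional network: RGB frame in, one heatmap per key point out.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, NUM_KEYPOINTS, kernel_size=1),  # one heatmap per key point
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy batch: 8 RGB frames (64x64) and their "ground truth" heatmaps, which in
# practice would come from a small set of human-labelled frames.
frames = torch.rand(8, 3, 64, 64)
target_heatmaps = torch.rand(8, NUM_KEYPOINTS, 64, 64)

for step in range(5):
    pred = model(frames)
    loss = loss_fn(pred, target_heatmaps)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Read out the predicted (row, col) of each key point from the heatmap peaks.
with torch.no_grad():
    heat = model(frames[:1])[0]            # shape: (NUM_KEYPOINTS, 64, 64)
    flat_idx = heat.flatten(1).argmax(dim=1)
    coords = torch.stack((flat_idx // 64, flat_idx % 64), dim=1)
    print(coords)
```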