
With an internal global ocean twice the size of Earth’s oceans combined, Jupiter’s moon Europa carries the potential for conditions suitable for life. But the frigid temperatures and the nonstop pummeling of the surface from Jupiter’s radiation make it a tricky target to explore: Mission engineers and scientists must design a spacecraft hardy enough to withstand the radiation yet sensitive enough to gather the science needed to investigate Europa’s environment.

The Europa Clipper orbiter will swoop around Jupiter on an elliptical path, dipping close to the moon on each flyby to conduct detailed reconnaissance. The science includes gathering measurements of the internal ocean, mapping the surface composition and its geology, and hunting for plumes of water vapor that may be venting from the icy crust.

In recent years, robots have gained artificial vision, touch, and even smell. “Researchers have been giving robots human-like perception,” says MIT Associate Professor Fadel Adib. In a new paper, Adib’s team is pushing the technology a step further. “We’re trying to give robots superhuman perception,” he says.

The researchers have developed a robot that uses radio waves, which can pass through walls, to sense occluded objects. The robot, called RF-Grasp, combines this powerful sensing with more traditional computer vision to locate and grasp items that might otherwise be blocked from view. The advance could one day streamline e-commerce fulfillment in warehouses or help a machine pluck a screwdriver from a jumbled toolkit.
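The core idea, coarse RF sensing that penetrates occlusions steering a camera that resolves fine detail, can be sketched in a few lines. The Python snippet below is a toy illustration on synthetic data; the `rf_localize` function and the simple region-of-interest filter are hypothetical stand-ins, not the actual RF-Grasp pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def rf_localize(true_pos, noise_std=0.03):
    """Hypothetical: coarse 3-D position of an RFID-tagged item.
    RF penetrates clutter, so this works even when the item is hidden."""
    return true_pos + rng.normal(0.0, noise_std, size=3)

def points_near(estimate, cloud, radius=0.15):
    """Keep camera points within `radius` meters of the RF estimate,
    i.e. use the RF reading to steer visual attention to the item."""
    return cloud[np.linalg.norm(cloud - estimate, axis=1) < radius]

true_item = np.array([0.40, -0.10, 0.05])              # meters, robot frame
clutter = rng.uniform(-0.5, 0.5, size=(500, 3))        # distractor points
item_pts = true_item + rng.normal(0.0, 0.01, (50, 3))  # points on the item
cloud = np.vstack([clutter, item_pts])                 # simulated depth cam

coarse = rf_localize(true_item)    # step 1: RF senses through occlusions
roi = points_near(coarse, cloud)   # step 2: vision refines locally
grasp_point = roi.mean(axis=0)     # step 3: naive grasp target in the ROI
print("RF estimate:", coarse, "-> grasp point:", grasp_point)
```

The design point the sketch tries to capture is the division of labor: RF gives a rough but occlusion-proof location, and vision only has to search the small neighborhood around it rather than the whole cluttered scene.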

The research will be presented in May at the IEEE International Conference on Robotics and Automation. The paper’s lead author is Tara Boroushaki, a research assistant in the Signal Kinetics Group at the MIT Media Lab. Her MIT co-authors include Adib, who is the director of the Signal Kinetics Group; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. Other co-authors include Junshan Leng, a research engineer at Harvard University, and Ian Clester, a Ph.D. student at Georgia Tech.

The researchers let the cell clusters assemble in the right proportions and then used micro-manipulation tools to move or eliminate cells — essentially poking and carving them into shapes like those recommended by the algorithm. The resulting cell clusters showed the predicted ability to move over a surface in a nonrandom way.

The team dubbed these structures xenobots. While the prefix was derived from the scientific name of the African clawed frog (Xenopus laevis), the species that supplied the cells, it also seemed fitting because of its relation to xenos, the ancient Greek for “strange.” These were indeed strange living robots: tiny masterpieces of cell craft fashioned by human design. And they hinted at how cells might be persuaded to develop new collective goals and assume shapes totally unlike those that normally develop from an embryo.

But that only scratched the surface of the problem for Levin, who wanted to know what might happen if embryonic frog cells were “liberated” from the constraints of both an embryonic body and researchers’ manipulations. “If we give them the opportunity to re-envision multicellularity,” Levin said, then his question was, “What is it that they will build?”

WASHINGTON — NASA’s OSIRIS-REx spacecraft will make one final close approach next week to the asteroid from which it collected samples before heading back to Earth.

On April 7, the spacecraft will pass 3.7 kilometers above the location on the asteroid Bennu called Nightingale where, in October, the spacecraft briefly touched down and collected as much as several hundred grams of material, now stored in the spacecraft.

Immediately after that sample collection maneuver, the mission had no plans to return to the vicinity of Bennu. However, NASA decided to make a final pass over the touchdown site to see what changes the sampling made to the Nightingale region, such as the creation of a crater.

AI plays an important role across our apps, from enabling AR effects to helping keep bad content off our platforms and better supporting our communities through our COVID-19 Community Help hub. As AI-powered services become more prevalent in everyday life, it is increasingly important to understand how AI systems may affect people around the world and how we can strive to ensure the best possible outcomes for everyone.

Several years ago, we created an interdisciplinary Responsible AI (RAI) team to help advance the emerging field of Responsible AI and spread the impact of such work throughout Facebook. The Fairness team is part of RAI, and works with product teams across the company to foster informed, context-specific decisions about how to measure and define fairness in AI-powered products.
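The post does not specify which metrics the Fairness team uses, and defining fairness is explicitly described as context-specific. As a generic illustration of what “measuring fairness” can mean in practice, the sketch below computes one standard metric, the demographic parity gap, in Python. The function and data are hypothetical examples, not Facebook’s internal methodology.

```python
import numpy as np

def demographic_parity_gap(predictions, groups):
    """Largest difference in positive-prediction rate between any
    two groups. Zero means all groups receive positive outcomes
    at the same rate under this (one of many possible) criteria."""
    rates = [predictions[groups == g].mean() for g in np.unique(groups)]
    return max(rates) - min(rates)

# Toy data: a binary classifier's decisions for users in two groups.
preds = np.array([1, 0, 1, 1, 0, 1, 0, 0])
grps = np.array(["a", "a", "a", "a", "b", "b", "b", "b"])
print(f"parity gap: {demographic_parity_gap(preds, grps):.2f}")  # 0.50
```

Different products reasonably call for different criteria (equalized error rates, calibration across groups, and so on), which is why the team frames measurement as a product-by-product decision rather than a single formula.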

In the future, Tesla’s Autopilot and Full Self-Driving suite are expected to handle challenging circumstances on the road with ease. These include inner-city driving, with factors such as pedestrians walking about, motorcyclists weaving around cars, and other potential edge cases. Once Autopilot can handle these cases confidently, the company could roll out ambitious projects such as Elon Musk’s Robotaxi Network.

Tesla’s FSD Beta, at least based on videos of the system in action, seems to be designed for maximum safety. Members of the first batch of testers for the FSD Beta have shared clips of the advanced driver-assist system handling even challenging inner-city streets in places such as San Francisco with caution. But even these difficult roads pale in comparison to the traffic situation in other parts of the world.

In Southeast Asian countries such as Vietnam, traffic tends to be very challenging, to the point where even experienced human drivers can feel anxious navigating inner-city roads. The same is true of countries like India and the Philippines, where road rules are loosely followed. In places such as these, Autopilot still has some way to go, as seen in a recently shared video from a Tesla Model X owner.

Summary: The BrainGate brain-machine interface can transmit signals at single-neuron resolution and with full broadband fidelity without physically tethering the user to a decoding system.

Source: Brown University.

Brain-computer interfaces (BCIs) are an emerging assistive technology, enabling people with paralysis to type on computer screens or manipulate robotic prostheses just by thinking about moving their own bodies. For years, investigational BCIs used in clinical trials have required cables to connect the sensing array in the brain to computers that decode the signals and use them to drive external devices.
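The article does not describe the decoding math, but a common textbook approach in this field is to fit a linear map from binned neural firing rates to intended movement. The Python sketch below, run on synthetic data, is a hedged illustration of that generic idea, not the BrainGate decoder.

```python
import numpy as np

# Toy sketch of the decoding step a BCI performs: map neural firing
# rates to intended 2-D cursor velocity with a linear decoder fit by
# least squares. Purely illustrative; array size is the only detail
# drawn from real systems (e.g., 96-channel electrode arrays).

rng = np.random.default_rng(1)
n_channels, n_samples = 96, 2000

W_true = rng.normal(size=(n_channels, 2))   # unknown tuning per channel
rates = rng.poisson(5.0, size=(n_samples, n_channels)).astype(float)
velocity = rates @ W_true + rng.normal(0.0, 0.5, size=(n_samples, 2))

# Fit decoder weights on "calibration" data.
W_hat, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# At run time, each new bin of firing rates becomes a cursor command.
new_rates = rng.poisson(5.0, size=(1, n_channels)).astype(float)
print("decoded velocity:", new_rates @ W_hat)
```

Whether the computation runs over a cable or, as in the new study, over a wireless link, this decoding step is the same; the advance reported here is in removing the physical tether, not in the decoder.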

New study points to potential widespread phagocytosis among green algae, suggests improved methodology in environmental microbiology.

New research suggests that the ability of green algae to eat bacteria is likely much more widespread than previously thought, a finding that could be crucial to environmental and climate science. The work, led by scientists at the American Museum of Natural History, Columbia University, and the University of Arizona, found that five strains of single-celled green algae consume bacteria when they are “hungry,” and only when those bacteria are alive. The study is published today in The ISME Journal.

“Traditionally, we think of green algae as being purely photosynthetic organisms, producing their food by soaking in sunlight,” said Eunsoo Kim, an associate curator at the American Museum of Natural History and one of the study’s corresponding authors. “But we’ve come to understand that there are potentially a number of species of green algae that also can eat bacteria when the conditions are right. And we’ve also found out just how finicky they are as eaters.”