Recent developments like DALL-E 2 and LaMDA are impressive and seem poised for real-world impact. Is AI finally ready to change the world?

Whether you love, fear, or have mixed feelings about the future of artificial intelligence, the cultural fixation on the subject over the past decade has made it feel like the technology’s meteoric impact is just around the corner. The problem is that it is always just around the corner, yet never seems to arrive. Many hype-filled years have passed since the releases of Ex Machina (2014) and Westworld (2016), but it feels like we are still waiting on AI’s big splash. However, a handful of recent developments—specifically, OpenAI’s unveiling of GPT-3 and DALL-E 2, and Google’s LaMDA controversy—have unleashed a new wave of excitement, and terror, around the possibility that AI’s game-changing moment is finally here.

There are several reasons why it feels as though AI projects have taken so long to bear fruit. One is that pop culture focuses almost exclusively on the technology’s possible endgames rather than its broader journey. This isn’t much of a surprise. When we stream the latest sci-fi movie or binge Black Mirror episodes, we want to see killer robots and computer-chip brain implants. No one is buying a ticket to a movie about the slow, incremental rollout of existing technology, not unless it mutates and starts killing within the first 30 minutes. But while AI’s more futuristic forms are naturally the most entertaining, and provide an endless source of material for screenwriters, anyone who based their expectations for AI on Blade Runner has to be feeling disappointed by now.

“If cells are dreaming, [these images] are what the cells are dreaming about,” neuroscientist Carlos Ponce told The Atlantic. “It exposes the visual vocabulary of the brain, in a way that’s unbiased by our anthropomorphic perspective.”

Some neurons responded to images that vaguely resembled objects the scientists recognized, suggesting that the researchers had identified specific neurons corresponding to particular real-world objects. A blur that resembled a monkey’s face accompanied by a red blotch may have corresponded to another monkey in the lab that wore a red collar. Another blur, resembling a human in a surgical mask, may have represented the woman who fed and cared for the lab’s monkeys while wearing a similar mask.

Other images that the monkey neurons responded to the most were less realistic, instead taking the form of various streaks and splotches of color, according to The Atlantic.

Circa 2020: This shape-changing metal discovery could bring us closer to foglet machines.


Metal halide perovskites (MHPs) are frontrunners among solution-processable materials for lightweight, large-area and flexible optoelectronics. These materials, with the general chemical formula AMX₃, are structurally complex, undergoing multiple polymorph transitions as a function of temperature and pressure. In this review, we provide a detailed overview of polymorphism in three-dimensional MHPs as a function of composition, with A = Cs⁺, MA⁺, or FA⁺; M = Pb²⁺ or Sn²⁺; and X = Cl⁻, Br⁻, or I⁻. In general, perovskites adopt a highly symmetric cubic structure at elevated temperatures. With decreasing temperature, the corner-sharing MX₆ octahedra tilt with respect to one another, resulting in multiple polymorph transitions to lower-symmetry tetragonal and orthorhombic structures. The temperatures at which these phase transitions occur can be tuned via different strategies, including crystal size reduction, confinement in scaffolds and (de-)pressurization.
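To make that compositional space concrete, here is a minimal sketch that enumerates every AMX₃ combination the review covers. The ion lists come straight from the abstract; the script itself is purely illustrative.

```python
from itertools import product

# Ion choices named in the review for the general formula AMX3
A_SITE = ["Cs", "MA", "FA"]   # cesium, methylammonium, formamidinium
M_SITE = ["Pb", "Sn"]
X_SITE = ["Cl", "Br", "I"]

# Enumerate the 3 x 2 x 3 = 18 three-dimensional MHP compositions
compositions = [f"{a}{m}{x}3" for a, m, x in product(A_SITE, M_SITE, X_SITE)]
print(len(compositions))    # 18
print(compositions[:3])     # ['CsPbCl3', 'CsPbBr3', 'CsPbI3']
```

Per the abstract, each such composition generally moves from a cubic structure at high temperature through lower-symmetry tetragonal and orthorhombic polymorphs as it cools.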

Eavesdropping on the earliest conversations between tissues in an emerging life could tell us a lot about organ growth, fertility, and disease in general. It could help prevent early miscarriages, or even tell us how to grow whole replacement organs from scratch.

In a monumental leap for stem cell research, researchers from the University of Cambridge in the UK have developed a living model of a mouse embryo, complete with fluttering heart tissue and the beginnings of a brain.

The research builds on the recent success of a team, including some of the same scientists, that pushed the limits of mimicking mouse embryonic development using stem cells that had never seen the inside of a mouse womb.

Artificial intelligence (AI) systems must understand visual scenes in three dimensions to interpret the world around us. For that reason, images play an essential role in computer vision, significantly affecting the quality and performance of vision systems. Unlike widely available 2D data, 3D data is rich in scale and geometry information, giving machines a fuller understanding of their environment.

Data-driven 3D modeling, or 3D reconstruction, is a growing computer vision domain increasingly in demand from industries such as augmented reality (AR) and virtual reality (VR). Rapid advances in implicit neural representations are also opening up exciting new possibilities for virtual reality experiences.
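As a concrete illustration of what an implicit neural representation looks like, here is a minimal PyTorch sketch that fits a small MLP to the signed distance function of a unit sphere. The architecture, layer sizes, and toy sphere target are illustrative assumptions, not any specific published method.

```python
import torch
import torch.nn as nn

# A minimal implicit neural representation: instead of storing a voxel grid
# or mesh, a small MLP maps a 3D coordinate to a signed distance
# (negative inside the surface, positive outside).
class ImplicitSDF(nn.Module):
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        return self.net(xyz)

model = ImplicitSDF()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy supervision: fit the SDF of a unit sphere, ||p|| - 1, from random points.
for step in range(200):
    points = torch.randn(1024, 3)
    target = points.norm(dim=-1, keepdim=True) - 1.0
    loss = nn.functional.mse_loss(model(points), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Query the learned function at any 3D point, at any resolution.
print(model(torch.tensor([[0.0, 0.0, 0.0]])))  # should approach -1 (origin is 1 unit inside)
```

Because the scene lives in the network’s weights rather than in a fixed grid, the surface can be queried at arbitrary resolution, which is part of what makes these representations attractive for AR and VR.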

Even during such routine tasks as a daily stroll, our brain sometimes needs to shift gears, switching from navigating the city to jumping out of the way of a bike or to crossing the street to greet a friend. These switches pose a challenge: How do the brain’s circuits deal with such dynamic and abrupt changes in behavior? A Weizmann Institute of Science study on bats, published today in Nature, suggests an answer that does not fit the classical thinking about brain function.

“Most research projects focus on one type of behavior at a time, so little is known about the way the brain handles dynamically changing behavioral needs,” says Prof. Nachum Ulanovsky of Weizmann’s Brain Sciences Department. In the new study, he and his team designed an experimental setup that mimicked real-life situations in which animals or humans rapidly switch from one behavior to another—for example, from navigation to avoiding a predator or a car crash. Graduate students Dr. Ayelet Sarel, Shaked Palgi and Dan Blum led the study, in collaboration with postdoctoral fellow Dr. Johnatan Aljadeff. The study was supervised by Ulanovsky together with Associate Staff Scientist Dr. Liora Las.

Using miniature wireless recording devices, the researchers monitored neurons in the brains of pairs of bats that had to avoid colliding with one another while flying toward each other along a 135-meter-long tunnel at a high speed of 7 meters per second. This amounted to a relative speed—that is, the rate at which the distance between the bats closed, or the sum of both bats’ speeds—of 14 meters per second, or about 50 kilometers an hour.
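As a quick sanity check, a few lines of Python reproduce the closing-speed figures quoted above; the time-to-meet line assumes the bats start at opposite ends of the tunnel, which the article does not state.

```python
# Closing-speed arithmetic from the bat experiment described above
bat_speed_mps = 7.0
closing_speed_mps = 2 * bat_speed_mps         # 14 m/s: the gap shrinks at the sum of both speeds
closing_speed_kmh = closing_speed_mps * 3.6   # 50.4 km/h, i.e. "about 50 kilometers an hour"

# Hypothetical extra: time until the bats meet if they start 135 m apart
tunnel_length_m = 135.0
time_to_meet_s = tunnel_length_m / closing_speed_mps  # ~9.6 s

print(closing_speed_mps, closing_speed_kmh, round(time_to_meet_s, 1))
```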