AI reveals that mice’s faces express a range of emotions — just like humans

AI has revealed that mice have a range of facial expressions that show how they feel — offering fresh clues about how emotional responses arise in human brains.

Scientists at the Max Planck Institute of Neurobiology in Germany made the discovery by recording the faces of lab mice when they were exposed to different stimuli, such as sweet flavors and electric shocks. The researchers then used machine learning algorithms to analyze how the rodents’ faces changed when they experienced different feelings.

These brain-boosting devices could give us intelligence superpowers

Circa 2017


The era of merging our minds with technology has begun. Already, we can hack the brain to treat diseases such as Parkinson’s or help paralyzed people move again. But what if you could install a chip in your head that would not only fix any health issues, but could amp up your brainpower — would you remember every word said during a meeting, finish crossword puzzles faster, drive better thanks to enhanced senses, or pick up a new language before your next trip?

That’s the future envisioned by Elon Musk, the Tesla and SpaceX CEO who recently announced Neuralink, a new company dedicated to blending human brains with computers. In Musk’s view, we’ll have to keep pace with ever-smarter artificial intelligence by implanting a “neural lace,” a sci-fi-inspired brain-machine interface that will make us smarter.

“Under any rate of advancement in AI, we will be left behind by a lot,” Musk said last year. “The benign situation with ultra-intelligent AI is that we would be so far below in intelligence we’d be like a pet, or a house cat.”

How a New AI Translated Brain Activity to Speech With 97 Percent Accuracy

That vision may have come a step closer after researchers at the University of California, San Francisco demonstrated that they could translate brain signals into complete sentences with error rates as low as three percent, which is below the threshold for professional speech transcription.

While we’ve been able to decode parts of speech from brain signals for around a decade, so far most of the solutions have been a long way from consistently translating intelligible sentences. Last year, researchers used a novel approach that achieved some of the best results so far by using brain signals to animate a simulated vocal tract, but only 70 percent of the words were intelligible.

The key to the improved performance achieved by the authors of the new paper in Nature Neuroscience was their realization that there were strong parallels between translating brain signals to text and machine translation between languages using neural networks, which is now highly accurate for many languages.
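That framing — treat the neural recording as a “source language” and the sentence as a “target language” — maps naturally onto an encoder-decoder architecture. The toy sketch below shows only the data flow of that idea, not the authors’ model: the dimensions, vocabulary, and random weights are all hypothetical stand-ins, and a real system would learn its weights from paired recordings and transcripts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the real model is far larger).
N_CHANNELS = 8                 # simulated electrode channels
HIDDEN = 16
VOCAB = ["<eos>", "the", "mouse", "ran"]   # tiny illustrative vocabulary
MAX_LEN = 5

# Encoder: summarize a (time, channels) neural recording into one
# context vector, here via a random projection plus mean-pooling.
W_enc = rng.normal(size=(N_CHANNELS, HIDDEN))

def encode(signals):
    return np.tanh(signals @ W_enc).mean(axis=0)   # shape (HIDDEN,)

# Decoder: emit one word at a time, conditioned on the context and the
# previously emitted word, until "<eos>" or MAX_LEN words.
W_ctx = rng.normal(size=(HIDDEN, len(VOCAB)))
W_prev = rng.normal(size=(len(VOCAB), len(VOCAB)))

def decode(context):
    words, prev = [], np.zeros(len(VOCAB))
    for _ in range(MAX_LEN):
        logits = context @ W_ctx + prev @ W_prev
        idx = int(np.argmax(logits))
        if VOCAB[idx] == "<eos>":
            break
        words.append(VOCAB[idx])
        prev = np.eye(len(VOCAB))[idx]   # one-hot of the emitted word
    return words

recording = rng.normal(size=(50, N_CHANNELS))   # 50 time steps of fake data
sentence = decode(encode(recording))
print(sentence)
```

With untrained random weights the output is of course gibberish; the point is that once signals and sentences share this pipeline shape, the same training machinery used for language-to-language translation can be applied.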

A Robot Stand-Up Comedian Learns The Nuts And Bolts Of Comedy

Social roboticist Heather Knight sees robots and entertainment as a research-rich coupling. So she programmed a charming humanoid robot named DATA with jokes, and equipped it with sensors and algorithmic capabilities to help with timing and gauging a crowd. Then Knight and DATA hit the road on an international robot stand-up comedy tour. Their act landed stage time at a TED conference, and Knight was profiled in Forbes 30 Under 30. Watching DATA perform is much like watching an amateur stand-up comedian cutting their teeth at an open-mic night, doing light comedy with a sweet but wooden delivery.

Knight’s goal is specific:

AI techniques used to improve battery health and safety

Researchers have designed a machine learning method that can predict battery health with 10x higher accuracy than current industry standard, which could aid in the development of safer and more reliable batteries for electric vehicles and consumer electronics.

The researchers, from Cambridge and Newcastle Universities, have designed a new way to monitor batteries by sending electrical pulses into them and measuring the response. The measurements are then processed by a machine learning algorithm to predict the battery’s health and useful lifespan. Their method is non-invasive and is a simple add-on to any existing battery system. The results are reported in the journal Nature Communications.

Predicting the state of health and the remaining useful lifespan of lithium-ion batteries is one of the big problems limiting widespread adoption of electric vehicles: it’s also a familiar annoyance to mobile phone users. Over time, battery performance degrades via a complex network of subtle chemical processes. Individually, each of these processes doesn’t have much of an effect on battery performance, but collectively they can severely degrade a battery’s performance and shorten its lifespan.
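The pulse-and-measure idea can be sketched in a few lines. Everything below is a simulated stand-in, not the paper’s pipeline: the real work feeds rich electrical-response measurements into a machine-learning model, while this toy fakes a voltage-relaxation curve whose decay tracks state of health (SoH), extracts one scalar feature, and fits a plain least-squares line in place of the more sophisticated regression a real system would use.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)   # time axis after the pulse, arbitrary units

def pulse_response(soh):
    """Simulated voltage relaxation after a current pulse: in this toy
    model, healthier cells (soh near 1.0) relax more slowly."""
    tau = 0.05 + 0.25 * soh
    return np.exp(-t / tau) + rng.normal(scale=0.01, size=t.size)

def feature(v):
    # Area under the relaxation curve: one simple scalar summary.
    return v.sum() * (t[1] - t[0])

# "Train" on cells of known health, then predict a new cell's SoH.
train_soh = np.linspace(0.6, 1.0, 20)
X = np.array([feature(pulse_response(s)) for s in train_soh])
slope, intercept = np.polyfit(X, train_soh, 1)

new_cell = feature(pulse_response(0.8))   # a cell at 80% health
predicted = slope * new_cell + intercept
print(f"predicted SoH: {predicted:.2f}")
```

The non-invasive appeal is visible even here: the prediction needs only the cell’s electrical response to a brief pulse, not a teardown or a full charge-discharge cycle.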

Alphabet’s DeepMind masters Atari games

In order to better solve complex challenges at the dawn of the third decade of the 21st century, Alphabet Inc. has tapped into relics dating to the 1980s: video games.

The parent company of Google reported this week that its DeepMind Technologies artificial intelligence unit has successfully learned how to play 57 Atari video games. And the AI plays better than any human.

Atari, creator of Pong, one of the first successful video games of the 1970s, went on to popularize many of the great early classic video games into the 1990s. Video games are commonly used in AI projects because they require algorithms to navigate increasingly complex paths and options, all while encountering changing scenarios, threats and rewards.
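The trial-and-error loop behind game-playing agents can be shown at miniature scale. DeepMind’s Atari systems learn from raw pixels with deep neural networks; the sketch below swaps that for tabular Q-learning on a hypothetical five-cell corridor (start at cell 0, reward at cell 4), which is not DeepMind’s method but illustrates the same reward-driven learning idea.

```python
import random

random.seed(0)

N, GOAL = 5, 4
ACTIONS = (-1, +1)           # move left / move right
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration

def step(s, a):
    s2 = min(max(s + a, 0), N - 1)       # walls clamp movement
    return s2, (1.0 if s2 == GOAL else 0.0)

for _ in range(500):                     # training episodes
    s = 0
    while s != GOAL:
        # Epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore at random.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        # Q-learning update: nudge the estimate toward the reward
        # plus the discounted best value of the next state.
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

policy = [max(ACTIONS, key=lambda x: Q[(s, x)]) for s in range(N - 1)]
print(policy)
```

After training, the greedy policy moves right in every cell: the agent has discovered the reward purely by acting, observing, and updating its value estimates, the same loop that, scaled up enormously, lets an agent master an Atari game.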

This Startup’s Computer Chips Are Powered by Human Neurons

As of right now, Cortical’s mini-brains have less processing power than a dragonfly brain. The company is looking to get its mouse-neuron-powered chips to be capable of playing a game of “Pong,” as CEO Hon Weng Chong told Fortune, following in the footsteps of AI company DeepMind, which used the game to test the power of its AI algorithms back in 2013.

“What we are trying to do is show we can shape the behavior of these neurons,” Chong told Fortune.

READ MORE: A startup is building computer chips using human neurons [Fortune].

Europe Gets One Step Closer To AI-Piloted Drones & eVTOL Aircraft

The biggest change worldwide in the last decade was probably the smartphone revolution, but overall, cities themselves still look pretty much the same. In the decade ahead, cities will change a lot more. Most of our regular readers probably think I am referring to how autonomous vehicle networks will start taking over and how owning a car will become closer to owning a horse. However, the real answer isn’t just the autonomous vehicles on the roads — they will likely also compete with autonomous eVTOL aircraft carrying people between hubs.

Today, the European Union is moving one step closer to making this second part a reality. Together with Daedalean, an autonomous flight company we have covered in the past, EASA published a new joint report covering “The Learning Assurance for Neural Networks.”