
NeuRRAM, a new chip that runs computations directly in memory and can support a wide variety of AI applications, has been designed and built by an international team of researchers. What sets it apart is that it does all of this at a fraction of the energy consumed by general-purpose AI computing platforms.

The NeuRRAM neuromorphic chip brings AI a step closer to running on a broad range of edge devices, disconnected from the cloud. This means they can perform sophisticated cognitive tasks anywhere and anytime without relying on a network connection to a centralized server. Applications for this device abound in every corner of the globe and every facet of our lives. They range from smartwatches to VR headsets, smart earbuds, smart sensors in factories, and rovers for space exploration.

Not only is the NeuRRAM chip twice as energy efficient as state-of-the-art “compute-in-memory” chips, an innovative class of hybrid chips that run computations in memory, it also delivers results that are just as accurate as conventional digital chips. Conventional AI platforms are much bulkier and are typically constrained to large data servers operating in the cloud.
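
The piece above gives no circuit details, but the basic compute-in-memory idea it describes can be sketched: weights stay put as coarse analog levels in a memory array, and the array itself produces the matrix-vector products a neural network needs, trading a little numerical precision for far less data movement. The short simulation below illustrates that trade-off with invented numbers; it is not a model of NeuRRAM.

```python
# Rough illustration (not the NeuRRAM design): a compute-in-memory array stores
# weights as a limited number of analog conductance levels and produces the
# matrix-vector product in place, so the result can differ slightly from the
# exact digital computation.
import numpy as np

rng = np.random.default_rng(0)

weights = rng.normal(size=(64, 128))   # trained layer weights
inputs = rng.normal(size=128)          # input activation vector

# Map weights onto a coarse grid of conductance levels (e.g., 16 levels),
# mimicking the limited precision of analog memory cells.
levels = 16
w_min, w_max = weights.min(), weights.max()
codes = np.round((weights - w_min) / (w_max - w_min) * (levels - 1))
w_analog = codes / (levels - 1) * (w_max - w_min) + w_min

digital_out = weights @ inputs     # conventional digital result
in_memory_out = w_analog @ inputs  # result "computed" inside the memory array

print("max deviation from digital result:", np.abs(in_memory_out - digital_out).max())
```

Because the weights never shuttle between a separate memory and processor, most of the energy a conventional accelerator spends on memory traffic is avoided, which is the usual explanation for the efficiency gains such chips report; the accuracy question is whether the coarse analog levels (mimicked by the quantization step above) degrade results.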

Our perception of our bodies is not always correct or realistic, as any athlete or fashion-conscious person knows, but it’s a crucial factor in how we behave in society. Your brain is continuously preparing for movement while you play ball or get dressed so that you can move your body without bumping, tripping, or falling.

Humans develop models of their bodies as infants, and robots are starting to do the same. A team at Columbia Engineering revealed today that it has developed a robot that, for the first time, can learn a model of its whole body from scratch without any human aid. In a recent paper published in Science Robotics, the researchers explain how their robot built a kinematic model of itself and how it used that model to plan movements, accomplish goals, and avoid obstacles in a range of scenarios. Even damage to its body was automatically detected and corrected.
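
The paper's architecture is not described in this summary, so the sketch below only illustrates the general recipe of learning a self-model from a robot's own experience: babble random commands, record the resulting observations, fit a forward model, and query it when planning. The two-joint arm, feature choice, and numbers are all invented for the example.

```python
# Toy illustration of learning a self-model from experience alone. A simulated
# two-joint arm (invented for this example; not the Columbia robot) babbles
# random joint angles, records where its hand ends up, fits a forward model to
# that data, and then uses the model to predict the outcome of an untried pose.
import numpy as np

rng = np.random.default_rng(1)
link1, link2 = 1.0, 0.7   # true link lengths, unknown to the learned model

def true_hand_position(q):
    """The robot's real body; used only to generate 'sensor' observations."""
    x = link1 * np.cos(q[:, 0]) + link2 * np.cos(q[:, 0] + q[:, 1])
    y = link1 * np.sin(q[:, 0]) + link2 * np.sin(q[:, 0] + q[:, 1])
    return np.stack([x, y], axis=1)

def features(q):
    """Trigonometric features of the joint angles for the learned model."""
    return np.stack([np.cos(q[:, 0]), np.sin(q[:, 0]),
                     np.cos(q.sum(1)), np.sin(q.sum(1))], axis=1)

# 1) Motor babbling: try random joint angles and observe the hand position.
q = rng.uniform(-np.pi, np.pi, size=(500, 2))
hand = true_hand_position(q)

# 2) Fit a self-model from that experience (least squares on the features).
model, *_ = np.linalg.lstsq(features(q), hand, rcond=None)

# 3) Query the self-model for a pose the robot has never tried.
q_new = np.array([[0.3, -1.2]])
print("predicted:", features(q_new) @ model)
print("actual:   ", true_hand_position(q_new))
```

The least-squares fit works here only because this toy arm's geometry is linear in the hand-picked features; a real robot learning its whole body from scratch needs a far more flexible model, which is what makes the Columbia result notable.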

The world’s first portable brain-computer interface (BCI) is being developed by Blackrock and the University of Pittsburgh so that patients can take part in research trials at home. Rapid Robotics has released the fastest and easiest robotic arm setup yet, requiring no code at all. A new AI that uses light classifies data 1,000x faster.

AI News Timestamps:
0:00 First Portable Brain Computer Interface.
2:29 Rapid Robotics Fastest Robot Arm Setup.
5:03 New AI Using Light Is 1,000x Faster.


📝 The paper “3D Face Reconstruction with Dense Landmarks” is available here:
https://microsoft.github.io/DenseLandmarks/


AI-driven chatbot technology has allowed one woman to answer questions from beyond the grave at her own funeral, letting mourners dive into her fascinating life in a morbid but futuristic tribute. The technology was provided by her son Stephen Smith, who runs a company that creates “holographic conversational video experiences”, and allowed Holocaust campaigner Marina Smith MBE to be “present, in a sense”, the Telegraph reports.

Mrs Smith passed away in June of this year, and her funeral was held shortly afterwards in Nottinghamshire, UK. Because she had led a meaningful life educating people about the Holocaust, her family wished for her message to continue after her death, and the holographic experience during her funeral allowed just that.

The experience used StoryFile, an AI conversational bot that uses 20 different cameras and recordings of the subject to create a digital, holographic clone that can be interacted with. While the experience was powered by AI, the answers given to questions were entirely Mrs Smith’s own words.


I have been thinking about death lately. Not a lot — a little. Possibly because I recently had a month-long bout of Covid-19, and because I read a recent story about the passing of the actor Ed Asner, famous for his role as Lou Grant in “The Mary Tyler Moore Show.” More specifically, the story of his memorial service, where mourners were invited to “talk” with Asner through an interactive display featuring video and audio that he recorded before he died. The experience was created by StoryFile, a company with the mission to make AI more human. According to the company, its proprietary technology and AI can match pre-recorded answers with future questions, allowing for a real-time yet asynchronous conversation.

In other words, it feels like a Zoom conversation with a living person.
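
How StoryFile matches questions to recordings is not spelled out here, and the system is proprietary. A minimal sketch of the described behavior, retrieving the closest pre-recorded answer for an incoming question and playing that clip back, might look like the following; the prompts, clip names, and TF-IDF retrieval are invented purely for illustration.

```python
# Minimal sketch of the behavior described above: match an incoming question to
# the closest pre-recorded answer and play that clip. StoryFile's actual system
# is proprietary; the prompts, clip names, and retrieval method here are
# assumptions made for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Each recorded clip is indexed by the question it originally answered.
recorded = {
    "Where did you grow up?": "clip_childhood.mp4",
    "Why did you spend your life teaching about history?": "clip_teaching.mp4",
    "What advice would you give your grandchildren?": "clip_advice.mp4",
}

prompts = list(recorded)
vectorizer = TfidfVectorizer().fit(prompts)
prompt_vectors = vectorizer.transform(prompts)

def answer(question: str) -> str:
    """Return the pre-recorded clip whose original prompt best matches the question."""
    scores = cosine_similarity(vectorizer.transform([question]), prompt_vectors)[0]
    return recorded[prompts[scores.argmax()]]

print(answer("What should your grandchildren remember?"))  # expected: clip_advice.mp4
```

A production system would layer on speech recognition, stronger semantic matching, and a fallback for questions with no good recorded answer, but the asynchronous structure is the same: every answer exists before the question is asked.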