
The New York Times reports that Google founders Larry Page and Sergey Brin have discussed the company’s response to ChatGPT, and that Google plans to launch over 20 AI products this year, including a demo of its own search chatbot.

The recent launch of OpenAI’s AI chatbot ChatGPT has raised alarms within Google, according to reports from The New York Times.

As recently as December, we’d heard that Google executives were worried that, despite the company’s heavy investment in AI technology, moving too fast to roll it out could harm its reputation.


Expect to see a Google search AI chatbot demo this year.

The wetware in a casket of bone that we each carry on our shoulders is 1 million times more efficient than the AI models run by services like ChatGPT, Stable Diffusion, or DALL-E.

In this TechFirst with John Koetsier we chat for a second time with Gordon Wilson, CEO of Rain AI, which is building a neuromorphic artificial brain that simulates the structure of our biological brains and aims for 10,000 to 100,000 times greater energy efficiency than current AI architectures.

We also discuss “mortal computation” and a radical co-design of the hardware and software for AI systems, which could lead to much more efficient (and more effective) smart tools, machines, and companions.

Links:
TechFirst transcripts: https://johnkoetsier.com/category/tech-first/

Forbes columns: https://www.forbes.com/sites/johnkoetsier/

Keep in touch: https://twitter.com/johnkoetsier

OpenAI has sparked an explosion of funding and software development around artificial-intelligence software that understands human language. While the technology still makes plenty of mistakes, new applications are coming out in droves, from tools that help marketers write copy to audio chatbots that may be able to negotiate discounts for customers of companies like Comcast.

Last week, subscribers of The Information joined a conference call about the year ahead in AI with Noam Shazeer, CEO of Character, which is developing chatbots similar to OpenAI’s ChatGPT (Shazeer co-authored a seminal research paper on the subject while working at Google), and Clement Delangue, CEO of Hugging Face, which runs a GitHub-like service where software engineers store their machine learning models.

Connor Leahy from Conjecture joins the podcast to discuss AI progress, chimps, memes, and markets. Learn more about Connor’s work at https://conjecture.dev.

Timestamps:
00:00 Introduction.
01:00 Defining artificial general intelligence.
04:52 What makes humans more powerful than chimps?
17:23 Would AIs have to be social to be intelligent?
20:29 Importing humanity’s memes into AIs.
23:07 How do we measure progress in AI?
42:39 Gut feelings about AI progress.
47:29 Connor’s predictions about AGI.
52:44 Is predicting AGI soon betting against the market?
57:43 How accurate are prediction markets about AGI?

Summary: Combining neuroimaging and EEG data, researchers recorded the neural activity of people listening to a piece of music. Using machine learning, they translated the data to reconstruct and identify the specific piece of music the test subjects were listening to.

Source: University of Essex.

A new technique for monitoring brain waves can identify the music someone is listening to.

Summary: A newly developed machine learning model can predict the words a person is about to speak based on their neural activity recorded by a minimally invasive neuroprosthetic device.

Source: HSE.

Researchers from HSE University and the Moscow State University of Medicine and Dentistry have developed a machine learning model that can predict the word about to be uttered by a subject based on their neural activity recorded with a small set of minimally invasive electrodes.

Boston Dynamics’ Atlas—the world’s most advanced humanoid robot—is learning some new tricks. The company has finally given Atlas some proper hands, and in Boston Dynamics’ latest YouTube video, Atlas is attempting to do some actual work. It also released another behind-the-scenes video showing some of the work that goes into Atlas. And when things don’t go right, we see some spectacular slams the robot takes in its efforts to advance humanoid robotics.

As a humanoid robot, Atlas has mostly been focused on locomotion, starting with walking in a lab, then walking on every kind of unstable terrain imaginable, then doing some sick parkour tricks. Locomotion is all about the legs, though, and the upper half seemed mostly like an afterthought, with the arms only used to swing around for balance. Atlas previously didn’t even have hands; the last time we saw it, there were only two incomplete-looking ball grippers at the end of its arms.

This newest iteration of the robot has actual grippers. They’re simple clamp-style hands with a wrist and a single moving finger, but that’s good enough for picking things up. The goal of this video is moving “inertially significant” objects—not just picking up light boxes, but objects that are so heavy they can throw Atlas off-balance. This includes things like a big plank, a bag full of tools, and a barbell with two 10-pound weights. Atlas is learning all about those “equal and opposite forces” in the world.