
In the search for less energy-hungry artificial intelligence, some scientists are exploring living computers.

By Jordan Kinard

Artificial intelligence systems, even those as sophisticated as ChatGPT, depend on the same silicon-based hardware that has been the bedrock of computing since the 1950s. But what if computers could be molded from living biological matter? Some researchers in academia and the commercial sector, wary of AI’s ballooning demands for data storage and energy, are focusing on a growing field known as biocomputing. This approach uses synthetic biology, such as miniature clusters of lab-grown cells called organoids, to create computer architecture. Biocomputing pioneers include Swiss company FinalSpark, which earlier this year debuted its “Neuroplatform”—a computer platform powered by human-brain organoids—that scientists can rent over the Internet for $500 a month.

More than one singularity.

The singularity could soon be upon us. The PESTLE framework, developed by this episode's guest Daniel Hulme, describes not one but six types of singularity that could occur: political, environmental, social, technological, legal and economic. @JonKrohnLearns and Daniel Hulme discuss how each of these singularities could bring good to the world, aligning with human interests and pushing progress forward. They also talk about neuromorphic computing, machine consciousness, and applying AI at work.

Watch the full interview “807: Superintelligence and the Six Singularities — with Dr. Daniel Hulme” here: https://www.superdatascience.com/807

Tesla is trying something new. The automaker is offering a three-year subscription bundle that includes Full Self-Driving (FSD) Supervised, Supercharging, and premium connectivity.

Tesla has had trouble selling its FSD package.

For years, CEO Elon Musk claimed that Tesla would keep increasing the price as the system got better, which he said would eventually make Tesla vehicles "appreciating assets."

Chinese technology firms started stockpiling Samsung’s high-bandwidth memory (HBM) chips earlier this year in anticipation that the United States would soon ban their export to China.

China accounted for about 30% of Samsung's HBM chip revenues in the first half of this year, driven by rising demand from tech giants such as Huawei and Baidu as well as new Chinese startups, Reuters reported, citing three unnamed sources. HBM chips are commonly used in artificial intelligence (AI) accelerators.

The Reuters report said most Chinese firms have sought the HBM2E chip in particular, which is one generation behind the HBM3 and two generations behind the most advanced HBM3E. China plans to produce the HBM2, the most mature and least advanced model, domestically.

But that doesn't mean Frosst is bullish on everything the industry is building. He doesn't think AI will ever reach artificial general intelligence, defined as human-level intelligence, a notably different narrative from that of some of Frosst's AI peers, such as Mark Zuckerberg and Jensen Huang. He added that if the industry does get there, it won't be for a long time.

“I don’t think we’re gonna have digital gods anywhere, anytime soon,” Frosst said. “And I think more and more people are kind of coming to that realization, saying this technology is incredible. It’s super powerful, super useful. It’s not a digital god. And that requires adjusting how you’re thinking about the technology.”

Frosst said Cohere tries to be realistic about what AI technology can and can't do and about which types of neural networks can provide the most value. Cohere's approach to building its business model is based on the research work of co-founder and CEO Aidan Gomez while at Google Brain. Gomez is, of course, known for his extensive AI research. He's most famous for co-writing the paper that brought AI the transformer model, which ushered in this generative AI era. But he also co-wrote a 2017 paper called One Model to Learn Them All. That research concluded that an all-encompassing large language model is more useful than small models trained for a specific task or on data from a specific industry, Frosst said.