
Taichi could help make artificial general intelligence a reality.

Researchers at Tsinghua University in China have developed a new artificial intelligence (AI) chip that uses light instead of electricity to process data. The highly energy-efficient photonic chip, called Taichi, could accelerate the development of advanced computing solutions.

The highest-level Hidra chip will power Apple’s highest-end desktop, the Mac Pro. To give you an idea of how powerful the chip might be, Apple is working to support the Mac Pro with 512 GB of RAM. Unlike Intel-based Macs, which allowed additional memory to be added later, Apple Silicon integrates RAM directly into the chip package, so the memory options will have to come from Apple itself.

With the M4 chip nearing production, Apple could unveil its new and updated Mac lineup later this year and follow it up with releases through 2025, the Bloomberg report added.

A newer lineup also gives the Cupertino-based company an opportunity to join the league of tech giants working on AI. Compared to Microsoft and Google, which have already released their AI-powered products, Apple has been a laggard in the AI space.

The robot is controlled by a neural network trained in deep reinforcement learning via simulation.

Students at ETH Zurich are developing a robot that moves around in ultra-low gravity by hopping like a human, built for exploring challenging terrains.
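The control approach described above, training a policy in simulation before deploying it on hardware, can be illustrated with a toy example. The sketch below is an assumption-laden simplification, not ETH Zurich's actual setup: it uses a one-parameter "policy" in a toy ballistics simulator and simple hill-climbing policy search in place of deep reinforcement learning, but it shows the same learn-in-simulation loop.

```python
# Toy sketch (NOT the ETH Zurich system): learn a takeoff thrust in a
# simulated low-gravity hop task via hill-climbing policy search.
import numpy as np

GRAVITY = 0.1  # toy "ultra-low gravity" constant, arbitrary units


def simulate_hop(thrust: float) -> float:
    """Toy simulator: hop height for a given takeoff thrust (v^2 = 2gh)."""
    velocity = max(thrust, 0.0)
    return velocity ** 2 / (2 * GRAVITY)


def reward(params: np.ndarray, target_height: float) -> float:
    """Negative error between the achieved and desired hop height."""
    return -abs(simulate_hop(float(params[0])) - target_height)


def train(target_height: float, iters: int = 500, seed: int = 0) -> np.ndarray:
    """Hill-climbing policy search: keep a random perturbation only if it
    improves the reward in simulation."""
    rng = np.random.default_rng(seed)
    params = np.array([0.5])  # initial thrust guess
    best = reward(params, target_height)
    for _ in range(iters):
        candidate = params + 0.1 * rng.standard_normal(1)
        score = reward(candidate, target_height)
        if score > best:
            params, best = candidate, score
    return params


params = train(target_height=2.0)
print(simulate_hop(float(params[0])))  # close to the 2.0 target
```

A real system replaces the one-line simulator with a physics engine and the single parameter with a neural network trained by deep reinforcement learning, but the pattern is the same: iterate cheaply in simulation, then transfer the learned controller to the robot.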

The design is intended to provide more computing power, bandwidth, and memory capacity to the chips. Initially, Meta aims to use the chips for inference functions, such as ranking content and generating responses to user prompts. Over time, Meta plans to use them for more intensive operations, such as training AI models on large datasets.

A shift to its own chips could help Meta save millions in energy costs every year, alongside the billions in capital expenditure otherwise needed to buy chips from Nvidia.

Meta isn’t the only tech company looking to design and build its own AI chips. Legacy chipmaker Intel, which has lagged in catering to industry requirements for AI chips, also announced its new Gaudi chips at an event on Tuesday.

Tesla CEO Elon Musk — who has an abysmal track record for making predictions — is predicting that we will achieve artificial general intelligence (AGI) by 2026.

“If you define AGI as smarter than the smartest human, I think it’s probably next year, within two years,” he told Norway wealth fund CEO Nicolai Tangen during an interview this week, as quoted by Reuters.

The mercurial billionaire also attempted to explain why his own AI venture, xAI, has been falling behind the competition. According to Musk, a shortage of chips was hampering his startup’s efforts to come up with the successor of Grok, a foul-mouthed, dad joke-generating AI chatbot.

Artificial intelligence excels at sorting through information and detecting patterns or trends. But these machine learning algorithms need to be trained with large amounts of data first.

As researchers explore potential applications for AI, they have found scenarios where AI could be really useful—such as analyzing X-ray image data to look for evidence of rare conditions or detecting a rare fish species caught on a commercial fishing boat—but there’s not enough data to accurately train the algorithms.

Jenq-Neng Hwang, a University of Washington professor of electrical and computer engineering, specializes in these issues. For example, Hwang and his team developed a method that teaches AI to monitor how many distinct poses a baby can achieve throughout the day. There are limited training datasets of babies, so the researchers had to create a unique pipeline to make their algorithm accurate and useful.
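One common remedy for small training sets like this is data augmentation: generating additional valid examples from the ones you have. The sketch below is a generic illustration of that idea, not Hwang's actual pipeline; it mirrors 2-D pose keypoints horizontally, doubling a tiny dataset, since a flipped pose is still a physically valid pose.

```python
# Generic small-dataset augmentation sketch (NOT the UW team's pipeline):
# mirror 2-D pose keypoints left-right to double the training set.
import numpy as np


def mirror_poses(poses: np.ndarray, image_width: float) -> np.ndarray:
    """Flip an (N, K, 2) array of keypoints about the vertical image axis.

    poses[..., 0] holds x coordinates, poses[..., 1] holds y coordinates.
    """
    flipped = poses.copy()
    flipped[..., 0] = image_width - flipped[..., 0]
    return flipped


def augment(poses: np.ndarray, image_width: float) -> np.ndarray:
    """Return the original poses plus their mirrored copies."""
    return np.concatenate([poses, mirror_poses(poses, image_width)], axis=0)


poses = np.array([[[10.0, 20.0], [30.0, 40.0]]])  # 1 sample, 2 keypoints
augmented = augment(poses, image_width=100.0)
print(augmented.shape)  # (2, 2, 2)
```

Note that a production pose pipeline would also swap left/right joint labels after mirroring (a flipped left elbow becomes a right elbow) and typically adds rotations, crops, and scale jitter as well.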

Artificial intelligence startup Symbolica AI launched today with an original approach to building generative AI models.

The company is aiming to tackle the high cost of training and deploying large language models based on the Transformer architecture, such as OpenAI’s ChatGPT.

The company also revealed today that it has raised $33 million in combined seed and Series A funding led by Khosla Ventures. Other investors included Day One Ventures, Abstract Ventures, Buckley Ventures, and General Catalyst.