
A team of NUS researchers led by Associate Professor Lu Jiong from the Department of Chemistry and the Institute for Functional Intelligent Materials, together with their international collaborators, has developed a novel concept of a chemist-intuited atomic robotic probe (CARP).

This innovation, which uses artificial intelligence (AI) to mimic the decision-making process of chemists, enables the manufacturing of quantum materials with unrivaled intelligence and precision for future quantum technology applications such as data storage and quantum computing.

Open-shell magnetic nanographene is a type of carbon-based quantum material that possesses key electronic and magnetic properties that are important for developing extremely fast electronic devices at the molecular level, or for creating quantum bits, the building blocks of quantum computers. The processes used to develop such materials have progressed over the years due to the discovery of a new type of solid-phase chemical reaction known as on-surface synthesis.

Moore continued: “We’ve certainly had more opportunities to target in the last 60 to 90 days,” adding that the US is currently looking for “an awful lot” of rocket launchers in the region.

Moore’s comments provide some of the strongest evidence to date that the US military is using AI targeting systems to identify potential strike areas. She noted that even after Google walked away from the project, experimentation with drone and satellite imagery has continued.

Based at Central Command (Centcom) headquarters in Tampa, Florida, Moore revealed that US forces in the Middle East have been testing AI targeting systems that combine satellite imagery with other data sources, and have conducted exercises with the technology over the past year.

“There’s a connection between the shape of the ice shell and the temperature in the ocean,” said Dr. Britney Schmidt. “This is a new way to get more insight from ice shell measurements that we hope to be able to get for Europa and other worlds.”


While Earth remains the only known world with bodies of liquid water on its surface, myriad worlds within our own solar system have liquid water oceans beneath thick surfaces of ice. But what is the temperature of those interior oceans, and could the thickness of their ice shells reveal it? This is what a recent study published in the Journal of Geophysical Research: Planets hopes to address, as a team of researchers led by Cornell University investigated how a process called “ice pumping”, a form of ice-ocean interaction, could constrain the temperature of the interior ocean beneath a thick icy shell. This study holds the potential to help researchers better understand the conditions for finding life beyond Earth, with a focus on Jupiter’s moon, Europa, and Saturn’s moon, Enceladus.

“If we can measure the thickness variation across these ice shells, then we’re able to get temperature constraints on the oceans, which there’s really no other way yet to do without drilling into them,” said Dr. Britney Schmidt, who is an Associate Professor of Astronomy & Earth and Atmospheric Sciences at Cornell University and a co-author on the study. “This gives us another tool for trying to figure out how these oceans work. And the big question is, are things living there, or could they?”

For the study, the researchers used robotic observations obtained at Antarctica’s Ross Ice Shelf, along with computer models, to analyze how “ice pumping”, a circulation process that occurs in the water beneath an ice shell and is driven by the slope of its base, could help regulate ocean temperature once pressure and salt content are accounted for. The goal was to ascertain the potential behavior of ice-ocean interaction on Jupiter’s moon, Europa, and Saturn’s moon, Enceladus, both of which possess interior oceans and are targets for astrobiologists searching for life beyond Earth.
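The physics behind this is that the freezing point of seawater falls with both pressure (depth) and salinity, so water in contact with ice at a deep point of the shell base is colder than the in-situ freezing point higher up the slope. Below is a minimal sketch of that effect using a commonly used linearized freezing-point relation from ice-shelf ocean modelling; the coefficients are approximate and the code is purely illustrative, not taken from the study.

```python
# Minimal sketch of the freezing-point physics behind "ice pumping".
# Uses an approximate linearized liquidus relation for seawater (depth in
# metres, salinity in practical salinity units); the coefficients are rounded
# values commonly used in ice-shelf ocean models, not figures from the study.

def freezing_point(salinity_psu: float, depth_m: float) -> float:
    """In-situ freezing point of seawater (deg C) at a given salinity and depth."""
    return -0.0573 * salinity_psu + 0.0832 - 7.61e-4 * depth_m

salinity = 34.5                 # assumed ocean salinity (psu)
deep, shallow = 800.0, 200.0    # two depths (m) along a sloping ice-shell base

t_deep = freezing_point(salinity, deep)        # lower: pressure depresses freezing
t_shallow = freezing_point(salinity, shallow)  # higher: less pressure overhead

# Water equilibrated with ice at depth is colder than the local freezing point
# further up the slope, so as it rises it becomes supercooled and refreezes onto
# the shallower ice: melting at depth, accretion up-slope. This "ice pump" keeps
# the ocean near the local freezing point, which is why shell-thickness (depth)
# variations can constrain ocean temperature.
print(f"freezing point at {deep:.0f} m:   {t_deep:.2f} C")
print(f"freezing point at {shallow:.0f} m: {t_shallow:.2f} C")
print(f"supercooling of rising water:     {t_shallow - t_deep:.2f} C")
```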

Some of Apple’s biggest investors are set to pressure the company tomorrow to reveal its use of artificial intelligence tools (via the Financial Times).

Apple’s annual shareholder meeting takes place tomorrow, allowing those with a major stake in the company to put forward proposals. One resolution proposed by the American Federation of Labor and Congress of Industrial Organizations (AFL-CIO) asks Apple to disclose its use of AI and any ethical guidelines that the company has adopted regarding the technology.

Researchers at EPFL have developed a new, uniquely modular machine learning model for flexible decision-making. It can take any combination of text, video, image, sound, and time-series inputs and then output any number, or combination, of predictions.

We’ve all heard of large language models, or LLMs: massive-scale models trained on huge amounts of text that form the basis for chatbots like OpenAI’s ChatGPT. Next-generation multimodal models (MMs) can learn from inputs beyond text, including video, images, and sound.

Creating multimodal models at a smaller scale poses significant challenges, including the problem of being robust to non-random missing information. This is information that a model doesn’t have, often because of biased availability in the underlying resources. It is thus critical to ensure the model does not learn the patterns of biased missingness when making its predictions.
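One common way to avoid that, shown in the sketch below, is to randomize which modalities the model sees during training and to fuse only the modalities that are actually present, so the pattern of missingness carries no predictive signal. This is a generic illustration under those assumptions; the function and variable names are invented here, and it does not reproduce the EPFL model’s actual mechanism.

```python
# Generic illustration (not the EPFL model's mechanism): random modality dropout
# plus presence-agnostic fusion, so predictions cannot piggyback on *which*
# modalities happen to be missing.
import numpy as np

rng = np.random.default_rng(0)

def modality_dropout(encoded: dict, p: float = 0.3) -> dict:
    """Training-time augmentation: hide each available modality with probability
    p so that missingness seen during training is random rather than biased.
    Always keeps at least one modality so there is something left to fuse."""
    names = [n for n, v in encoded.items() if v is not None]
    keep = rng.choice(names)  # guaranteed survivor
    return {
        n: v if v is not None and (n == keep or rng.random() >= p) else None
        for n, v in encoded.items()
    }

def fuse(encoded: dict) -> np.ndarray:
    """Average the embeddings of whichever modalities are present.

    Pooling only over present modalities (rather than feeding a presence mask
    to the predictor) keeps the output from depending on availability patterns."""
    present = [v for v in encoded.values() if v is not None]
    return np.mean(present, axis=0)

# Toy example: three modality embeddings, one already missing (e.g. no audio).
sample = {"text": rng.normal(size=8), "image": rng.normal(size=8), "audio": None}
fused = fuse(modality_dropout(sample))
print(fused.shape)  # (8,) -- same downstream shape no matter what was missing
```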

Elon Musk claims OpenAI is using GPT-4 to ‘maximize profits’ instead of ‘for the benefit of humanity.’


The lawsuit claims that the GPT-4 model OpenAI released in March 2023 isn’t just capable of reasoning but is also actually “better at reasoning than average humans,” having scored in the 90th percentile on the Uniform Bar Examination for lawyers. The company is rumored to be developing a more advanced model, known as “Q Star,” that has a stronger claim to being true artificial general intelligence (AGI).

Altman was fired (and subsequently rehired five days later) by OpenAI in 2023 over vague claims that his communication with the board was “hindering its ability to exercise its responsibilities.” The lawsuit filed by Musk alleges that in the days following this event, Altman, Brockman, and Microsoft “exploited Microsoft’s significant leverage over OpenAI” to replace board members with handpicked alternatives more acceptable to Microsoft.

“The new Board members lack substantial AI expertise and, on information and belief, are ill equipped by design to make an independent determination of whether and when OpenAI has attained AGI — and hence when it has developed an algorithm that is outside the scope of Microsoft’s license,” claims the lawsuit. The partnership between OpenAI and Microsoft is currently being examined by regulators in the UK, EU, and US to assess if their shared relationship impacts competition.

Emphasizing its AI-first strategy and the ability of Google Cloud databases to support GenAI applications, Google announced new developments in the integration of generative AI with its databases.


AWS offers a broad range of services for vector database requirements, including Amazon OpenSearch Service, Amazon Aurora PostgreSQL-Compatible Edition, Amazon RDS for PostgreSQL, Amazon Neptune ML, and Amazon MemoryDB for Redis. AWS emphasizes the operationalization of embedding models, making application development more productive through capabilities such as data management, fault tolerance, and critical security features. AWS’s strategy focuses on simplifying the scaling and operationalization of AI-powered applications, providing developers with the tools to innovate and create unique experiences powered by vector search.

Azure takes a similar approach by offering vector database extensions to existing databases. This strategy aims to avoid the extra cost and complexity of moving data to a separate database, keeping vector embeddings and original data together for better data consistency, scale, and performance. Azure Cosmos DB and Azure Database for PostgreSQL are positioned as services that support these vector database extensions. Azure’s approach emphasizes integrating vector search capabilities directly alongside other application data, providing a seamless experience for developers.
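In practice, the PostgreSQL-based offerings across these clouds typically expose this capability through the open-source pgvector extension, which stores embeddings in an ordinary column next to the source rows. The sketch below shows that pattern; it assumes a PostgreSQL instance with pgvector available, and the connection string, table, and data are placeholders.

```python
# Minimal sketch of the "vector extension" pattern: embeddings live in the same
# table as the application data, via the open-source pgvector extension.
# Connection details, table name, and vectors are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=app password=secret host=localhost")
cur = conn.cursor()

# Enable pgvector and keep each embedding next to the text it was computed from.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute("""
    CREATE TABLE IF NOT EXISTS documents (
        id        bigserial PRIMARY KEY,
        body      text,
        embedding vector(3)   -- real applications use hundreds of dimensions
    );
""")
cur.execute(
    "INSERT INTO documents (body, embedding) VALUES (%s, %s), (%s, %s);",
    ("first doc", "[0.1, 0.2, 0.3]", "second doc", "[0.9, 0.1, 0.4]"),
)

# Nearest-neighbour search with pgvector's L2 distance operator (<->); the query
# vector would normally come from the same embedding model as the stored rows.
cur.execute(
    "SELECT body FROM documents ORDER BY embedding <-> %s LIMIT 1;",
    ("[0.1, 0.2, 0.25]",),
)
print(cur.fetchone()[0])
conn.commit()
cur.close()
conn.close()
```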

Google’s move towards native support for vector storage in existing databases simplifies building enterprise GenAI applications relying on data stored in the cloud. The integration with LangChain is a smart move, enabling developers to instantly take advantage of the new capabilities.

Punyo is a soft robot designed to improve whole-body manipulation research by employing its arms and chest.

Toyota seems to be diving into more than just cars.


Toyota developed Punyo, a soft robot designed to amplify people’s capabilities rather than replace them, helping to lift heavy objects or move furniture.