
Training generative AI models is challenging. It requires infrastructure that can move and process data with performance characteristics unheard of outside traditional supercomputing environments. Nobody understands the demands that AI puts on infrastructure better than the service providers that specialize in the space.

Lambda and VAST Data have entered a new strategic partnership that brings the VAST Data Platform to Lambda. The deal follows announcements from CoreWeave and G42 Cloud, both of which unveiled similar relationships with VAST over the past few months, making VAST Data the top choice for dedicated AI service providers.



The company, which describes itself as the data infrastructure company for AI, bagged a $249 million contract in 2022 to provide a range of AI technology to the US Department of Defense.

Traditionally, the United States has been viewed as the dominant global military power, but over the last three decades it has faced competition from a strong rival in the Indo-Pacific region. China has been aggressively carving out its place by modernizing its weapons and forces, denting US dominance in the development of advanced technologies.



Can the US come out on top in the AI arms race?

For the first time ever, researchers at the Surgical Robotics Laboratory of the University of Twente successfully made two microrobots work together to pick up, move and assemble passive objects in 3D environments. This achievement opens new horizons for promising biomedical applications.

Imagine you need surgery somewhere inside your body. However, the part that needs surgery is very difficult for a surgeon to reach. In the future, a couple of robots smaller than a grain of salt might go into your body and perform the surgery. These microrobots could work together to perform all kinds of complex tasks. “It’s almost like magic,” says Franco Piñan Basualdo, corresponding author of the publication.

Researchers from the University of Twente used two of these 1-millimeter magnetic microrobots to perform several operations. Like clockwork, the microrobots were able to pick up, move, and assemble cubes. Unique to this achievement is the 3D environment in which the robots performed their tasks.

“Its energy efficiency is just mind-blowing,” said Damien Querlioz, a nanoelectronics researcher at the University of Paris-Saclay in Palaiseau. “I feel the paper will shake the common thinking in computer architecture.”


NorthPole, a new edge-based processor announced this month by IBM Research, is up to 22 times faster and much more energy efficient than chips currently on the market.

A team from IBM Research has presented NorthPole, a brain-inspired chip architecture that blends computation with memory to process data more efficiently and at low energy cost.
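As a rough conceptual sketch of the compute-in-memory idea behind such brain-inspired designs (a Python illustration of the general principle only, not IBM's NorthPole implementation; the Core class and the four-way tiling are hypothetical), each compute unit keeps its slice of the weights resident in local memory, so a layer's matrix-vector product never has to fetch weights from distant off-chip memory:

```python
import numpy as np

# Conceptual sketch only: weights live next to the compute unit,
# which is the basic idea of blending computation with memory.
class Core:
    """Hypothetical core holding a weight tile in local memory."""
    def __init__(self, weight_tile):
        self.weights = weight_tile  # resident locally, never re-fetched

    def forward(self, x):
        return self.weights @ x  # compute happens where the data lives

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512))
cores = [Core(tile) for tile in np.split(W, 4, axis=0)]  # 4 hypothetical cores

x = rng.standard_normal(512)
y = np.concatenate([core.forward(x) for core in cores])
assert np.allclose(y, W @ x)  # same result as the monolithic product
```

The energy savings in such designs come from locality: in this sketch no core ever touches another core's weights, which is the kind of data movement that dominates power draw in conventional architectures.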

It has become nearly impossible for human researchers to keep track of the overwhelming abundance of scientific publications in the field of artificial intelligence and to stay up to date with advances.

Scientists in an international team led by Mario Krenn from the Max Planck Institute for the Science of Light have now developed an AI algorithm that not only assists researchers in orienting themselves systematically but also predictively guides them toward the direction in which their own research field is likely to evolve. The work was published in Nature Machine Intelligence.

In the field of artificial intelligence (AI) and machine learning (ML), the number of publications is growing exponentially, approximately doubling every 23 months. For human researchers, it is nearly impossible to keep up with progress and maintain a comprehensive overview.
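To put that figure in perspective, here is a quick back-of-the-envelope calculation (our own illustrative arithmetic, not from the paper) of what a 23-month doubling time implies:

```python
# Growth implied by a doubling time of 23 months: N(t) = N0 * 2**(t / 23),
# with t measured in months.
doubling_time_months = 23

annual_growth = 2 ** (12 / doubling_time_months)
print(f"per year:   ~{annual_growth:.2f}x as many papers")  # ~1.44x

decade_growth = 2 ** (120 / doubling_time_months)
print(f"per decade: ~{decade_growth:.0f}x as many papers")  # ~37x
```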

The influence of language on human thinking could be stronger than previously assumed. This is the result of a new study by Professor Friedemann Pulvermüller and his team from the Brain Language Laboratory at Freie Universität Berlin. In this study, the researchers examined the modeling of human concept formation and the impact of language mechanisms on the emergence of concepts. The results were recently published in the journal Progress in Neurobiology under the title “Neurobiological Mechanisms for Language, Symbols, and Concepts: Clues from Brain-Constrained Deep Neural Networks” (accessible online at https://www.sciencedirect.com/science/article/pii/S0301008223001120?via%3Dihub).

Children can learn one or more languages with little effort. However, the cognitive activity involved should not be underestimated. Not only do language learners have to learn how to pronounce words, they must also learn how to interlink word forms with content – with concepts such as “coffee,” “drinking,” or “beauty.” But what are the actual mechanisms at work in the network of billions of nerve cells within our brains? And might the learning of some concepts strictly require the presence of language?

Modern computer models, such as those behind complex and powerful AI applications, push traditional digital computing processes to their limits. New types of computing architecture that emulate the working principles of biological neural networks hold the promise of faster, more energy-efficient data processing.

A team of researchers has now developed a so-called event-based architecture using photonic processors, with which data are transported and processed by means of light. In a similar way to the brain, this makes possible the continuous adaptation of the connections within the neural network; these changeable connections are the basis for learning processes.
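To make the idea of changeable connections concrete, here is a loose software sketch of event-based plasticity (our own Python illustration of the general principle, not the team's photonic hardware; the threshold, learning rate, and Hebbian-style update rule are assumptions): connection weights are updated only when input and output events coincide, so the network adapts while it operates.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 8, 4
weights = rng.uniform(0.0, 1.0, size=(n_out, n_in))

def step(input_events, weights, threshold=2.0, lr=0.05):
    """One event-driven step: propagate input events, then strengthen
    connections between co-active input/output pairs (Hebbian-style)."""
    output_events = (weights @ input_events > threshold).astype(float)
    # Plasticity: only coincident input/output events change the weights.
    weights += lr * np.outer(output_events, input_events)
    return output_events, weights

for _ in range(20):
    spikes = (rng.random(n_in) < 0.3).astype(float)  # sparse input events
    out, weights = step(spikes, weights)
```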

For the purposes of the study, a team working at Collaborative Research Center 1459 (Intelligent Matter), headed by physicists Prof. Wolfram Pernice and Prof. Martin Salinga and computer specialist Prof. Benjamin Risse, all from the University of Münster, joined forces with researchers from the Universities of Exeter and Oxford in the UK. The study has been published in the journal Science Advances.