An analog-digital approach to quantum simulation could lay the foundations for the next generation of supercomputers to finally outpace their classical predecessors.
First Distributed Quantum Computer
In a major step toward making quantum computing practical on a large scale, scientists at Oxford University Physics have successfully demonstrated distributed quantum computing for the first time. By connecting two separate quantum processors using a photonic network interface, they effectively created a single, fully integrated quantum computer. This breakthrough opens the door to solving complex problems that were previously impossible to tackle. Their findings were published today (February 5) in Nature.
The result overcomes a major challenge, scalability, by allowing small quantum devices to work together rather than trying to cram millions of qubits into a single machine. Using photonic links, the team achieved quantum teleportation of logical gates across modules, essentially “wiring” them together. This distributed approach mirrors how supercomputers function, offering a flexible and upgradeable system.
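The excerpt does not detail the protocol, but the textbook way to teleport a two-qubit gate between modules consumes one shared Bell pair and two classical bits. Below is a minimal NumPy sketch of that standard nonlocal-CNOT construction; the qubit labels and helper functions are illustrative assumptions, not a description of the Oxford team's photonic hardware.

```python
# Sketch of the textbook "nonlocal CNOT": one shared Bell pair plus two classical
# bits let a control qubit in module A drive a CNOT on a target qubit in module B.
# Plain NumPy statevector simulation, illustrative only.
import numpy as np

rng = np.random.default_rng(7)
N = 4  # qubit 0: control (module A), 1: Bell half in A, 2: Bell half in B, 3: target (module B)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def apply_1q(state, U, q):
    """Apply a 2x2 unitary U to qubit q (qubit 0 = most significant bit)."""
    psi = np.moveaxis(state.reshape([2] * N), q, 0)
    psi = np.tensordot(U, psi, axes=(1, 0))
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cnot(state, c, t):
    """Apply a CNOT with control qubit c and target qubit t."""
    new = state.copy()
    cmask, tmask = 1 << (N - 1 - c), 1 << (N - 1 - t)
    for i in range(state.size):
        if i & cmask:
            new[i ^ tmask] = state[i]
    return new

def measure(state, q):
    """Projective Z-basis measurement of qubit q; returns (outcome, collapsed state)."""
    mask = 1 << (N - 1 - q)
    p1 = sum(abs(a) ** 2 for i, a in enumerate(state) if i & mask)
    outcome = int(rng.random() < p1)
    keep = np.array([bool(i & mask) == bool(outcome) for i in range(state.size)])
    collapsed = np.where(keep, state, 0)
    return outcome, collapsed / np.linalg.norm(collapsed)

# Arbitrary single-qubit states for the control (module A) and target (module B).
ctrl = rng.normal(size=2) + 1j * rng.normal(size=2); ctrl /= np.linalg.norm(ctrl)
targ = rng.normal(size=2) + 1j * rng.normal(size=2); targ /= np.linalg.norm(targ)
zero = np.array([1.0, 0.0])
state = np.kron(np.kron(np.kron(ctrl, zero), zero), targ)

# Photonic-link stand-in: a Bell pair shared between the two modules (qubits 1 and 2).
state = apply_1q(state, H, 1)
state = apply_cnot(state, 1, 2)

# Module A: entangle its control with its Bell half, measure it, send the bit over.
state = apply_cnot(state, 0, 1)
m1, state = measure(state, 1)

# Module B: correct, apply the local CNOT onto the target, then measure its Bell half in the X basis.
if m1:
    state = apply_1q(state, X, 2)
state = apply_cnot(state, 2, 3)
state = apply_1q(state, H, 2)
m2, state = measure(state, 2)

# Module A: final phase correction conditioned on module B's classical bit.
if m2:
    state = apply_1q(state, Z, 0)

# Check: the control/target pair now carries exactly what a direct CNOT would give.
CNOT2 = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
expected = CNOT2 @ np.kron(ctrl, targ)
got = np.array([state[(c << 3) | (m1 << 2) | (m2 << 1) | t] for c in (0, 1) for t in (0, 1)])
print("teleported CNOT matches local CNOT:", np.allclose(got, expected))
```

After the two measurement outcomes are exchanged and the local corrections applied, the control and target qubits carry exactly the state a direct CNOT would have produced, even though the gate spanned two modules.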
OpenAI on Thursday said the U.S. National Laboratories will be using its latest artificial intelligence models for scientific research and nuclear weapons security.
Under the agreement, up to 15,000 scientists working at the National Laboratories may be able to access OpenAI’s reasoning-focused o1 series. OpenAI will also work with Microsoft, its lead investor, to deploy one of its models on Venado, the supercomputer at Los Alamos National Laboratory, according to a release. Venado is powered by technology from Nvidia and Hewlett Packard Enterprise.
The field of quantum computing is advancing relentlessly: with performance far beyond that of conventional PCs, the high-tech computers of the future will solve highly complex problems that have so far defeated even the largest supercomputers. And indeed, Chinese researchers have now made another breakthrough in the world of qubits: with the Zuchongzhi 3.0, they have presented a quantum computer that rivals even Google’s Willow! But what can the new machine do? How does a quantum computer actually work? And above all, how will these high-performance computers change our everyday lives?
“The projects running on Aurora represent some of the most ambitious and innovative science happening today,” said Katherine Riley, ALCF director of science. “From modeling extremely complex physical systems to processing huge amounts of data, Aurora will accelerate discoveries that deepen our understanding of the world around us.”
On the hardware side, Aurora clearly impresses. The supercomputer comprises 166 racks, each holding 64 blades, for a total of 10,624 blades. Each blade contains two Xeon Max processors, each with 64 GB of HBM2E memory onboard, and six Intel Data Center GPU Max ‘Ponte Vecchio’ GPUs, all cooled by a specialized liquid-cooling system.
In total, Aurora has 21,248 CPUs with over 1.1 million high-performance x86 cores, 10.9 PB of DDR5 memory, and 1.36 PB of HBM2E memory attached to the CPUs. It also features 63,744 GPUs optimized for AI and HPC, equipped with 8.16 PB of HBM2E memory. Aurora uses 1,024 nodes with solid-state drives for storage, offering 220 PB of total capacity and 31 TB/s of bandwidth. The machine relies on HPE’s Shasta supercomputer architecture with Slingshot interconnects.
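Those headline totals follow directly from the per-blade configuration; here is a quick consistency check in Python using only the figures quoted above (the per-GPU HBM capacity is inferred from the totals rather than stated in the excerpt).

```python
# Consistency check of Aurora's published configuration figures quoted above.
racks, blades_per_rack = 166, 64
cpus_per_blade, gpus_per_blade = 2, 6
hbm_per_cpu_gb = 64

blades = racks * blades_per_rack            # 10,624 blades
cpus = blades * cpus_per_blade              # 21,248 Xeon Max CPUs
gpus = blades * gpus_per_blade              # 63,744 Ponte Vecchio GPUs
cpu_hbm_pb = cpus * hbm_per_cpu_gb / 1e6    # ~1.36 PB of HBM2E on the CPUs
gpu_hbm_per_gpu_gb = 8.16e6 / gpus          # ~128 GB of HBM2E per GPU (inferred)

print(f"{blades=} {cpus=} {gpus=}")
print(f"CPU HBM total: {cpu_hbm_pb:.2f} PB, GPU HBM per card: {gpu_hbm_per_gpu_gb:.0f} GB")
```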
DNA computing uses programmable arrays to redefine diagnostics, offering scalable and efficient disease detection at the molecular level.
Scientists have created over a million simulated cosmic images using the power of supercomputers to anticipate the capabilities of NASA
NASA, the National Aeronautics and Space Administration, is the United States government agency responsible for the nation’s civilian space program and for aeronautics and aerospace research. Established in 1958 by the National Aeronautics and Space Act, NASA has led the U.S. in space exploration efforts, including the Apollo moon-landing missions, the Skylab space station, and the Space Shuttle program.
Spacecraft powered by electric propulsion could soon be better protected against their own exhaust, thanks to new supercomputer simulations.
Electric propulsion is a more efficient alternative to traditional chemical rockets, and it’s being increasingly used on space missions, starting off with prototypes on NASA’s Deep Space 1 and the European Space Agency’s SMART-1 in 1998 and 2003, respectively, and subsequently finding use on flagship science missions such as NASA’s Dawn and Psyche missions to the asteroid belt. There are even plans to use electric propulsion on NASA’s Lunar Gateway space station.
El Capitan can reach a peak performance of 2.746 exaFLOPS, making it the National Nuclear Security Administration’s first exascale supercomputer. It’s the world’s third exascale machine, after the Frontier supercomputer at Oak Ridge National Laboratory in Tennessee and the Aurora supercomputer at the Argonne Leadership Computing Facility in Illinois.
The world’s fastest supercomputer is powered by more than 11 million CPU and GPU cores integrated into 43,000+ AMD Instinct MI300A accelerators. Each MI300A APU comprises an EPYC Genoa 24-core CPU clocked at 1.8 GHz and a CDNA 3 GPU integrated onto a single organic package, along with 128 GB of HBM3 memory.
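For a rough sense of how those core counts break down, here is a back-of-the-envelope calculation from the rounded figures quoted above; actual counts differ slightly.

```python
# Back-of-the-envelope numbers implied by the rounded El Capitan figures above.
apus = 43_000
cpu_cores = apus * 24                  # ~1.03 million Zen 4 CPU cores
gpu_cores = 11_000_000 - cpu_cores     # remainder of the ~11 million total cores
hbm_pb = apus * 128 / 1e6              # ~5.5 PB of HBM3 shared by CPU and GPU

print(f"{cpu_cores=:,} {gpu_cores=:,} HBM3 ~ {hbm_pb:.1f} PB")
```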
As the capabilities of generative AI models have grown, you’ve probably seen how they can transform simple text prompts into hyperrealistic images and even extended video clips.
More recently, generative AI has shown potential in helping chemists and biologists explore static molecules, like proteins and DNA. Models like AlphaFold can predict molecular structures to accelerate drug discovery, and the MIT-assisted “RFdiffusion,” for example, can help design new proteins.
One challenge, though, is that molecules are constantly moving and jiggling, which is important to model when constructing new proteins and drugs. Simulating these motions on a computer using physics—a technique known as molecular dynamics—can be very expensive, requiring billions of time steps on supercomputers.
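To see why this is so costly: a molecular dynamics run advances every atom by a femtosecond-scale timestep, so reaching biologically relevant microseconds takes on the order of a billion force evaluations. A toy velocity-Verlet integrator for a small Lennard-Jones system, purely illustrative and not any production MD code, looks like this:

```python
# Toy molecular dynamics: velocity-Verlet integration of a few Lennard-Jones
# particles. Production MD codes add neighbor lists, thermostats, constraints,
# and run billions of such steps on GPUs and supercomputers.
import numpy as np

n, dim = 16, 3
dt = 2e-3               # timestep in reduced LJ units (femtosecond scale for real atoms)
steps = 1_000           # real studies need ~1e9 steps to reach microseconds

# Place particles on a simple cubic lattice to avoid initial overlaps; start at rest.
side = int(np.ceil(n ** (1 / 3)))
grid = np.array([(i, j, k) for i in range(side) for j in range(side) for k in range(side)], dtype=float)
pos = 1.2 * grid[:n]
vel = np.zeros((n, dim))

def lj_forces(pos):
    """Pairwise Lennard-Jones forces (epsilon = sigma = 1, no cutoff or periodic box)."""
    disp = pos[:, None, :] - pos[None, :, :]      # r_i - r_j for all pairs
    r2 = (disp ** 2).sum(-1)
    np.fill_diagonal(r2, np.inf)                  # no self-interaction
    inv6 = r2 ** -3
    fmag = 24.0 * (2.0 * inv6 ** 2 - inv6) / r2   # magnitude term of -dU/dr along disp
    return (fmag[:, :, None] * disp).sum(axis=1)

forces = lj_forces(pos)
for _ in range(steps):
    vel += 0.5 * dt * forces      # half kick (unit mass)
    pos += dt * vel               # drift
    forces = lj_forces(pos)       # recompute forces at the new positions
    vel += 0.5 * dt * forces      # second half kick

print("kinetic energy after", steps, "steps:", 0.5 * (vel ** 2).sum())
```

Every step requires recomputing all pairwise forces, which is why the generative models described above are attractive: they aim to shortcut much of this step-by-step physics.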