
Though Meta didn’t give numbers on RSC’s current top speed, in terms of raw processing power it appears comparable to the Perlmutter supercomputer, ranked fifth fastest in the world. At the moment, RSC runs on 6,800 NVIDIA A100 graphics processing units (GPUs), a specialized chip once limited to gaming but now used more widely, especially in AI. Already, the machine is processing computer vision workflows 20 times faster and large language models (like GPT-3) 3 times faster than the company’s previous infrastructure. The more quickly a company can train models, the more models it can complete and refine in any given year.

In addition to pure speed, RSC will give Meta the ability to train algorithms on its massive hoard of user data. In a blog post, the company said it previously trained AI on public, open-source datasets, but RSC will use real-world, user-generated data from Meta’s production servers. This detail may make more than a few people blanch, given the numerous privacy and security controversies Meta has faced in recent years. In the post, the company took pains to note the data will be carefully anonymized and encrypted end-to-end. And, it said, RSC won’t have any direct connection to the larger internet.

To accommodate Meta’s enormous training data sets and further increase training speed, the installation will grow to include 16,000 GPUs and an exabyte of storage—equivalent to 36,000 years of high-quality video—later this year. Once complete, Meta says RSC will serve training data at 16 terabytes per second and operate at a top speed of 5 exaflops.
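The storage equivalence can be sanity-checked with quick arithmetic. Assuming “high-quality video” means roughly 7 Mbit/s (an assumed bitrate; Meta’s post doesn’t specify one), an exabyte does indeed hold on the order of 36,000 years of footage:

```python
# Rough sanity check of the "1 exabyte ~ 36,000 years of video" claim.
# The 7 Mbit/s bitrate for "high-quality video" is an assumption, not Meta's figure.
EXABYTE_BYTES = 1e18
BITRATE_BPS = 7e6                       # assumed bitrate, in bits per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600

seconds_of_video = EXABYTE_BYTES * 8 / BITRATE_BPS
years_of_video = seconds_of_video / SECONDS_PER_YEAR
print(f"{years_of_video:,.0f} years")   # about 36,200 years at the assumed bitrate
```

At higher bitrates the figure shrinks proportionally, so the claim is consistent with a bitrate in the single-digit-megabit range.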

Quantum researchers at the University of Bristol have dramatically reduced the time to simulate an optical quantum computer, with a speedup of around one billion over previous approaches.

Quantum computers promise exponential speedups for certain problems, with potential applications in areas from drug discovery to new materials for batteries. But the field is still in its early stages, so these are long-term goals. Nevertheless, there are exciting intermediate milestones on the journey to building a useful device. One currently receiving a lot of attention is “quantum advantage”, where a quantum computer performs a task beyond the capabilities of even the world’s most powerful supercomputers.

Experimental work from the University of Science and Technology of China (USTC) was the first to claim quantum advantage using photons (particles of light) in a protocol called “Gaussian Boson Sampling” (GBS). Their paper claimed that the experiment, performed in 200 seconds, would take 600 million years to simulate on the world’s largest supercomputer.
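Simple arithmetic shows what a billion-fold speedup does to that estimate (a back-of-envelope figure for perspective; the Bristol team’s actual benchmark numbers differ in detail):

```python
# Back-of-envelope: applying a ~1e9 speedup to USTC's 600-million-year estimate.
classical_years = 600e6     # USTC's estimate for simulating their experiment classically
speedup = 1e9               # reported improvement factor from the Bristol work

new_days = classical_years * 365.25 / speedup
print(f"{new_days:.0f} days")   # about 219 days: months instead of geological time
```

In other words, a billion-fold speedup moves the simulation from hopeless to merely expensive, which is exactly what makes quantum-advantage claims hard to pin down.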

Could it really happen?


Looks like Meta is swinging for the cheap seats.

The social media superpower Meta (formerly Facebook) has announced that it has built an “AI supercomputer” — an unconscionably fast computer designed to train and enhance machine-learning systems, according to a Monday post from Meta CEO Mark Zuckerberg.

“Meta has developed what we believe is the world’s fastest AI supercomputer,” said Zuckerberg in his post. “We’re calling it RSC for AI Research SuperCluster and it’ll be complete later this year.”

Meta says it wants to build the most powerful artificial intelligence supercomputer in the world.

The Facebook owner has already designed and built what it calls the AI Research SuperCluster, or RSC, which it says is among the fastest AI supercomputers in the world.

It hopes to top that league by mid-2022, it said, in what would be a major step towards increasing its artificial intelligence capabilities.

Meta has completed the first phase of a new AI supercomputer. Once the AI Research SuperCluster (RSC) is fully built out later this year, the company believes it will be the fastest AI supercomputer on the planet, capable of “performing at nearly 5 exaflops of mixed precision compute.”
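The 5-exaflop figure is consistent with the planned GPU count: an NVIDIA A100 delivers about 312 teraflops of dense FP16/BF16 mixed-precision tensor-core compute, so 16,000 of them peak at roughly 5 exaflops. This is a peak-throughput estimate, not a measured benchmark:

```python
# Peak mixed-precision estimate for the full 16,000-GPU build-out.
A100_MIXED_FLOPS = 312e12   # NVIDIA's dense FP16/BF16 tensor-core figure, FLOPS
GPU_COUNT = 16_000

peak_flops = A100_MIXED_FLOPS * GPU_COUNT
print(f"{peak_flops / 1e18:.2f} exaflops")   # 4.99 exaflops
```

Real training workloads achieve only a fraction of this peak, which is presumably why Meta hedges with “nearly 5 exaflops.”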

The company says RSC will help researchers develop better AI models that can learn from trillions of examples. Among other things, the models will be able to build better augmented reality tools and “seamlessly analyze text, images and video together,” according to Meta. Much of this work is in service of its vision for the metaverse, in which it says AI-powered apps and products will have a key role.

“We hope RSC will help us build entirely new AI systems that can, for example, power real-time voice translations to large groups of people, each speaking a different language, so they can seamlessly collaborate on a research project or play an AR game together,” technical program manager Kevin Lee and software engineer Shubho Sengupta wrote.

What’s next? Human brain-scale AI.

Funded by the Slovakian government using funds allocated by the EU, the I4DI consortium is behind the initiative to build a 64 AI exaflop machine (that’s 64 billion billion AI operations per second) on our platform by the end of 2022. This will enable Slovakia and the EU to deliver, for the first time in the history of humanity, a human brain-scale AI supercomputer. Meanwhile, almost a dozen other countries are watching this project closely, with interest in replicating this supercomputer in their own countries.

There are multiple approaches to achieving human brain-like AI. These include machine learning, spiking neural networks like SpiNNaker, neuromorphic computing, bio AI, explainable AI, and general AI. Supporting several of these approaches at once requires universal supercomputers with universal processors, which is what delivering human brain-scale AI will demand.

Official launch marks a milestone in the development of quantum computing in Europe.

A quantum annealer with more than 5,000 qubits has been put into operation at Forschungszentrum Jülich. The Jülich Supercomputing Centre (JSC) and D-Wave Systems, a leading provider of quantum computing systems, today launched the company’s first cloud-based quantum service outside North America. The new system is located at Jülich and will work closely with the supercomputers at JSC in the future. The annealing quantum computer is part of the Jülich UNified Infrastructure for Quantum computing (JUNIQ), which was established in autumn 2019 to provide researchers in Germany and Europe with access to various quantum systems.
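An annealer like the D-Wave system at Jülich minimizes quadratic functions of binary variables, known as QUBO problems. The following is an illustrative sketch only, not D-Wave’s API: a tiny brute-force solver for the problem class that the 5,000-qubit machine samples at scale.

```python
from itertools import product

def solve_qubo(Q, n):
    """Brute-force the minimum of sum Q[i,j]*x[i]*x[j] over x in {0,1}^n.
    Feasible only for tiny n; an annealer samples low-energy states of
    instances with thousands of variables instead of enumerating them."""
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        energy = sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())
        if energy < best_e:
            best_x, best_e = x, energy
    return best_x, best_e

# Toy instance: the coupling term -2*x0*x1 makes agreement (both 0 or both 1)
# the lowest-energy configuration.
Q = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}
print(solve_qubo(Q, 2))   # ((0, 0), 0.0)
```

Real users would express such problems through D-Wave’s cloud service rather than solving them locally; the point here is only what kind of optimization an annealing quantum computer performs.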

Light-matter interactions form the basis of many important technologies, including lasers, light-emitting diodes (LEDs), and atomic clocks. However, conventional computational approaches for modeling such interactions are limited in the systems and scales they can handle. Now, researchers from Japan have developed a technique that overcomes these limitations.

In a study published this month in The International Journal of High Performance Computing Applications, a research team led by the University of Tsukuba describes a highly efficient method for simulating light-matter interactions at the atomic scale.

What makes these interactions so difficult to simulate? One reason is that phenomena associated with the interactions encompass many areas of physics, involving both the propagation of light waves and the dynamics of electrons and ions in matter. Another reason is that such phenomena can cover a wide range of length and time scales.
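The gap between these scales can be made concrete with generic textbook magnitudes (assumed for illustration; these numbers are not from the Tsukuba paper). Visible light has a wavelength of roughly 500 nm while interatomic spacings are around 0.2 nm, and an optical cycle lasts under 2 femtoseconds while ionic motion unfolds over picoseconds:

```python
# Order-of-magnitude scale separation in light-matter simulations.
# Generic textbook values, assumed for illustration.
C = 2.998e8                  # speed of light, m/s
wavelength = 500e-9          # visible light wavelength, m
atomic_spacing = 0.2e-9      # typical interatomic distance, m
optical_period = wavelength / C   # ~1.7e-15 s (one optical cycle)
ionic_timescale = 1e-12           # ~1 ps, typical lattice/ionic motion

print(f"length ratio: {wavelength / atomic_spacing:.0f}x")       # 2500x
print(f"time ratio:   {ionic_timescale / optical_period:.0f}x")  # ~600x
```

A simulation resolving atoms while propagating the light wave must therefore span three to four orders of magnitude in both space and time, which is what makes a single unified method so costly.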

Chromium defects in silicon carbide may provide a new platform for quantum information.

Quantum computers may be able to solve science problems that are impossible for today’s fastest conventional supercomputers. Quantum sensors may be able to measure signals that cannot be measured by today’s most sensitive sensors. Quantum bits (qubits) are the building blocks for these devices. Scientists are investigating several quantum systems for quantum computing and sensing applications. One of these, spin qubits, is based on controlling the orientation of an electron’s spin at defect sites in semiconductor materials. Defects can include small amounts of elements that differ from the semiconductor’s main material. Researchers recently demonstrated how to make high-quality spin qubits based on chromium defects in silicon carbide.