
With the successful development of the Jiuzhang 3.0 quantum computer prototype, which makes use of 255 detected photons, China continues to hold a world-leading position in the field of quantum computer research and development, lead scientists for the program told the Global Times on Wednesday.

The research team, composed of renowned quantum physicists Pan Jianwei and Lu Chaoyang from the University of Science and Technology of China in collaboration with the Shanghai Institute of Microsystem and Information Technology under the Chinese Academy of Sciences and the National Parallel Computer Engineering Technology Research Center, announced the successful construction of a 255-photon-based prototype quantum computer named Jiuzhang 3.0 early Wednesday morning.

The quantum computing feat accomplished by the team solves Gaussian boson sampling (GBS) problems 10 quadrillion times faster than the world’s fastest supercomputers.
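To put a 10-quadrillion-fold speedup in perspective, a quick back-of-the-envelope calculation helps. The one-microsecond quantum sample time below is a hypothetical illustration, not a figure reported by the Jiuzhang 3.0 team:

```python
# Back-of-the-envelope: what a 10-quadrillion (1e16) speedup means in
# wall-clock time. The quantum sample time is a made-up illustrative value.
SPEEDUP = 10**16                 # "10 quadrillion"
quantum_seconds = 1e-6           # hypothetical time per GBS sample
classical_seconds = quantum_seconds * SPEEDUP
classical_years = classical_seconds / (365 * 24 * 3600)
print(f"{classical_years:.0f} years")  # roughly three centuries
```

Under that assumption, a task the quantum machine finishes in a microsecond would keep a classical machine busy for centuries.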

Absorption spectroscopy is an analytical chemistry tool that can determine if a particular substance is present in a sample by measuring the intensity of the light absorbed as a function of wavelength. Measuring the absorbance of an atom or molecule can provide important information about electronic structure, quantum state, sample concentration, phase changes or composition changes, among other variables, including interaction with other molecules and possible technological applications.
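The core quantity in absorption spectroscopy is absorbance, which relates measured light intensities to sample concentration via the Beer-Lambert law. A minimal sketch, with illustrative numbers rather than real measurements:

```python
import math

# Absorbance from measured intensities: A = log10(I0 / I),
# then Beer-Lambert (A = epsilon * l * c) solved for concentration.
# All numeric values below are illustrative, not real measurements.
def absorbance(incident, transmitted):
    """Absorbance A = log10(I0 / I) at a given wavelength."""
    return math.log10(incident / transmitted)

def concentration(A, molar_absorptivity, path_cm):
    """Solve Beer-Lambert A = epsilon * l * c for concentration c (mol/L)."""
    return A / (molar_absorptivity * path_cm)

A = absorbance(incident=100.0, transmitted=10.0)   # 90% of light absorbed
c = concentration(A, molar_absorptivity=5000.0, path_cm=1.0)
print(A, c)  # A = 1.0, c = 2e-4 mol/L
```

Repeating the absorbance measurement across many wavelengths yields the absorption spectrum described above.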

Molecules with a high probability of simultaneously absorbing two photons of low-energy light have a wide array of applications: in molecular probes, as substrates for data storage in dense three-dimensional structures, or as vectors in medicinal treatments, for example.

Studying the phenomenon by means of direct experimentation is difficult, however, and computer simulation usually complements spectroscopic characterization. Simulation also provides a microscopic view that is hard to obtain in experiments. The problem is that such simulations can require several days of processing by supercomputers, or months by conventional computers.

The supercomputer under construction is 50 times more powerful than the existing supercomputer at the facility.

The world’s most powerful supercomputer, Aurora, is being set up in the US to help scientists at the Argonne National Laboratory (ANL) simulate new nuclear reactors that are more efficient and safer than their predecessors, a press release said.

The US is already home to some of the world’s fastest supercomputers, as measured by TOP500. These supercomputers can be tasked with a variety of computational roles. Last month, Interesting Engineering reported how the Los Alamos National Laboratory (LANL) planned to use a supercomputer to check nuclear stockpiles for the US military.

To make its weather predictions, the European Model analyzes 60 million daily observations from satellite, aircraft, and ground-based reports, using what we know about atmospheric physics to determine what the weather is likely to be across the globe over the next 15 days.

This can literally save lives — if people know in advance that hurricanes or winter storms are heading their way, they can take action to prepare — but because the model is so complex, it must be run on a supercomputer over the course of several hours, which also makes it expensive.

The AIs: AI-based weather forecasting models are starting to catch up with traditional ones, like the European Model.

Year 2018


Computers are shrinking rapidly. You can build a pretty capable little machine powered by a device like the Raspberry Pi, but that’s still huge compared with IBM’s latest machine. The company that started out selling massive mainframe computers has developed the world’s smallest computer. Each one is smaller than a grain of salt, but it packs more computing power than you’d expect.

The micro-computer is a complete system-on-a-chip (SoC) with a processor, memory, storage, and a communication module. The CPU contains several hundred thousand transistors, and IBM says it’s capable of performance on par with an x86 CPU from 1990. That’s not very fast compared with even the slowest modern computers, but it’s impressive for something you can’t see without a magnifying glass. It makes more sense when you look at the impressive developments in other SoC designs. The latest Qualcomm Snapdragon chips are about 1 square centimeter and have more processing power than supercomputers from the early 90s.

The chip is just a prototype right now, but IBM has big plans for this (literally) microscopic computer. It’s touting this as a significant advancement for blockchain technology, but not the same blockchain that’s used to track Bitcoin transactions. A blockchain is merely a distributed ledger that can be used for various purposes. IBM and other companies have been looking for ways to use blockchains without the cryptocurrency attached.
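The "distributed ledger" idea is simpler than it sounds: each block commits to the hash of the previous block, so tampering with any entry invalidates everything after it. A generic toy sketch (not IBM's design, and with made-up entry data):

```python
import hashlib
import json

# Toy blockchain-as-ledger: each block stores the previous block's hash,
# so any tampering breaks the chain of commitments that follows it.
def make_block(prev_hash, data):
    body = json.dumps({"prev": prev_hash, "data": data}, sort_keys=True)
    return {"prev": prev_hash, "data": data,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Check that every block points at the actual hash of its predecessor."""
    return all(later["prev"] == earlier["hash"]
               for earlier, later in zip(chain, chain[1:]))

genesis = make_block("0" * 64, "genesis")
chain = [genesis, make_block(genesis["hash"], "shipment A received")]
print(verify(chain))         # True: links are intact
chain[0]["hash"] = "f" * 64  # tamper with history...
print(verify(chain))         # False: the ledger no longer verifies
```

Real deployments add consensus and replication across many parties, but the hash-chaining above is the part that makes the ledger tamper-evident.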

To build the supercomputer that powers OpenAI’s projects, Microsoft says it linked together thousands of Nvidia graphics processing units (GPUs) on its Azure cloud computing platform. In turn, this allowed OpenAI to train increasingly powerful models and “unlocked the AI capabilities” of tools like ChatGPT and Bing.

Scott Guthrie, Microsoft’s vice president of AI and cloud, said the company spent several hundreds of millions of dollars on the project, according to a statement given to Bloomberg. And while that may seem like a drop in the bucket for Microsoft, which recently extended its multiyear, multibillion-dollar investment in OpenAI, it certainly demonstrates that it’s willing to throw even more money at the AI space.

CAMBRIDGE, Mass. — Researchers at MIT have achieved a significant breakthrough in quantum computing, bringing the potential of these incredible thinking machines closer to realization. Quantum computers promise to handle calculations far too complex for current supercomputers, but many hurdles remain. A primary challenge is addressing computational errors faster than they arise.

In a nutshell, quantum computers find better and quicker ways to solve problems. Scientists believe quantum technology could solve extremely complex problems in seconds, while traditional supercomputers you see today could need months or even years to crack certain codes.

What makes these next-generation supercomputers different from your everyday smartphone or laptop is how they process data. Quantum computers harness the properties of quantum physics to store data and perform their functions. While traditional computers use “bits” (either a 1 or a 0) to encode information, quantum technology uses “qubits.”
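The bit/qubit distinction can be made concrete with a few lines of arithmetic: a qubit's state is a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal single-qubit sketch, no quantum libraries assumed:

```python
import math

# A classical bit is 0 or 1. A qubit's state is two amplitudes (a, b)
# with |a|^2 + |b|^2 = 1; |a|^2 and |b|^2 are the probabilities of
# measuring 0 and 1 respectively.
def hadamard(state):
    """Map a basis state into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)        # the qubit analogue of a classical 0
plus = hadamard(zero)    # superposition: measurement is a 50/50 coin flip
p0, p1 = abs(plus[0]) ** 2, abs(plus[1]) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Real quantum advantage comes from many qubits interfering and entangling at once, which this one-qubit picture only hints at.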

A team led by Northwestern University researchers has developed the first artificial intelligence (AI) capable of designing robots from scratch.

To test the new AI, the researchers gave the system a simple prompt: design a robot that can walk. While it took nature billions of years to evolve the first walking species, the AI compressed the process to lightning speed, designing a successfully walking robot in mere seconds.

But the AI program is not just fast. It also runs on a lightweight personal computer and designs wholly novel structures from scratch. This stands in sharp contrast to other AI systems, which often require energy-hungry supercomputers and colossally large datasets. And even after crunching all that data, those systems are tethered to the constraints of human creativity—only mimicking humans’ past works without an ability to generate new ideas.

The advance brings quantum error correction a step closer to reality.

In the future, quantum computers may be able to solve problems that are far too complex for today’s most powerful supercomputers. To realize this promise, quantum versions of error correction codes must be able to account for computational errors faster than they occur.
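The classical ancestor of these error correction codes is the 3-bit repetition code: store each bit three times and decode by majority vote. Quantum codes are far more subtle (quantum errors are continuous, and measurement disturbs the state), but this toy sketch conveys the goal of correcting errors faster than they accumulate:

```python
from collections import Counter

# Classical analogy for error correction: the 3-bit repetition code.
# One corrupted copy out of three is outvoted by the other two.
def encode(bit):
    return [bit] * 3            # redundancy: three copies of the bit

def flip(codeword, index):
    codeword = list(codeword)
    codeword[index] ^= 1        # inject a single bit-flip error
    return codeword

def decode(codeword):
    return Counter(codeword).most_common(1)[0][0]  # majority vote

word = encode(1)
noisy = flip(word, 0)           # one of the three copies is corrupted
print(decode(noisy))            # still decodes to 1
```

A quantum code must achieve the same redundancy without ever reading the encoded data directly, which is one reason building robust quantum error correction is so hard.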

However, today’s quantum computers are not yet robust enough to realize such error correction at commercially relevant scales.