NVIDIA’s playing a bigger role in high performance computing than ever, just as supercomputing itself has become central to meeting the biggest challenges of our time.

Speaking just hours ahead of the start of the annual SC18 supercomputing conference in Dallas, NVIDIA CEO Jensen Huang told 700 researchers, lab directors and execs about the forces driving the company to push both into “scale-up” computing — focused on large supercomputing systems — and into “scale-out” efforts that let researchers, data scientists and developers harness the power of however many GPUs they need.

“The HPC industry is fundamentally changing,” Huang told the crowd. “It started out in scientific computing, and the architecture was largely scale up. Its purpose in life was to simulate from first principles the laws of physics. In the future, we will continue to do that, but we have a new tool — this tool is called machine learning.”

Read more

SpiNNaker was built at The University of Manchester under the leadership of Professor Steve Furber, a principal designer of two products that earned the Queen’s Award for Technology: the ARM 32-bit RISC microprocessor and the BBC Microcomputer.

“The ultimate objective for the project has always been a million cores in a single computer for real time brain modelling applications, and we have now achieved it, which is fantastic.” — Professor Steve Furber, The University of Manchester

Inspired by the human brain, SpiNNaker is capable of sending billions of small amounts of information simultaneously. The machine has a staggering 1 million processor cores, which together can perform over 200 million million (200 trillion) actions per second.
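SpiNNaker is programmed through its own neuromorphic software stack, but the flavour of computation it targets is easy to sketch. Below is a minimal leaky integrate-and-fire simulation in plain Python/NumPy; every number in it is illustrative rather than a SpiNNaker specification, and it only hints at the event-driven, spike-passing style the machine runs at brain-like, real-time scale.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) sketch: each neuron integrates
# its input, leaks toward rest, and emits a spike (a tiny packet of
# information) when its membrane potential crosses threshold -- the
# event-driven style of computation SpiNNaker is designed to run in
# real time. All parameters here are illustrative only.

rng = np.random.default_rng(0)

n_neurons = 1000                      # toy network size
dt, tau = 1.0, 20.0                   # time step and membrane constant (ms)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

# Sparse random excitatory connectivity
weights = (rng.random((n_neurons, n_neurons)) < 0.02) * 0.02

v = np.full(n_neurons, v_rest)
total_spikes = 0

for step in range(1000):              # 1 second of simulated time
    fired = v >= v_thresh             # neurons spiking this step
    total_spikes += int(fired.sum())
    v[fired] = v_reset                # reset neurons that just spiked
    drive = 0.06 + 0.02 * rng.standard_normal(n_neurons)  # noisy input
    v += dt / tau * (v_rest - v) + drive + weights @ fired

print(f"spikes emitted in 1 s of simulated activity: {total_spikes}")
```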

Read more

Lattice QCD is not only teaching us how the strong interactions lead to the overwhelming majority of the mass of normal matter in our Universe, but holds the potential to teach us about all sorts of other phenomena, from nuclear reactions to dark matter.
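To see why that claim about mass is striking, a quick back-of-the-envelope check helps (the quark and proton masses below are standard approximate values, not figures from the article):

```latex
% Approximate masses: m_u ~ 2.2 MeV, m_d ~ 4.7 MeV, m_p ~ 938.3 MeV
\[
  \frac{2m_u + m_d}{m_p} \approx \frac{2(2.2) + 4.7}{938.3}
  \approx \frac{9.1}{938.3} \approx 1\%
\]
```

In other words, the quarks’ own masses account for only about 1% of a proton’s mass; the rest comes from the strong-interaction dynamics that lattice QCD computes from first principles.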

Later today, November 7th, physics professor Phiala Shanahan will be delivering a public lecture from Perimeter Institute, and we’ll be live-blogging it right here at 7 PM ET / 4 PM PT. You can watch the talk right here, and follow along with my commentary below. Shanahan is an expert in theoretical nuclear and particle physics and specializes in supercomputer work involving QCD, and I’m so curious what else she has to say.

Read more

Imagine building a machine so advanced and precise you need a supercomputer to help design it. That’s exactly what scientists and engineers in Germany did when building the Wendelstein 7-X experiment. The device, funded by the German federal and state governments and the European Union, is a type of fusion device called a stellarator. The new experiment’s goal is to contain a super-heated gas, called plasma, in a donut-shaped vessel using magnets that twist their way around the donut.

Read more

Can the origin of life be explained with quantum mechanics? And if so, are there quantum algorithms that could encode life itself?

We’re a little closer to finding out the answers to those big questions thanks to new research carried out with an IBM supercomputer.

Encoding behaviours related to self-replication, mutation, interaction between individuals, and (inevitably) death, a newly created quantum algorithm has been used to show that quantum computers can indeed mimic some of the patterns of biology in the real world.
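The team’s actual circuit is described in their paper; purely as a hypothetical illustration of what “encoding behaviours as gates” can look like, here is a toy Qiskit sketch in which the qubit roles, gates and angles are my own assumptions rather than the authors’ construction.

```python
# Toy sketch (not the researchers' actual algorithm) of writing
# biological behaviours as quantum gates, using Qiskit.
# Assumed roles: qubit 0 = parent genotype, qubit 1 = parent phenotype,
# qubit 2 = offspring genotype.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

rng = np.random.default_rng(1)

qc = QuantumCircuit(3)

# "Birth": prepare the parent genotype in a superposition of traits.
qc.ry(np.pi / 3, 0)

# The genotype expresses itself in the phenotype (entangling gate).
qc.cx(0, 1)

# "Mutation": a small random rotation slightly perturbs the genotype.
qc.ry(rng.normal(0.0, 0.1), 0)

# "Self-replication": pass genotype information to the offspring qubit.
# (Perfect cloning is forbidden; a CNOT copies only basis-state info.)
qc.cx(0, 2)

# Inspect the resulting three-qubit state.
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())
```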

Read more

But where advocates like Foxx mostly see the benefits of transhumanism, some critics say it raises ethical concerns in terms of risk, and others point out its potential to exacerbate social inequality.


Foxx says humans have long used technology to make up for physical limitations — think of prosthetics, hearing aids, or even telephones. More controversial technology aimed at enhancing or even extending life, like cryogenic freezing, is also charted terrain.

The transhumanist movement isn’t large, but Foxx says there is growing awareness of, and interest in, technology used to enhance or supplement physical capability.

This is perhaps unsurprising given that we live in an era in which scientists are working to create artificial intelligence that can read your mind, and millions of people spend most of their day clutching a supercomputer in their hands in the form of a smartphone.

Read more

Japan’s government is facing serious fiscal challenges, but its main science ministry appears hopeful that the nation is ready to once again back basic research in a big way. The Ministry of Education (MEXT) on 31 August announced an ambitious budget request that would allow Japan to compete for the world’s fastest supercomputer, build a replacement x-ray space observatory, and push ahead with a massive new particle detector.


Proposed successor to Super-Kamiokande, exascale computer and x-ray satellite win backing.

Read more

Realistic climate simulations require huge reserves of computational power. An LMU study now shows that new algorithms allow interactions in the atmosphere to be modeled more rapidly without loss of reliability.

Forecasting global and local climates requires the construction and testing of mathematical models. Since such models must incorporate a plethora of physical processes and interactions, climate simulations require enormous amounts of computing power. And even the best models inevitably have limitations, since the phenomena involved can never be modeled in sufficient detail. In a project carried out in the context of the DFG-funded Collaborative Research Center “Waves to Weather”, Stephan Rasp of the Institute of Theoretical Meteorology at LMU (Director: Professor George Craig) has now looked at the question of whether the application of machine learning can improve the efficacy of climate modelling. The study, which was performed in collaboration with Professor Mike Pritchard of the University of California at Irvine and Pierre Gentine of Columbia University in New York, appears in the journal PNAS.

General circulation models typically simulate the global behavior of the atmosphere on grids whose cells have dimensions of around 50 km. Even using state-of-the-art supercomputers, the relevant processes that take place in the atmosphere are simply too complex to be modelled at the necessary level of detail. One prominent example concerns the modelling of clouds, which have a crucial influence on climate. They transport heat and moisture, produce precipitation, and absorb and reflect solar radiation, for instance. Many clouds extend over distances of only a few hundred meters – much smaller than the grid cells typically used in simulations – and they are highly dynamic. Both features make them extremely difficult to model realistically. Hence today’s models lack at least one vital ingredient and, in this respect, provide only an approximate description of the Earth system.
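The study’s approach is to let a neural network stand in for such cloud-scale computations. As a rough sketch of that general workflow (with purely synthetic data and made-up variables, not the study’s model), one can fit a small neural-network emulator to reproduce a subgrid tendency and then call it in place of the expensive calculation:

```python
# Toy sketch of the general idea (not the study's actual model or data):
# train a small neural network to emulate an expensive subgrid
# parameterization, so the climate model can query the cheap emulator
# instead of resolving cloud-scale processes directly.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "coarse-grid state" inputs: temperature, humidity, vertical wind
n_samples = 5000
temperature = rng.normal(290.0, 10.0, n_samples)    # K
humidity = rng.uniform(0.0, 0.02, n_samples)         # kg/kg
w_vertical = rng.normal(0.0, 0.5, n_samples)         # m/s
X = np.column_stack([temperature, humidity, w_vertical])

# Stand-in for the "true" subgrid heating tendency that an expensive
# high-resolution computation would produce (purely synthetic here).
y = 2.0 * humidity * np.maximum(w_vertical, 0.0) * (temperature - 273.15) \
    + rng.normal(0.0, 0.01, n_samples)

# Fit a small neural-network emulator on most of the data.
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000,
                        random_state=0)
emulator.fit(X[:4000], y[:4000])

# At run time, the climate model would call the emulator instead of
# performing the cloud-scale computation itself.
print("held-out R^2:", emulator.score(X[4000:], y[4000:]))
```

The point of this design is speed: once trained, evaluating a small network is far cheaper than resolving cloud-scale processes directly, which is where the reported gain in simulation speed without loss of reliability comes from.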

Read more