
Imagine building a machine so advanced and precise you need a supercomputer to help design it. That’s exactly what scientists and engineers in Germany did when building the Wendelstein 7-X experiment. The device, funded by the German federal and state governments and the European Union, is a type of fusion device called a stellarator. The new experiment’s goal is to contain a super-heated gas, called plasma, in a donut-shaped vessel using magnets that twist their way around the donut.

Read more

Can the origin of life be explained with quantum mechanics? And if so, are there quantum algorithms that could encode life itself?

We’re a little closer to finding out the answers to those big questions thanks to new research carried out on an IBM quantum computer.

Encoding behaviours related to self-replication, mutation, interaction between individuals, and (inevitably) death, a newly created quantum algorithm has been used to show that quantum computers can indeed mimic some of the patterns of biology in the real world.
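The algorithm itself is only summarized above, so the sketch below is a toy illustration rather than the published procedure: in plain NumPy, a single qubit stands in for a "genotype", a small random rotation plays the role of "mutation", and a CNOT onto a blank qubit stands in for "self-replication". All of the encodings and names here are assumptions made for illustration; notably, the no-cloning theorem forbids copying an arbitrary quantum state, so only the Z-basis expectation value (classical information) gets passed to the offspring.

```python
import numpy as np

# Toy illustration only (not the published algorithm): a single-qubit
# "genotype", a random rotation as "mutation", and a CNOT that copies the
# Z-basis expectation value onto a fresh qubit as "self-replication".

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
ZERO = np.array([1, 0], dtype=complex)          # |0>
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z

def ry(theta):
    """Rotation about the Y axis by angle theta."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

# CNOT with qubit 0 as control and qubit 1 as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def z_expectation(state, qubit, n_qubits):
    """<Z> on one qubit of an n-qubit state vector."""
    op = np.array([[1]], dtype=complex)
    for q in range(n_qubits):
        op = np.kron(op, Z if q == qubit else I2)
    return float(np.real(state.conj() @ op @ state))

# "Individual": a genotype qubit prepared in a superposition
genotype = ry(0.8) @ ZERO

# "Mutation": a small random rotation of the genotype
genotype = ry(rng.normal(scale=0.2)) @ genotype

# "Self-replication": entangle the genotype with a blank qubit; the CNOT
# copies its Z expectation value (but not its full state) to the offspring.
pair = CNOT @ np.kron(genotype, ZERO)

print("parent    <Z>:", round(z_expectation(pair, 0, 2), 3))
print("offspring <Z>:", round(z_expectation(pair, 1, 2), 3))
```

Running the snippet shows the parent and offspring ending up with the same ⟨Z⟩, which is the flavour of "inheritance" a quantum circuit can provide.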

Read more

But where advocates like Foxx mostly see the benefits of transhumanism, some critics say it raises ethical concerns in terms of risk, and others point out its potential to exacerbate social inequality.


Foxx says humans have long used technology to make up for physical limitations — think of prosthetics, hearing aids, or even telephones. More controversial technology aimed at enhancing or even extending life, like cryogenic freezing, is also charted terrain.

The transhumanist movement isn’t large, but Foxx says there is a growing awareness and interest in technology used to enhance or supplement physical capability.

This is perhaps unsurprising given that we live in an era where scientists are working to create artificial intelligence that can read your mind and millions of people spend most of their day clutching a supercomputer in their hands in the form of a smartphone.

Japan’s government is facing serious fiscal challenges, but its main science ministry appears hopeful that the nation is ready to once again back basic research in a big way. The Ministry of Education (MEXT) on 31 August announced an ambitious budget request that would allow Japan to compete for the world’s fastest supercomputer, build a replacement x-ray space observatory, and push ahead with a massive new particle detector.


Proposed successor to Super-Kamiokande, exascale computer and x-ray satellite win backing.

Read more

Realistic climate simulations require huge reserves of computational power. An LMU study now shows that new algorithms allow interactions in the atmosphere to be modeled more rapidly without loss of reliability.

Forecasting global and local climates requires the construction and testing of mathematical models. Since such models must incorporate a plethora of physical processes and interactions, climate simulations require enormous amounts of computing power. And even the best models inevitably have limitations, since the phenomena involved can never be modeled in sufficient detail. In a project carried out in the context of the DFG-funded Collaborative Research Center “Waves to Weather”, Stephan Rasp of the Institute of Theoretical Meteorology at LMU (Director: Professor George Craig) has now looked at the question of whether the application of machine learning can improve the efficacy of climate modelling. The study, which was performed in collaboration with Professor Mike Pritchard of the University of California at Irvine and Pierre Gentine of Columbia University in New York, appears in the journal PNAS.

General circulation models typically simulate the global behavior of the atmosphere on grids whose cells have dimensions of around 50 km. Even using state-of-the-art supercomputers, the relevant processes that take place in the atmosphere are simply too complex to be modelled at the necessary level of detail. One prominent example concerns the modelling of clouds, which have a crucial influence on climate. They transport heat and moisture, produce precipitation, and absorb and reflect solar radiation, for instance. Many clouds extend over distances of only a few hundred meters, much smaller than the grid cells typically used in simulations – and they are highly dynamic. Both features make them extremely difficult to model realistically. Hence today’s models lack at least one vital ingredient, and in this respect provide only an approximate description of the Earth system.
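For a sense of what "learning" a sub-grid process looks like in practice, here is a minimal, hedged sketch (not the study's code): a small neural network is fit to map coarse grid-cell variables to a sub-grid heating tendency. The input names and the synthetic target formula are assumptions for illustration; in the actual study the training data comes from high-resolution cloud-resolving simulations.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Illustrative stand-in for a learned sub-grid parameterization: map a
# coarse grid cell's state (temperature, humidity, vertical velocity) to a
# sub-grid heating tendency. The "truth" below is a made-up formula; in the
# real study the targets come from high-resolution cloud-resolving runs.
rng = np.random.default_rng(42)
n = 20_000

temperature = rng.uniform(250.0, 310.0, n)   # K
humidity    = rng.uniform(0.0, 1.0, n)       # relative humidity (0-1)
w_velocity  = rng.normal(0.0, 1.0, n)        # vertical velocity (m/s)

X = np.column_stack([temperature, humidity, w_velocity])

# Synthetic target: heating is largest for warm, moist, rising air.
y = (np.maximum(w_velocity, 0.0) * humidity
     * np.exp((temperature - 280.0) / 30.0)
     + rng.normal(0.0, 0.05, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small fully connected network acting as the cheap emulator.
emulator = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0),
)
emulator.fit(X_train, y_train)

print("R^2 on held-out grid cells:", round(emulator.score(X_test, y_test), 3))
```

Once trained, an emulator like this can be evaluated in a fraction of the time the explicit sub-grid calculation would take, which is roughly where the reported speed-up comes from: the expensive treatment of sub-grid processes is replaced by a cheap learned approximation, ideally without loss of reliability.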

Read more

An AI is set to try to work out how a potentially limitless supply of energy can be harnessed on Earth.

It could finally solve the mysteries of fusion power, letting researchers capture and control the process that powers the sun and stars.

Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University hope to harness a massive new supercomputer to work out how the doughnut-shaped devices, known as tokamaks, can be used.

Read more

Researchers involved in the Blue Brain Project – which aims to create a digital reconstruction of the brain – have announced the deployment of a next-generation supercomputer.


Ecole Polytechnique Fédérale de Lausanne (EPFL), the Swiss university and research institute developing the Blue Brain Project, has announced the selection of Hewlett Packard Enterprise (HPE) to build a next-generation supercomputer. This will model and simulate the mammalian brain in greater detail than ever before. The powerful new machine, called “Blue Brain 5”, will be dedicated to simulation neuroscience, in particular simulation-based research, analysis and visualisation, to advance the understanding of the brain.

Read more

An international team of scientists from Eindhoven University of Technology, the University of Texas at Austin, and the University of Derby has developed a revolutionary method that quadratically accelerates artificial intelligence (AI) training algorithms. This brings full AI capability within reach of inexpensive computers, and within one to two years it could enable supercomputers to run artificial neural networks quadratically larger than those possible today. The scientists presented their method on June 19 in the journal Nature Communications.

Artificial neural networks (ANNs) are at the very heart of the AI revolution that is shaping every aspect of society and technology. But the ANNs we have been able to handle so far are nowhere near solving very complex problems. The very latest supercomputers would struggle with a 16-million-neuron network (roughly the size of a frog brain), while a powerful desktop computer would take over a dozen days to train a mere 100,000-neuron network.
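The excerpt above does not spell out how the quadratic speed-up is achieved, but a saving of that order is what you get when a densely connected layer, whose weight count grows with the square of its width, is replaced by a sparsely connected one whose weight count grows only linearly. The sketch below is a back-of-the-envelope illustration of that scaling argument, not the published method.

```python
# Back-of-the-envelope scaling (illustration only, not the published
# method): a dense layer connecting n neurons to n neurons stores n^2
# weights, while a sparse layer with a fixed number of connections per
# neuron stores only k * n weights.

def dense_weights(n: int) -> int:
    """Weight count of a fully connected n-to-n layer."""
    return n * n

def sparse_weights(n: int, k: int = 32) -> int:
    """Weight count of an n-to-n layer with k connections per neuron."""
    return k * n

for n in (1_000, 100_000, 16_000_000):  # up to the "frog brain" scale above
    ratio = dense_weights(n) / sparse_weights(n)
    print(f"n = {n:>10,}: dense needs {ratio:,.0f}x more weights than sparse")
```

At the 16-million-neuron scale mentioned above, the dense layout needs half a million times more weights than a 32-connection-per-neuron sparse one, and a gap of that size is what decides whether a network fits on today's hardware at all.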

Read more