Archive for the ‘supercomputing’ category: Page 30

Mar 11, 2022

Supercomputers Simulated a Black Hole And Found Something We’ve Never Seen Before

Posted in categories: cosmology, supercomputing

While black holes might always be black, they do occasionally emit intense bursts of light from just outside their event horizons. What exactly caused these flares had long been a mystery to science.

That mystery was solved recently by a team of researchers who used a series of supercomputers to model black holes’ magnetic fields in far more detail than any previous effort. The simulations point to the breaking and remaking of super-strong magnetic fields as the source of these brilliant flares.

Scientists have known for some time that black holes are surrounded by powerful magnetic fields. Typically, these fields are just one part of a complex dance of forces, material, and other phenomena around a black hole.

Mar 6, 2022

Detailed Supercomputer Simulation of the Universe Creates Structures Very Similar to the Milky Way

Posted in categories: cosmology, evolution, physics, supercomputing

In their pursuit of understanding cosmic evolution, scientists rely on a two-pronged approach. Using advanced instruments, astronomical surveys attempt to look farther and farther into space (and back in time) to study the earliest periods of the Universe. At the same time, scientists create simulations that attempt to model how the Universe has evolved based on our understanding of physics. When the two match, astrophysicists and cosmologists know they are on the right track!

In recent years, ever more detailed simulations have been run on ever more sophisticated supercomputers, yielding increasingly accurate results. Recently, an international team of researchers led by the University of Helsinki conducted the most accurate simulations to date. Known as SIBELIUS-DARK, these simulations reproduced the evolution of our corner of the cosmos from the Big Bang to the present day.

In addition to the University of Helsinki, the team included researchers from the Institute for Computational Cosmology (ICC) and the Centre for Extragalactic Astronomy at Durham University, the Lorentz Institute for Theoretical Physics at Leiden University, the Institut d’Astrophysique de Paris, and the Oskar Klein Centre at Stockholm University. The team’s results are published in the Monthly Notices of the Royal Astronomical Society.

Mar 3, 2022

Simulation of a Human-Scale Cerebellar Network Model on the K Computer

Posted in categories: neuroscience, robotics/AI, supercomputing

Circa 2020: a simulation of the human brain.

Computer simulation of the human brain at individual-neuron resolution is an ultimate goal of computational neuroscience. The Japanese flagship supercomputer, K, provides unprecedented computational capability toward this goal. The cerebellum contains 80% of the neurons in the whole brain; computer simulation of a human-scale cerebellum is therefore a challenge for modern supercomputers. In this study, we built a human-scale spiking network model of the cerebellum, composed of 68 billion spiking neurons, on the K computer. As a benchmark, we performed a computer simulation of a cerebellum-dependent eye-movement task known as the optokinetic response. We succeeded in reproducing plausible neuronal activity patterns that are observed experimentally in animals. The model was built on dedicated neural network simulation software called MONET (Millefeuille-like Organization NEural neTwork), which simulates layered, sheet-like neural networks parallelized by tile partitioning. To examine the scalability of the MONET simulator, we repeatedly performed simulations while changing the number of compute nodes from 1,024 to 82,944 and measured the computational time. We observed a good weak-scaling property for our cerebellar network model. Using all 82,944 nodes, we succeeded in simulating a human-scale cerebellum for the first time, although the simulation ran 578 times slower than real time. These results suggest that the K computer is already capable of simulating a human-scale cerebellar model with the aid of the MONET simulator.
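
The tile-partitioning approach is easy to picture: the 2D neural sheet is cut into tiles, each tile is assigned to a compute node, and tiles exchange boundary (“halo”) spike activity every time step. Below is a minimal Python sketch of that structure. Everything in it is an assumption for illustration: the sheet size, tile size, and toy leaky integrate-and-fire dynamics are not MONET’s actual scheme, and the serial tile loop stands in for work that would run on separate nodes.

```python
import numpy as np

# A minimal sketch (not MONET itself) of tile-partitioned simulation of a
# layered 2D neural sheet: the sheet is cut into square tiles, each tile
# would be assigned to one compute node, and tiles exchange boundary
# ("halo") spike activity every step. All sizes and dynamics are toy values.

SHEET = 64        # neurons per side of the toy 2D sheet
TILE = 16         # neurons per side of one tile -> (SHEET // TILE)**2 tiles
THRESHOLD = 1.0   # firing threshold of the toy leaky integrate-and-fire cell

rng = np.random.default_rng(0)

def step_tile(v, halo_drive, noise):
    """Advance one tile one step: leak, integrate halo input + noise, fire."""
    v = 0.9 * v + 0.5 * halo_drive + noise
    fired = v > THRESHOLD
    v[fired] = 0.0                      # reset neurons that fired
    return v, fired

v = rng.uniform(0.0, 1.0, (SHEET, SHEET))   # membrane potentials
spikes = np.zeros((SHEET, SHEET))           # last step's spike map

for t in range(100):
    new_v = np.empty_like(v)
    new_spikes = np.zeros_like(spikes)
    # Serial stand-in for the parallel part: each tile would run on its own
    # node, receiving only the halo region of its neighbors' spikes.
    for i in range(0, SHEET, TILE):
        for j in range(0, SHEET, TILE):
            halo = spikes[max(i - 1, 0):i + TILE + 1, max(j - 1, 0):j + TILE + 1]
            drive = halo.mean()         # crude aggregate of neighboring spikes
            noise = rng.uniform(0.0, 0.2, (TILE, TILE))   # background input
            tv, tf = step_tile(v[i:i + TILE, j:j + TILE], drive, noise)
            new_v[i:i + TILE, j:j + TILE] = tv
            new_spikes[i:i + TILE, j:j + TILE] = tf
    v, spikes = new_v, new_spikes

print(f"mean firing rate on the last step: {spikes.mean():.3f}")
```

Weak scaling, the property the authors measured, means growing the sheet in proportion to the node count so each node keeps a fixed amount of work; good weak scaling means the time per simulated step stays roughly flat as both grow.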

Computer simulation of the whole human brain is an ambitious challenge in the field of computational neuroscience and high-performance computing (Izhikevich, 2005; Izhikevich and Edelman, 2008; Amunts et al., 2016). The human brain contains approximately 100 billion neurons. While the cerebral cortex occupies 82% of the brain mass, it contains only 19% (16 billion) of all neurons. The cerebellum, which occupies only 10% of the brain mass, contains 80% (69 billion) of all neurons (Herculano-Houzel, 2009). Thus, we could say that 80% of human-scale whole brain simulation will be accomplished when a human-scale cerebellum is built and simulated on a computer. The human cerebellum plays crucial roles not only in motor control and learning (Ito, 1984, 2000) but also in cognitive tasks (Ito, 2012; Buckner, 2013). In particular, the human cerebellum seems to be involved in human-specific tasks, such as bipedal locomotion, natural language processing, and use of tools (Lieberman, 2014).

Feb 25, 2022

New simulations refine axion mass, refocusing dark matter search

Posted in categories: cosmology, physics, supercomputing

Physicists searching—unsuccessfully—for today’s most favored candidate for dark matter, the axion, have been looking in the wrong place, according to a new supercomputer simulation of how axions were produced shortly after the Big Bang 13.8 billion years ago.

Using new calculational techniques and one of the world’s largest computers, Benjamin Safdi, assistant professor of physics at the University of California, Berkeley; Malte Buschmann, a postdoctoral research associate at Princeton University; and colleagues at MIT and Lawrence Berkeley National Laboratory simulated the era when axions would have been produced, approximately a billionth of a billionth of a billionth of a second after the universe came into existence and after the epoch of cosmic inflation.

The simulations, performed at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC), found the axion’s mass to be more than twice as big as theorists and experimenters have thought: between 40 and 180 microelectron volts (micro-eV, or μeV), or about one 10-billionth the mass of the electron. There are indications, Safdi said, that the mass is close to 65 μeV. Since physicists began looking for the axion 40 years ago, estimates of the mass have ranged widely, from a few μeV to 500 μeV.
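
The electron-mass comparison is easy to verify. A quick arithmetic sketch, using the standard electron rest mass-energy of about 511 keV together with the article’s suggested 65 μeV:

```python
# Sanity-check the article's comparison: is ~65 micro-eV really about one
# 10-billionth of the electron's mass? (511 keV is the standard constant;
# 65 micro-eV is the value suggested in the article.)
electron_mass_ev = 511e3   # electron rest mass-energy in eV (~511 keV)
axion_mass_ev = 65e-6      # suggested axion mass in eV (65 micro-eV)

ratio = axion_mass_ev / electron_mass_ev
print(f"axion/electron mass ratio: {ratio:.2e}")  # ~1.27e-10
# ~1.3e-10 is indeed about one 10-billionth, matching the article.
```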

Feb 22, 2022

NanoWire Tech Could Usher In a New Age of Supercomputing

Posted in categories: economics, energy, government, nanotechnology, physics, supercomputing

Building a better supercomputer is something many tech companies, research outfits, and government agencies have been trying to do for decades. There’s one physical constraint they’ve been unable to avoid, though: conducting electricity for supercomputing is expensive.

Not in an economic sense—although, yes, in an economic sense, too—but in terms of energy. The more current you push through a conductor, the more energy its resistance dissipates (electricians and physics majors, forgive the simplification), which means more wasted energy in the form of heat and vibration. And you can’t let things get too hot, so you have to expend more energy to cool down your circuits.
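
The loss being described is Joule heating: a conductor with resistance R carrying current I dissipates P = I²R as heat. A minimal sketch with made-up numbers (neither figure comes from the article):

```python
# Joule heating: power lost in a conductor is P = I^2 * R.
# Both values below are illustrative, not from the article.
current_amps = 100.0     # current drawn by a hypothetical compute rack
resistance_ohms = 0.05   # resistance of its power-delivery path

waste_heat_watts = current_amps ** 2 * resistance_ohms
print(f"wasted as heat: {waste_heat_watts:.0f} W")  # 500 W at these values
```

Because the loss grows with the square of the current, halving the current cuts the waste fourfold; superconducting nanowires attack the other factor by driving R to zero.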

Feb 16, 2022

The case for techno-optimism: Is the world about to enter an era of mass flourishing?

Posted in categories: biotech/medical, information science, supercomputing

Instead of relying on a fixed catalogue of available materials or undergoing trial-and-error attempts to come up with new ones, engineers can turn to algorithms running on supercomputers to design unique materials, based on a “materials genome,” with properties tailored to specific needs. Among the new classes of emerging materials are “transient” electronics and bioelectronics that portend applications and industries comparable in scale to those that followed the advent of silicon-based electronics.

In each of the three technological spheres, we find the Cloud increasingly woven into the fabric of innovation. The Cloud itself is, synergistically, evolving and expanding from the advances in new materials and machines, creating a virtuous circle of self-amplifying progress. It is a unique feature of our emerging century that constitutes a catalyst for innovation and productivity, the likes of which the world has never seen.

Jan 30, 2022

Quantum Computers Could Crack Bitcoin. Here’s What It Would Take

Posted in categories: bitcoin, chemistry, cryptocurrencies, cybercrime/malcode, encryption, energy, mathematics, quantum physics, supercomputing

Quantum computers could cause unprecedented disruption in both good and bad ways, from cracking the encryption that secures our data to solving some of chemistry’s most intractable puzzles. New research has given us more clarity about when that might happen.

Modern encryption schemes rely on fiendishly difficult math problems that would take even the largest supercomputers centuries to crack. But the unique capabilities of a quantum computer mean that at sufficient size and power these problems become simple, rendering today’s encryption useless.

That’s a big problem for cybersecurity, and it also poses a major challenge for cryptocurrencies, which use cryptographic keys to secure transactions. If someone could crack the underlying encryption scheme used by Bitcoin, for instance, they would be able to falsify these keys and alter transactions to steal coins or carry out other fraudulent activity.
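
A toy sketch of why those keys are safe today: public-key cryptography rests on functions that are fast to compute forward but impractically slow to invert. The modular-exponentiation example below is a stand-in with tiny, hypothetical numbers; Bitcoin actually uses ECDSA over the secp256k1 curve, where keys are on the order of 2^256 and brute force is hopeless.

```python
# One-way function demo: computing public = g^private mod p is fast,
# but recovering the private exponent (the discrete logarithm) means
# brute-force search. All numbers here are tiny and hypothetical.
p, g = 2_147_483_647, 7   # prime modulus (2^31 - 1) and a primitive root
private_key = 1_234_567   # the secret

public_key = pow(g, private_key, p)  # fast, even for enormous exponents

def brute_force(target):
    """Try g^1, g^2, ... until one matches the public value."""
    x = 1
    for k in range(1, p):
        x = (x * g) % p
        if x == target:
            return k

print(brute_force(public_key) == private_key)  # True, but only because p is tiny
```

Shor’s algorithm on a sufficiently large quantum computer inverts exactly this kind of function in polynomial time, which is the threat the article describes.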

Jan 30, 2022

The Terrifying Truth Behind Meta’s New Supercomputer

Posted in categories: robotics/AI, supercomputing

Meta has just revealed its AI supercomputer, which surpasses its competitors in capabilities and performance. Meta AI Research is using data from sites such as Facebook and Instagram to train and improve its models, in the hopes of controlling and influencing its users and for other future secret projects. What other dystopian things will come from this, one can only imagine.

TIMESTAMPS:
00:00 Meta’s Secret Weapon.
01:41 The Emergence of AI Supremacy.
04:55 What are Supercomputers used for?
08:03 Is Human AI Possible?
10:34 Last Words.

#meta #supercomputer #dystopia

Jan 28, 2022

Meta’s new AI supercomputer: 16,000 x GPUs, insane 175PB bulk storage

Posted in categories: robotics/AI, supercomputing

Meta’s new AI Research SuperCluster (RSC) is a metaverse and AI beast — 16,000 GPUs, 16TB/sec training data, 175PB bulk storage.

Jan 27, 2022

Meta Is Making a Monster AI Supercomputer for the Metaverse

Posted in categories: encryption, information science, internet, robotics/AI, security, supercomputing

Though Meta didn’t give numbers on RSC’s current top speed, in terms of raw processing power it appears comparable to the Perlmutter supercomputer, ranked fifth fastest in the world. At the moment, RSC runs on 6,800 NVIDIA A100 graphics processing units (GPUs), a specialized chip once limited to gaming but now used more widely, especially in AI. Already, the machine is processing computer vision workflows 20 times faster and large language models (like GPT-3) 3 times faster. The faster a company can train models, the more models it can train and improve in any given year.
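
A back-of-envelope check on that Perlmutter comparison, using NVIDIA’s published A100 peak of 312 teraflops for dense FP16 tensor math (the aggregation is mine, not Meta’s):

```python
# Rough aggregate peak for 6,800 A100s. 312 TFLOPS is NVIDIA's published
# dense FP16 tensor-core peak per A100; real training throughput is lower.
a100_fp16_tflops = 312
gpus = 6_800

total_pflops = a100_fp16_tflops * gpus / 1_000
print(f"aggregate peak: {total_pflops:,.0f} PFLOPS")  # ~2,122 PFLOPS (~2.1 exaflops)
```

Peak figures overstate sustained performance, but roughly two exaflops of FP16 peak makes the comparison to a top-five machine plausible.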

In addition to pure speed, RSC will give Meta the ability to train algorithms on its massive hoard of user data. In a blog post, the company said that they previously trained AI on public, open-source datasets, but RSC will use real-world, user-generated data from Meta’s production servers. This detail may make more than a few people blanch, given the numerous privacy and security controversies Meta has faced in recent years. In the post, the company took pains to note the data will be carefully anonymized and encrypted end-to-end. And, they said, RSC won’t have any direct connection to the larger internet.

To accommodate Meta’s enormous training data sets and further increase training speed, the installation will grow to include 16,000 GPUs and an exabyte of storage—equivalent to 36,000 years of high-quality video—later this year. Once complete, Meta says RSC will serve training data at 16 terabytes per second and operate at a top speed of 5 exaflops.
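
The “36,000 years of high-quality video” comparison checks out arithmetically if “high-quality” means a streaming-grade bitrate of about 7 megabits per second (the bitrate is my assumption; the exabyte is Meta’s figure):

```python
# How many years of video fit in an exabyte at ~7 Mbit/s (a typical
# 1080p streaming bitrate -- an assumption, not a figure from Meta)?
exabyte_bits = 1e18 * 8                 # 1 EB expressed in bits
video_bitrate_bps = 7e6                 # assumed bits per second of video
seconds_per_year = 365.25 * 24 * 3600

years = exabyte_bits / video_bitrate_bps / seconds_per_year
print(f"{years:,.0f} years of video")   # ~36,000 years, matching the article
```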

Page 30 of 82