Archive for the ‘supercomputing’ category: Page 52

Sep 21, 2019

Ghost post! Google creates world’s most powerful computer, NASA ‘accidentally reveals’ …and then publication vanishes

Posted in categories: quantum physics, supercomputing

Google’s new quantum computer reportedly spends mere minutes on the tasks the world’s top supercomputers would need several millennia to perform. The media found out about this after NASA “accidentally” shared the firm’s research.

The software engineers at Google have built the world’s most powerful computer, the Financial Times and Fortune magazine reported on Friday, citing the company’s now-removed research paper. The paper is said to have been posted on a website hosted by NASA, which partners with Google, and later quietly taken down without explanation.

Google and NASA have refused to comment on the matter. A source within the IT giant, however, told Fortune that NASA had “accidentally” published the paper before its team could verify its findings.

Sep 20, 2019

HPE to acquire supercomputer manufacturer Cray for $1.3 billion

Posted in category: supercomputing

Hewlett Packard Enterprise has reached an agreement to acquire Cray, the manufacturer of supercomputing systems.

HPE says it will pay $35 per share, in a transaction valued at approximately $1.3 billion, net of cash.

Antonio Neri, president and CEO of HPE, says: “Answers to some of society’s most pressing challenges are buried in massive amounts of data.”

Sep 13, 2019

Brain-inspired computing could tackle big problems in a small way

Posted in categories: neuroscience, supercomputing

While computers have become smaller and more powerful and supercomputers and parallel computing have become the standard, we are about to hit a wall in energy and miniaturization. Now, Penn State researchers have designed a 2-D device that can provide more than yes-or-no answers and could be more brainlike than current computing architectures.

“Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending ‘Dark Silicon’ era that presents a severe threat to multi-core processor technology,” the researchers note in today’s (Sept 13) online issue of Nature Communications.

The Dark Silicon era is already upon us to some extent. It refers to the inability to power up all or most of the devices on a computer chip at once, because doing so would generate too much heat. Von Neumann architecture is the standard structure of most modern computers and relies on a digital approach of “yes” or “no” answers, where program instructions and data are stored in the same memory and share the same communications channel.
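To make that bottleneck concrete, here is a minimal toy sketch in Python of the von Neumann pattern just described; the three-instruction ISA is invented purely for illustration. Instructions and data sit in one shared memory and travel over the same channel, so every step of the program serializes through a single fetch path.

```python
# Toy von Neumann machine: instructions and data share one memory and
# one fetch path, so every step serializes through the same "bus".
# The three-opcode ISA here is made up purely for illustration.

memory = [
    ("LOAD", 6),   # 0: acc = memory[6]
    ("ADD", 7),    # 1: acc += memory[7]
    ("STORE", 8),  # 2: memory[8] = acc
    ("HALT", 0),   # 3: stop
    None, None,
    40,            # 6: data word
    2,             # 7: data word
    0,             # 8: result goes here
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]       # instruction fetch uses the same memory...
    pc += 1
    if op == "LOAD":
        acc = memory[addr]      # ...as every data access (the bottleneck)
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])  # -> 42
```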

Sep 7, 2019

Scientists develop a deep learning method to solve a fundamental problem in statistical physics

Posted in categories: biotech/medical, robotics/AI, supercomputing

A team of scientists at Freie Universität Berlin has developed an Artificial Intelligence (AI) method that provides a fundamentally new solution to the “sampling problem” in statistical physics. The sampling problem is that important properties of materials and molecules cannot, in practice, be computed by directly simulating the motion of atoms in the computer, because the required computational capacities are too vast even for supercomputers. The team developed a deep learning method that speeds up these calculations massively, making them feasible for previously intractable applications. “AI is changing all areas of our life, including the way we do science,” explains Dr. Frank Noé, professor at Freie Universität Berlin and main author of the study. Several years ago, so-called deep learning methods bested human experts in pattern recognition—be it the reading of handwritten texts or the recognition of cancer cells from medical images. “Since these breakthroughs, AI research has skyrocketed. Every day, we see new developments in application areas where traditional methods have left us stuck for years. We believe our approach could be such an advance for the field of statistical physics.” The results were published in Science.

Statistical physics aims to calculate the properties of materials or molecules based on the interactions of their constituent components—be it a metal’s melting temperature, or whether an antibiotic can bind to the molecules of a bacterium and thereby disable it. With statistical methods, such properties can be calculated in the computer, and the properties of the material or the efficiency of a specific medication can be improved. One of the main problems in this calculation is the vast computational cost, explains Simon Olsson, a coauthor of the study: “In principle we would have to consider every single structure, that means every way to position all the atoms in space, compute its probability, and then take their average. But this is impossible because the number of possible structures is astronomically large even for small molecules.”
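To see why the cost explodes, consider a minimal brute-force sketch, assuming a toy chain “molecule” of two-state sites and an invented energy function: a Boltzmann average must visit every configuration, and the number of configurations doubles with each added site.

```python
import itertools, math

kT = 1.0  # temperature in units of the Boltzmann constant (toy value)

def energy(state):
    # Hypothetical toy energy: neighboring sites prefer to agree.
    return -sum(a * b for a, b in zip(state, state[1:]))

def boltzmann_average(n_sites):
    # Exact average energy by brute-force enumeration: 2**n_sites terms.
    z = 0.0
    e_avg = 0.0
    for state in itertools.product((-1, 1), repeat=n_sites):
        e = energy(state)
        w = math.exp(-e / kT)   # Boltzmann weight of this structure
        z += w
        e_avg += w * e
    return e_avg / z

print(boltzmann_average(10))   # 1,024 states: instant
# boltzmann_average(300) would need 2**300 states -- more than atoms
# in the observable universe. That is the sampling problem.
```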

Sep 6, 2019

Secretary Perry Stands Up Office for Artificial Intelligence and Technology

Posted in categories: biotech/medical, cybercrime/malcode, robotics/AI, supercomputing, sustainability

WASHINGTON, D.C. – Today, U.S. Secretary of Energy Rick Perry announced the establishment of the DOE Artificial Intelligence and Technology Office (AITO). The Secretary has established the office to serve as the coordinating hub for the work being done across the DOE enterprise in Artificial Intelligence. This action has been taken as part of the President’s call for a national AI strategy to ensure AI technologies are developed to positively impact the lives of Americans.

DOE-fueled AI is already being used to strengthen our national security and cybersecurity, improve grid resilience, increase environmental sustainability, enable smarter cities, improve water resource management, speed the discovery of new materials and compounds, and further the understanding, prediction, and treatment of disease. DOE’s National Labs are home to four of the top ten fastest supercomputers in the world, and we’re currently building three next-generation exascale machines, which will be even faster and more capable AI computers.

“The world is in the midst of the Golden Age of AI, and DOE’s world class scientific and computing capabilities will be critical to securing America’s dominance in this field,” said Secretary Perry. “This new office housed within the Department of Energy will concentrate our existing efforts while also facilitating partnerships and access to federal data, models and high performance computing resources for America’s AI researchers. Its mission will be to elevate, accelerate and expand DOE’s transformative work to accelerate America’s progress in AI for years to come.”

Aug 28, 2019

AI learns to model our Universe

Posted in categories: particle physics, robotics/AI, space, supercomputing

Researchers have successfully created a model of the Universe using artificial intelligence, reports a new study.

Researchers seek to understand our Universe by making models that match observations. Historically, they have been able to model simple or highly simplified physical systems, jokingly dubbed the “spherical cows,” with pencil and paper. Later, the arrival of computers enabled them to model complex phenomena with numerical simulations. For example, researchers have programmed supercomputers to simulate the motion of billions of particles through billions of years of cosmic time, a procedure known as N-body simulation, in order to study how the Universe evolved to what we observe today.
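The idea behind such a simulation can be sketched in a few lines. Below is a toy direct-summation N-body step in Python, with invented units and a simple Euler update; production cosmology codes use tree or mesh force solvers and an expanding background rather than this O(N²) loop.

```python
import numpy as np

def nbody_step(pos, vel, mass, dt, g=1.0, soft=1e-2):
    """One direct-summation gravity step (O(N^2) pairwise forces).
    Toy units and a plain Euler update, for illustration only."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                         # vectors to all others
        r2 = (d ** 2).sum(axis=1) + soft ** 2    # softened squared distances
        r2[i] = np.inf                           # exclude self-force
        acc[i] = (g * mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return pos + vel * dt, vel + acc * dt

# Evolve a small random "universe" of 100 particles for a few steps.
rng = np.random.default_rng(0)
pos = rng.standard_normal((100, 3))
vel = np.zeros((100, 3))
mass = np.ones(100)
for _ in range(10):
    pos, vel = nbody_step(pos, vel, mass, dt=0.01)
```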

“Now with machine learning, we have developed the first neural network model of the Universe, and demonstrated there’s a third route to making predictions, one that combines the merits of both analytic calculation and numerical simulation,” said Yin Li, a postdoctoral researcher at the Kavli Institute for the Physics and Mathematics of the Universe, University of Tokyo, and jointly the University of California, Berkeley.

Aug 25, 2019

Researchers observe spontaneous occurrence of skyrmions in atomically thin cobalt films

Posted in categories: particle physics, quantum physics, supercomputing

Since their experimental discovery, magnetic skyrmions—tiny magnetic knots—have moved into the focus of research. Scientists from Hamburg and Kiel have now been able to show that individual magnetic skyrmions with a diameter of only a few nanometers can be stabilized in magnetic metal films even without an external magnetic field. They report on their discovery in the journal Nature Communications.

The existence of magnetic skyrmions as particle-like objects was predicted theoretically 30 years ago, but could only be proven experimentally in 2013. Skyrmions with diameters from micrometers down to a few nanometers have been discovered in different magnetic material systems. Although they can be generated within an area of just a few atoms and manipulated, they show a high stability against external influences. This makes them promising candidates for future data storage or logic devices. In order to be competitive for technological applications, however, skyrmions must not only be very small, but also stable without an applied magnetic field.

Researchers at the universities of Hamburg and Kiel have now taken an important step in this direction. On the basis of quantum mechanical numerical calculations carried out on the supercomputers of the North-German Supercomputing Alliance (HLRN), the physicists from Kiel were able to predict that individual skyrmions with a diameter of only a few nanometers would appear in an atomically thin, ferromagnetic cobalt film. “The stability of the magnetic knots in these films is due to an unusual competition between different magnetic interactions,” says Sebastian Meyer, Ph.D. student in Prof. Stefan Heinze’s research group at Kiel University.
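The flavor of that competition can be illustrated with a toy one-dimensional spin chain: a ferromagnetic exchange term favors aligned neighbors, while a Dzyaloshinskii-Moriya-like term favors a fixed sense of rotation, and their balance selects a finite twist. The couplings below are invented for illustration and are not values from the study.

```python
import numpy as np

def chain_energy(theta, j=1.0, d=0.5):
    """Energy of a 1D chain of spins given by in-plane angles theta.
    Exchange (-j cos) favors parallel spins; a DMI-like term (-d sin)
    favors a fixed sense of rotation. Their competition selects a
    finite twist angle, a 1D caricature of skyrmion formation.
    Couplings j and d are toy values, not taken from the paper."""
    dtheta = np.diff(theta)
    return float(-(j * np.cos(dtheta) + d * np.sin(dtheta)).sum())

# The per-bond energy -j*cos(x) - d*sin(x) is minimized at
# x = arctan(d / j), i.e. a uniform twist rather than alignment:
twist = np.arctan2(0.5, 1.0)
aligned = np.zeros(50)
twisted = np.arange(50) * twist
print(chain_energy(aligned), chain_energy(twisted))  # twisted is lower
```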

Aug 13, 2019

Scientists tasked a supercomputer with building millions of simulated universes

Posted in categories: cosmology, supercomputing

Figuring out how our reality took shape over billions of years is no easy task for scientists. Theories about how the Big Bang played out and the immediate aftermath are a dime a dozen, but researchers led by a team from the University of Arizona think they might stumble upon some of the secrets of galaxy formation by asking a supercomputer to simulate millions of virtual universes and seeing which ones come closest to what we see today.

In a new research paper published in Monthly Notices of the Royal Astronomical Society, the team explains how they used a supercomputer system nicknamed the “Universe Machine” to watch billions of (virtual) years of galaxy formation play out before their eyes.

Aug 12, 2019

Dark matter search yields technique for locating heavy metal seams

Posted in categories: cosmology, particle physics, supercomputing

A method for locating seams of gold and other heavy metals is the unlikely spin-off of Swinburne’s involvement in a huge experiment to detect dark matter down a mine in Stawell, Victoria.

Associate Professor Alan Duffy, from Swinburne’s Centre for Astrophysics and Supercomputing and a member of the Sodium iodide with Active Background REjection (SABRE) project, said the experiment was effectively creating an X-ray of the Earth between the mine and the surface.

In the mine, the SABRE experiment seeks to detect particles of dark matter, something no one has conclusively achieved yet. Any signal from dark matter would be minuscule, and so the SABRE team created a phenomenally sensitive detector, which, it turns out, is also sensitive to a host of cosmic particles that can help us locate gold.
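The “X-ray” analogy likely rests on attenuation: the flux of cosmic-ray particles such as muons reaching a deep detector falls off with the mass of rock overhead, so a dense heavy-metal seam casts a measurable shadow. A minimal sketch of that idea, with all numbers invented for illustration:

```python
import math

def muon_flux(depth_m, density):
    """Toy attenuation model: flux decays exponentially with the
    mass of rock overhead (depth x density). The surface flux and
    attenuation length below are illustrative numbers only."""
    surface_flux = 1.0          # muons per unit area-time (normalized)
    atten_length = 2.5e5        # kg/m^2 of overburden per e-fold (toy)
    overburden = depth_m * density
    return surface_flux * math.exp(-overburden / atten_length)

# A detector deep underground sees fewer muons through a dense ore
# body than through ordinary rock along a neighboring line of sight:
print(muon_flux(1000, density=2700))   # plain rock, ~2,700 kg/m^3
print(muon_flux(1000, density=4000))   # denser heavy-metal-rich seam
```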

Aug 9, 2019

Virtual ‘universe machine’ sheds light on galaxy evolution

Posted in categories: cosmology, evolution, supercomputing

How do galaxies such as our Milky Way come into existence? How do they grow and change over time? The science behind galaxy formation has remained a puzzle for decades, but a University of Arizona-led team of scientists is one step closer to finding answers thanks to supercomputer simulations.

Observing real galaxies in space can only provide snapshots in time, so researchers who want to study how galaxies evolve over billions of years have to revert to computer simulations. Traditionally, astronomers have used this approach to invent and test new theories of galaxy formation, one by one. Peter Behroozi, an assistant professor at the UA Steward Observatory, and his team overcame this hurdle by generating millions of different universes on a supercomputer, each of which obeyed different physical theories for how galaxies should form.
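In spirit, the approach is a large parameter scan: simulate each candidate universe under its own physics, score it against observations, and keep the best match. The sketch below uses a single invented star-formation-efficiency parameter and a stand-in one-line “simulation”; none of the function names or numbers come from the team’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_universe(efficiency):
    """Stand-in for a real simulation: maps one toy physics parameter
    to the stellar masses of a population of dark-matter halos."""
    halo_mass = np.logspace(10, 14, 200)     # halo masses (toy values)
    return efficiency * halo_mass ** 0.6     # invented scaling law

# Mock "observation": a universe with true efficiency 0.03 plus scatter.
observed = simulate_universe(0.03) * rng.lognormal(0.0, 0.1, 200)

# Scan thousands of candidate universes and score each against the data.
candidates = rng.uniform(0.001, 0.1, 10_000)
scores = [
    np.mean((np.log(simulate_universe(e)) - np.log(observed)) ** 2)
    for e in candidates
]
best = candidates[int(np.argmin(scores))]
print(f"best-fit efficiency: {best:.4f}")    # recovers roughly 0.03
```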

The findings, published in the Monthly Notices of the Royal Astronomical Society, challenge fundamental ideas about the role dark matter plays in galaxy formation, how galaxies evolve over time and how they give birth to stars.

Page 52 of 82