Archive for the ‘supercomputing’ category: Page 7

Jan 25, 2024

Chemists use blockchain to simulate more than 4 billion chemical reactions essential to origins of life

Posted by in categories: blockchains, chemistry, cryptocurrencies, finance, mathematics, supercomputing

Cryptocurrency is usually “mined” through the blockchain by asking a computer to solve a complicated mathematical problem in exchange for tokens of cryptocurrency. But in research appearing in the journal Chem, a team of chemists has repurposed this process, asking computers instead to generate the largest network ever created of chemical reactions that may have given rise to prebiotic molecules on early Earth.

This work indicates that at least some primitive forms of metabolism might have emerged without the involvement of enzymes, and it demonstrates the potential of blockchain to solve problems outside the financial sector that would otherwise require expensive, hard-to-access supercomputers.

“At this point we can say we exhaustively looked for every possible combination of chemical reactivity that scientists believe to have been operative on primitive Earth,” says senior author Bartosz A. Grzybowski of the Korea Institute for Basic Science and the Polish Academy of Sciences.

Jan 20, 2024

Supercomputer uses machine learning to set new speed record

Posted by in categories: biotech/medical, genetics, robotics/AI, space travel, supercomputing

Give people a barrier, and at some point they are bound to smash through. Chuck Yeager broke the sound barrier in 1947. Yuri Gagarin burst into orbit for the first manned spaceflight in 1961. The Human Genome Project finished cracking the genetic code in 2003. And we can add one more barrier to humanity’s trophy case: the exascale barrier.

The exascale barrier represents the challenge of achieving exascale-level computing, which has long been considered the benchmark for high performance. To reach that level, however, a computer needs to perform a quintillion calculations per second. You can think of a quintillion as a million trillion, a billion billion, or a million million millions. Whichever you choose, it’s an incomprehensibly large number of calculations.
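The three phrasings of "a quintillion" above all name the same number; a minimal sketch in Python checking the arithmetic (the figures come directly from the paragraph above):

```python
# An exaflop machine performs 10**18 calculations per second.
exa = 10**18

million = 10**6
billion = 10**9
trillion = 10**12

# Each phrasing from the text denotes the same quantity.
assert million * trillion == exa            # "a million trillion"
assert billion * billion == exa             # "a billion billion"
assert million * million * million == exa   # "a million million millions"

print(f"{exa:,}")  # 1,000,000,000,000,000,000
```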

Continue reading “Supercomputer uses machine learning to set new speed record” »

Jan 17, 2024

Google Scientists Discovered 380,000 New Materials Using Artificial Intelligence

Posted by in categories: economics, robotics/AI, solar power, supercomputing, sustainability

New advancements in technology frequently necessitate the development of novel materials – and thanks to supercomputers and advanced simulations, researchers can bypass the time-consuming and often inefficient process of trial-and-error.

The Materials Project, an open-access database founded at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) in 2011, computes the properties of both known and predicted materials. Researchers can focus on promising materials for future technologies – think lighter alloys that improve fuel economy in cars, more efficient solar cells to boost renewable energy, or faster transistors for the next generation of computers.

Jan 11, 2024

Artificial Intelligence Electricity Use Is In The Crosshairs

Posted by in categories: biotech/medical, cryptocurrencies, robotics/AI, supercomputing

Artificial intelligence has progressed from sci-fi fantasy to mainstream reality. AI now powers online tools from search engines to voice assistants, and it is used in everything from medical imaging analysis to autonomous vehicles. But the advance of AI will soon collide with another pressing issue: energy consumption.

Much like cryptocurrencies today, AI risks becoming a target for criticism and regulation based on its high electricity appetite. Partisans are forming into camps, with AI optimists extolling continued progress through more compute power, while pessimists are beginning to portray AI power usage as wasteful and even dangerous. Attacks echo those leveled at crypto mining in recent years. Undoubtedly, there will be further efforts to choke off AI innovation by cutting its energy supply.

The pessimists raise some valid points. Developing ever-more capable AI does require vast computing resources. For example, the compute used to train OpenAI’s GPT-3 reportedly equaled 800 petaflops of processing power, on par with the 20 most powerful supercomputers in the world combined. Similarly, ChatGPT receives on the order of hundreds of millions of queries each day. Estimates suggest that responding to all these queries might require around 1 GWh of electricity daily, enough to cover the daily electricity consumption of about 33,000 U.S. households. Demand is expected to increase further in the future.
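The household figure above is a back-of-the-envelope estimate that can be sanity-checked; a sketch using the numbers quoted in the paragraph (the ~29 kWh/day average for a U.S. household is my own assumption, roughly in line with published utility averages, not a figure from the article):

```python
# Estimated daily electricity for ChatGPT queries, from the text above.
daily_gwh = 1.0
daily_wh = daily_gwh * 1e9      # 1 GWh expressed in watt-hours

households = 33_000
per_household_kwh = daily_wh / households / 1_000   # implied kWh per household per day

# Assumption: an average U.S. household uses roughly 29 kWh of
# electricity per day, so ~30 kWh is consistent with the claim.
print(round(per_household_kwh, 1))  # 30.3
```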

Jan 11, 2024

Gravitas | Artificial Intelligence discovers material to cut Lithium use | WION

Posted by in categories: robotics/AI, supercomputing

In a significant breakthrough, Microsoft and the Pacific Northwest National Laboratory have utilised artificial intelligence and supercomputing to discover a new material that could dramatically reduce lithium use in batteries by up to 70%. This discovery, potentially revolutionising the battery industry, was achieved by narrowing down from 32 million inorganic materials to 18 candidates in just a week, a process that could have taken over 20 years traditionally.


Continue reading “Gravitas | Artificial Intelligence discovers material to cut Lithium use | WION” »

Jan 10, 2024

Quantinuum provides RIKEN large-scale hybrid quantum–supercomputing platform

Posted by in categories: quantum physics, supercomputing

Quantinuum, the world’s largest integrated quantum computing company, and RIKEN, Japan’s largest comprehensive research institution and home to a high-performance computing (HPC) center, have announced an agreement…

Jan 10, 2024

New material found by AI could reduce lithium use in batteries

Posted by in categories: robotics/AI, supercomputing

Microsoft said AI and supercomputing were used to synthesise an entirely new material.

Jan 6, 2024

Lawrence Berkeley Lab Researchers Optimize Higher Density Copper Doping to Make LK99 Variant into a Superconductor

Posted by in categories: materials, supercomputing

Lawrence Berkeley National Lab researchers use computational methods to describe an approach for optimizing the LK99 material as a superconductor.

Some will say: hey, why is Nextbigfuture still covering LK99? Didn’t some angry scientists say that LK99 was not a superconductor? I have been covering science for over 20 years, and there are a lot of angry scientists who believe many things will not work. Scientists who go into experiments looking to debunk something will not be the ones who figure out how to make it work.

Lawrence Berkeley National Lab researchers spent time on supercomputers trying to figure out how to make LK99 work. Their computational work is showing promise.

Jan 4, 2024

Cyborg computer combining AI and human brain cells really works

Posted by in categories: biological, cyborgs, robotics/AI, supercomputing

A new biohybrid computer combining a “brain organoid” and a traditional AI was able to perform a speech recognition task with 78% accuracy — demonstrating the potential for human biology to one day boost our computing capabilities.

The background: The human brain is the most energy-efficient “computer” on Earth — while a supercomputer needs 20 megawatts of power to process more than a quintillion calculations per second, your brain can do the equivalent with just 20 watts (a megawatt is 1 million watts).

This has given researchers the idea to try boosting computers by combining them with a three-dimensional clump of lab-grown human brain cells, known as a brain organoid.
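The efficiency gap in the comparison above works out to roughly six orders of magnitude; a sketch with the figures given in the text (treating the brain’s “equivalent” throughput as a quintillion operations per second is only a loose analogy, not a measured value):

```python
# Figures from the comparison above.
ops_per_second = 10**18       # "more than a quintillion calculations per second"
supercomputer_watts = 20e6    # 20 megawatts
brain_watts = 20.0            # 20 watts

super_eff = ops_per_second / supercomputer_watts  # ~5e10 operations per joule
brain_eff = ops_per_second / brain_watts          # ~5e16 operations per joule (loose analogy)

# The brain comes out about a million times more energy-efficient.
print(f"{brain_eff / super_eff:.0e}")  # 1e+06
```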

Dec 31, 2023

Google Addresses the Mysteries of Its Hypercomputer

Posted by in categories: quantum physics, robotics/AI, supercomputing

When Google launched its Hypercomputer earlier this month (December 2023), the first reaction was, “Say what?” It turns out that the Hypercomputer is Google’s take on a modular supercomputer with a healthy dose of its homegrown TPU v5p AI accelerators, which were also announced this month.

The modular design also allows workloads to be sliced up between TPUs and GPUs, with Google’s software tools doing the provisioning and orchestration in the background. Theoretically, if Google were to add a quantum computer to the Google Cloud, it could also be plugged into the Hypercomputer.

While the Hypercomputer was advertised as an AI supercomputer, the good news is that the system also runs scientific computing applications.
