
China Telecom claims it has built the country’s first supercomputer constructed entirely with Chinese-made components and technology (via ITHome). Based in Wuhan, the Central Intelligent Computing Center supercomputer is reportedly built for AI and can train large language models (LLMs) with trillions of parameters. Although China has built supercomputers with domestic hardware and software before, going entirely domestic is a new milestone for the country’s tech industry.

Exact details on the Central Intelligent Computing Center are scarce. What’s clear so far: the supercomputer is purportedly made with only Chinese parts; it can train AI models with trillions of parameters; and it uses liquid cooling. Exactly how fast it is remains unclear. A five-exaflop figure is mentioned in ITHome’s report, but that number appears to refer to the total computational power of China Telecom’s supercomputers, not this system alone.

“The memory requirements for PRIYA simulations are so big you cannot put them on anything other than a supercomputer,” said Simeon Bird, an astrophysicist at the University of California, Riverside, who leads the PRIYA project.

TACC awarded Bird a Leadership Resource Allocation on the Frontera supercomputer. Additionally, analysis computations were performed using the resources of the UC Riverside High-Performance Computer Cluster.

The PRIYA simulations on Frontera are some of the largest cosmological simulations yet made, needing over 100,000 core-hours to simulate a system of 3072³ (about 29 billion) particles in a ‘box’ 120 megaparsecs on a side, or about 391 million light-years across. PRIYA simulations consumed over 600,000 node hours on Frontera.
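As a quick sanity check on those figures, here is a short Python sketch; the only constant assumed beyond the article’s numbers is the standard conversion of roughly 3.262 million light-years per megaparsec:

```python
# Back-of-envelope check of the PRIYA figures quoted above.
MPC_TO_LY = 3.262e6  # light-years per megaparsec (assumed standard conversion)

particles = 3072 ** 3            # particle grid dimension, cubed
box_ly = 120 * MPC_TO_LY         # 120 Mpc box edge in light-years

print(f"{particles:,} particles")      # 28,991,029,248 -> about 29 billion
print(f"{box_ly:,.0f} light-years")    # ~391,440,000 -> about 391 million
```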

Tesla is gearing up to build its next-generation Dojo supercomputer at its Gigafactory in Buffalo, New York, as part of a $500 million investment announced by the state’s governor on Friday.

The Dojo supercomputer is designed to process massive amounts of data from Tesla’s vehicles and train its artificial intelligence (AI) systems for autonomous driving and other applications. It is expected to become one of the most powerful computing clusters in the world, surpassing systems built on hardware from NVIDIA, the current market leader.

Cryptocurrency is usually “mined” through the blockchain by asking a computer to solve a complicated mathematical problem in exchange for tokens of cryptocurrency. But in research appearing in the journal Chem, a team of chemists has repurposed this process, instead asking computers to generate the largest network ever created of chemical reactions that may have given rise to prebiotic molecules on early Earth.

This work indicates that at least some primitive forms of metabolism might have emerged without the involvement of enzymes, and it shows the potential of using blockchain to solve problems outside the financial sector that would otherwise require expensive, hard-to-access supercomputers.

“At this point we can say we exhaustively looked for every possible combination of chemical reactivity that scientists believe to have been operative on primitive Earth,” says senior author Bartosz A. Grzybowski of the Korea Institute for Basic Science and the Polish Academy of Sciences.

Give people a barrier, and at some point they are bound to smash through. Chuck Yeager broke the sound barrier in 1947. Yuri Gagarin burst into orbit for the first manned spaceflight in 1961. The Human Genome Project finished cracking the genetic code in 2003. And we can add one more barrier to humanity’s trophy case: the exascale barrier.

The exascale barrier represents the challenge of achieving exascale-level computing, long considered the benchmark for high performance. To reach that level, a computer needs to perform a quintillion (10^18) calculations per second, or one exaflop. You can think of a quintillion as a million trillion, a billion billion, or a million million millions. Whichever you choose, it’s an incomprehensibly large number of calculations.

On May 27, 2022, Frontier, a supercomputer built by the Department of Energy’s Oak Ridge National Laboratory, managed the feat. It performed 1.1 quintillion calculations per second to become the fastest computer in the world.
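For a sense of scale, here is a small Python sketch of that arithmetic; the only figure taken from the reporting is Frontier’s 1.1 quintillion calculations per second:

```python
# One exaflop = a quintillion (10**18) calculations per second.
quintillion = 10 ** 18
assert quintillion == 1_000_000 * 1_000_000_000_000   # a million trillion
assert quintillion == 1_000_000_000 ** 2              # a billion billion
assert quintillion == 1_000_000 ** 3                  # a million million millions

frontier_rate = 1.1 * quintillion   # Frontier's reported calculations per second
seconds_per_year = 60 * 60 * 24 * 365.25
# At one calculation per second, a person would need roughly 35 billion years
# to match a single second of Frontier's output.
print(f"{frontier_rate / seconds_per_year:.1e} years")
```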

New advancements in technology frequently necessitate the development of novel materials, and thanks to supercomputers and advanced simulations, researchers can bypass the time-consuming and often inefficient process of trial and error.

The Materials Project, an open-access database founded at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) in 2011, computes the properties of both known and predicted materials. Researchers can focus on promising materials for future technologies – think lighter alloys that improve fuel economy in cars, more efficient solar cells to boost renewable energy, or faster transistors for the next generation of computers.
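As an illustration of how researchers typically pull data from the database, here is a minimal sketch using pymatgen’s MPRester client. The API key is a placeholder you would obtain from materialsproject.org, and since the Materials Project client interface has changed across releases, treat the exact calls as an assumption rather than the definitive API:

```python
# Minimal sketch of querying the Materials Project via pymatgen's MPRester.
# NOTE: "YOUR_API_KEY" is a placeholder; the client interface has varied
# across releases, so this is illustrative rather than definitive.
from pymatgen.ext.matproj import MPRester

with MPRester("YOUR_API_KEY") as mpr:
    # mp-149 is the Materials Project ID for silicon.
    structure = mpr.get_structure_by_material_id("mp-149")
    print(structure.composition.reduced_formula)  # expected: "Si"
    print(structure.lattice.abc)                  # lattice parameters in angstroms
```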

Artificial intelligence has progressed from sci-fi fantasy to mainstream reality. AI now powers online tools from search engines to voice assistants, and it is used in everything from medical imaging analysis to autonomous vehicles. But the advance of AI will soon collide with another pressing issue: energy consumption.

Much like cryptocurrencies today, AI risks becoming a target for criticism and regulation based on its high electricity appetite. Partisans are forming into camps, with AI optimists extolling continued progress through more compute power, while pessimists are beginning to portray AI power usage as wasteful and even dangerous. Attacks echo those leveled at crypto mining in recent years. Undoubtedly, there will be further efforts to choke off AI innovation by cutting its energy supply.

The pessimists raise some valid points. Developing ever-more capable AI does require vast computing resources. For example, the amount of compute used to train OpenAI’s GPT-3 reportedly equaled 800 petaflops of processing power, on par with the 20 most powerful supercomputers in the world combined. Similarly, ChatGPT receives somewhere on the order of hundreds of millions of queries each day. Estimates suggest that the electricity required to answer all these queries might be around 1 GWh daily, equivalent to the daily energy consumption of about 33,000 U.S. households. Demand is expected to increase further in the future.
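That household figure is easy to sanity-check in Python, assuming the commonly cited average of roughly 30 kWh of electricity per U.S. household per day (the only number here not taken from the article):

```python
# Sanity check: does 1 GWh/day correspond to ~33,000 U.S. households?
KWH_PER_GWH = 1_000_000          # 1 GWh is one million kWh
HOUSEHOLD_KWH_PER_DAY = 30       # assumed average U.S. household daily usage

households = (1 * KWH_PER_GWH) / HOUSEHOLD_KWH_PER_DAY
print(f"~{households:,.0f} households")   # ~33,333, consistent with ~33,000
```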

In a significant breakthrough, Microsoft and the Pacific Northwest National Laboratory have utilised artificial intelligence and supercomputing to discover a new material that could dramatically reduce lithium use in batteries by up to 70%. This discovery, which could potentially revolutionise the battery industry, was achieved by narrowing a pool of 32 million inorganic materials down to 18 candidates in just a week, a screening process that could traditionally have taken over 20 years.
