“The projects running on Aurora represent some of the most ambitious and innovative science happening today,” said Katherine Riley, ALCF director of science. “From modeling extremely complex physical systems to processing huge amounts of data, Aurora will accelerate discoveries that deepen our understanding of the world around us.”
On the hardware side, Aurora clearly impresses. The supercomputer comprises 166 racks, each holding 64 blades, for a total of 10,624 blades. Each blade contains two Xeon Max processors with 64 GB of HBM2E memory onboard and six Intel Data Center Max ‘Ponte Vecchio’ GPUs, all cooled by a specialized liquid-cooling system.
In total, Aurora has 21,248 CPUs with over 1.1 million high-performance x86 cores, 19.9 PB of DDR5 memory, and 1.36 PB of HBM2E memory attached to the CPUs. It also features 63,744 GPUs optimized for AI and HPC equipped with 8.16 PB of HBM2E memory. Aurora uses 1,024 nodes with solid-state drives for storage, offering 220 PB of total capacity and 31 TB/s of bandwidth. The machine relies on HPE’s Shasta supercomputer architecture with Slingshot interconnects.
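As a quick sanity check, the headline totals follow directly from the per-blade counts above; a minimal sketch (the 128 GB of HBM2E per GPU is an assumption, chosen to be consistent with the 8.16 PB aggregate):

```python
# Aurora configuration as described above
racks = 166
blades_per_rack = 64
cpus_per_blade = 2          # Xeon Max processors
gpus_per_blade = 6          # Data Center GPU Max "Ponte Vecchio"
hbm_per_cpu_gb = 64         # HBM2E on each Xeon Max
hbm_per_gpu_gb = 128        # assumed; matches the 8.16 PB aggregate

blades = racks * blades_per_rack            # 10,624 blades
cpus = blades * cpus_per_blade              # 21,248 CPUs
gpus = blades * gpus_per_blade              # 63,744 GPUs

cpu_hbm_pb = cpus * hbm_per_cpu_gb / 1e6    # ~1.36 PB attached to CPUs
gpu_hbm_pb = gpus * hbm_per_gpu_gb / 1e6    # ~8.16 PB attached to GPUs

print(blades, cpus, gpus, round(cpu_hbm_pb, 2), round(gpu_hbm_pb, 2))
```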
We often discuss cybernetics, genetic engineering, artificial intelligence, and hybrids of the three, but what truly is synthetic life? And what would it be like?
Credits: Synthetic Life. Science & Futurism with Isaac Arthur, Episode 333a, March 13, 2022. Written, produced & narrated by Isaac Arthur. Editors: David McFarlane, Jason Burbank, Jerry Guern.
MIT scientists are building ElectroVoxels, small, smart, self-assembling robots designed for space.
It’s programmable matter, or infinitely recyclable large-scale 3D printing if you will, and it could be the future of robotics and machinery in space. In this TechFirst, I chat with MIT PhD student Martin Nisser.
In the future, technology may help us enjoy prosperity beyond our dreams, with robots manufacturing our goods and attending to all our needs but one: our need for purpose.
Daniel C. Dennett is the author of Intuition Pumps and Other Tools for Thinking, Breaking the Spell, Freedom Evolves, and Darwin’s Dangerous Idea and is University Professor and Austin B. Fletcher Professor of Philosophy, and Co-Director of the Center for Cognitive Studies at Tufts University. He lives with his wife in North Andover, Massachusetts, and has a daughter, a son, and a grandson. He was born in Boston in 1942, the son of a historian by the same name, and received his B.A. in philosophy from Harvard in 1963. He then went to Oxford to work with Gilbert Ryle, under whose supervision he completed the D.Phil. in philosophy in 1965. He taught at U.C. Irvine from 1965 to 1971, when he moved to Tufts, where he has taught ever since, aside from periods visiting at Harvard, Pittsburgh, Oxford, and the École Normale Supérieure in Paris.
His first book, Content and Consciousness, appeared in 1969, followed by Brainstorms (1978), Elbow Room (1984), The Intentional Stance (1987), Consciousness Explained (1991), Darwin’s Dangerous Idea (1995), Kinds of Minds (1996), and Brainchildren: A Collection of Essays 1984–1996. Sweet Dreams: Philosophical Obstacles to a Science of Consciousness was published in 2005. He co-edited The Mind’s I with Douglas Hofstadter in 1981, and he is the author of over three hundred scholarly articles on various aspects of the mind, published in journals ranging from Artificial Intelligence and Behavioral and Brain Sciences to Poetics Today and the Journal of Aesthetics and Art Criticism.
Dennett gave the John Locke Lectures at Oxford in 1983, the Gavin David Young Lectures at Adelaide, Australia, in 1985, and the Tanner Lecture at Michigan in 1986, among many others. He has received two Guggenheim Fellowships, a Fulbright Fellowship, and a Fellowship at the Center for Advanced Studies in Behavioral Science. He was elected to the American Academy of Arts and Sciences in 1987.
He was the Co-founder (in 1985) and Co-director of the Curricular Software Studio at Tufts, and has helped to design museum exhibits on computers for the Smithsonian Institution, the Museum of Science in Boston, and the Computer Museum in Boston.
In today’s AI news, ElevenLabs said on Thursday it has raised $180 million in a new funding round that triples the voice-cloning AI startup’s valuation to $3.3 billion. The Series C funding round was co-led by Andreessen Horowitz and Iconiq Growth, with participation from additional new investors.
On Thursday, OpenAI announced that it is deepening its ties with the US government through a partnership with the National Laboratories and expects to use AI to “supercharge” research across a wide range of fields to better serve the public.
“This is the beginning of a new era, where AI will advance science, strengthen national security, and support US government initiatives,” OpenAI said.
In other advancements, Cerebras Systems announced today it will host DeepSeek’s breakthrough R1 artificial intelligence model on U.S. servers, promising speeds up to 57 times faster than GPU-based solutions while keeping sensitive data within American borders. The move comes amid growing concerns about China’s rapid AI advancement and data privacy.
And, the US Copyright Office issued AI guidance this week that declared no laws need to be clarified when it comes to protecting authorship rights of humans producing AI-assisted works. “Questions of copyrightability and AI can be resolved pursuant to existing law, without the need for legislative change,” the Copyright Office said.
In videos, let’s bust some early myths about DeepSeek. In episode 40 of Mixture of Experts, join host Tim Hwang along with experts Aaron Baughman, Chris Hay and Kate Soule. Last week, we covered the release of DeepSeek-R1; now that the entire world is up to speed, let’s separate the facts from the hype. Next, what is model distillation and why does it matter for competition in AI?
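For context, model distillation generally means training a small “student” model to match the softened output distribution of a larger “teacher,” rather than only the hard labels. A minimal illustrative sketch of the core loss (the logits here are hypothetical, and this is not DeepSeek’s actual training code):

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities, optionally softened by a temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the student's.

    Minimizing this pushes the student toward the teacher's full output
    distribution, not just its top-1 prediction.
    """
    p = softmax(teacher_logits, temperature)   # teacher targets
    q = softmax(student_logits, temperature)   # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical logits over a 3-class output
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
print(round(distillation_loss(teacher, student), 4))
```

The temperature softens both distributions so the student also learns from the teacher’s relative confidence in wrong answers, which is where much of the “dark knowledge” in distillation lives.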
Then, recorded at TEDAI Vienna, AI researcher Youssef Nader and digital archaeologist Julian Schilliger share how they used AI to virtually “unroll” and decode the Herculaneum scrolls, burnt and buried by the eruption of Mount Vesuvius nearly 2,000 years ago. Learn how AI could help decipher a range of artifacts, revealing clues about the mysteries and achievements of the ancient world. And Pika’s native 1080p AI video can create hyper-realistic animals: Jerrod Lew is back with another demo of Pika 2.1 generating animals from text-to-video prompts.