
Circa 2018


Since the time of Hippocrates and Herophilus, scientists have placed the location of the mind, emotions and intelligence in the brain. For centuries, this theory was explored through anatomical dissection, as the early neuroscientists named and proposed functions for the various sections of this unusual organ. It wasn’t until the late 19th century that Camillo Golgi and Santiago Ramón y Cajal developed the methods to look deeper into the brain, using a silver stain to detect the long, stringy cells now known as neurons and their connections, called synapses.

Today, neuroanatomy involves the most powerful microscopes and computers on the planet. Viewing synapses, which are only nanometers in length, requires an electron microscope imaging a slice of brain thousands of times thinner than a sheet of paper. To map an entire human brain would require 300,000 of these images, and even reconstructing a small three-dimensional brain region from these snapshots requires roughly the same supercomputing power it takes to run an astronomy simulation of the universe.

Fortunately, both of these resources exist at Argonne, where, in 2015, Kasthuri was the first neuroscientist ever hired by the U.S. Department of Energy laboratory. Peter Littlewood, the former director of Argonne who brought him in, recognized that connectome research was going to be one of the great big data challenges of the coming decades, one that UChicago and Argonne were perfectly poised to tackle.
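To get a feel for why connectome reconstruction counts as a big data challenge, here is a rough back-of-envelope estimate of the raw imagery a whole human brain would generate. The brain volume, pixel size and slice thickness below are assumed illustrative values, not figures from the article:

```python
# Back-of-envelope estimate of the raw data volume for whole-brain electron
# microscopy. All numbers below are assumed for illustration, not taken from
# the article.

BRAIN_VOLUME_CM3 = 1200       # assumed adult human brain volume, ~1,200 cm^3
PIXEL_XY_NM = 4               # assumed in-plane pixel size
SECTION_THICKNESS_NM = 30     # assumed slice thickness
BYTES_PER_VOXEL = 1           # assumed 8-bit grayscale

brain_volume_nm3 = BRAIN_VOLUME_CM3 * (1e7 ** 3)  # 1 cm = 1e7 nm
voxel_volume_nm3 = PIXEL_XY_NM * PIXEL_XY_NM * SECTION_THICKNESS_NM

total_voxels = brain_volume_nm3 / voxel_volume_nm3
total_bytes = total_voxels * BYTES_PER_VOXEL

print(f"voxels: {total_voxels:.1e}")
print(f"raw data: ~{total_bytes / 1e21:.1f} zettabytes")
```

Under these assumptions the raw data lands in the zettabyte range, which is why the reconstruction work leans on supercomputer-scale storage and compute.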

In 1973, researchers at the Massachusetts Institute of Technology developed a program to model global sustainability, running it on one of the most powerful supercomputers of the time. Instead, it predicted that our civilization would end by 2040.

Recently, that prediction reappeared in the Australian media, making its way to the rest of the world.
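Programs of this kind are system-dynamics models: a few coupled quantities such as population, resources and pollution are stepped forward year by year and fed back into one another. As a minimal sketch of that structure only, and emphatically not the MIT model, a two-variable toy with invented coefficients might look like this:

```python
# A toy system-dynamics sketch in the spirit of 1970s "world models".
# This is NOT the MIT program: both variables and every coefficient below
# are invented purely to show how coupled stocks and flows are stepped
# forward in time.

def step(pop, resources, dt=1.0):
    """Advance the two-variable toy model by one year (explicit Euler)."""
    growth = 0.02 * pop * resources / (resources + 1.0)  # growth slows as resources thin out
    deaths = 0.01 * pop
    depletion = 0.02 * pop                               # consumption scales with population
    return pop + dt * (growth - deaths), max(resources - dt * depletion, 0.0)

pop, resources = 1.0, 10.0  # arbitrary normalized starting values
for year in range(1900, 2101):
    if year % 50 == 0:
        print(f"{year}: population={pop:.2f}, resources={resources:.2f}")
    pop, resources = step(pop, resources)
```

Changing those invented coefficients changes when, or whether, the population curve turns over; the published world models did the same thing with far more variables and empirically fitted parameters.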

Now, researchers at the Earth Dynamics Research Group and the School of Earth and Planetary Sciences at Curtin University have used a supercomputer to forecast the likely effects of the movement of the giant tectonic plates.

The formation of continents

Over the past two billion years, the Earth’s continents have collided to form a supercontinent on multiple occasions. Called the supercontinent cycle, this occurs every 600 million years and brings all the continents of the world together.

Once the first artificial superintelligence is created, it will help us recursively improve ourselves, and then the posthuman millennium will begin.


Thinking this will prevent war, the US government gives an impenetrable supercomputer total control over launching nuclear missiles. But what the computer does with the power is unimaginable to its creators.

http://www.imdb.com/title/tt0064177/combined

Tesla has unveiled the latest version of its Dojo supercomputer, and it’s apparently so powerful that it tripped the power grid in Palo Alto.

Dojo is Tesla’s own custom supercomputer platform, built from the ground up for AI machine learning and, more specifically, for training on the video data coming from its fleet of vehicles.

The automaker already has a large NVIDIA GPU-based supercomputer that is one of the most powerful in the world, but the new custom-built Dojo computer uses chips and an entire infrastructure designed by Tesla.

New Curtin University-led research has found that the world’s next supercontinent, Amasia, will most likely form when the Pacific Ocean closes in 200 to 300 million years.

Published in National Science Review, the research team used a supercomputer to simulate how a supercontinent forms and found that, because the Earth has been cooling for billions of years, the thickness and strength of the plates under the oceans decrease with time, making it difficult for the next supercontinent to assemble by closing the “young” oceans, such as the Atlantic or Indian oceans.
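For a sense of scale, the quoted closure window implies average convergence rates comparable to the few centimetres per year at which plates move today. A minimal back-of-envelope sketch, assuming a round 10,000 km for the width of the Pacific basin (an illustrative figure, not one taken from the study):

```python
# Rough arithmetic behind the Amasia scenario: if the Pacific were to close
# in 200-300 million years, what average convergence rate would that imply?
# The basin width below is an assumed round figure, not a value from the paper.

PACIFIC_WIDTH_KM = 10_000  # assumed representative width of the Pacific basin

for closure_myr in (200, 300):
    rate_cm_per_yr = PACIFIC_WIDTH_KM * 1e5 / (closure_myr * 1e6)  # km -> cm, Myr -> yr
    print(f"closure in {closure_myr} Myr -> ~{rate_cm_per_yr:.1f} cm/yr average convergence")
```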

Lead author Dr. Chuan Huang, from Curtin’s Earth Dynamics Research Group and the School of Earth and Planetary Sciences, said the new findings were significant and provided insights into what would happen to Earth in the next 200 million years.

My head is currently swirling and whirling with a cacophony of conceptions. This maelstrom of meditations was triggered by NVIDIA’s recent announcement of their Jetson Orin Nano system-on-modules, which deliver up to 80x the performance of the prior generation and are, in their own words, “setting a new standard for entry-level edge AI and robotics.”

One of my contemplations centers on their use of the “entry level” qualifier in this context. When I was coming up, this bodacious beauty would have qualified as the biggest, baddest supercomputer on the planet.

I’m being serious. In 1975, which was the year I entered university, Cray Research announced their Cray-1 Supercomputer. Conceived by Seymour Cray, this was the first computer to successfully implement a vector processing architecture.
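To put the comparison in perspective: NVIDIA quotes the Jetson Orin Nano 8 GB module at up to 40 TOPS of sparse INT8 compute, while the Cray-1’s commonly cited peak was around 160 MFLOPS. Those figures measure different kinds of arithmetic, so the sketch below is only a loose order-of-magnitude comparison, not an apples-to-apples benchmark:

```python
# Loose order-of-magnitude comparison of the Jetson Orin Nano and the Cray-1.
# The units differ (sparse INT8 TOPS vs. 64-bit MFLOPS), so treat the ratio
# as back-of-envelope perspective rather than a real benchmark.

ORIN_NANO_OPS = 40e12  # ~40 TOPS, NVIDIA's quoted INT8 figure for the 8 GB module
CRAY_1_FLOPS = 160e6   # ~160 MFLOPS, the Cray-1's commonly cited peak

print(f"Jetson Orin Nano / Cray-1 ratio: ~{ORIN_NANO_OPS / CRAY_1_FLOPS:,.0f}x")
```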

A new field of science has been emerging at the intersection of neuroscience and high-performance computing — this is the takeaway from the 2022 BrainComp conference, which took place in Cetraro, Italy from the 19th to the 22nd of September. The meeting, which featured international experts in brain mapping, machine learning, simulation, research infrastructures, neuro-derived hardware, neuroethics and more, strengthened the current collaborations in this emerging field and forged new ones.

Now in its 5th edition, BrainComp first started in 2013 and is jointly organised by the Human Brain Project and the EBRAINS digital research infrastructure, the University of Calabria in Italy, the Heinrich Heine University of Düsseldorf and the Forschungszentrum Jülich in Germany. It is attended by researchers from inside and outside the Human Brain Project. This year was dedicated to the computational challenges of brain connectivity. The brain is the most complex system in the observable universe due to the tight connections between areas, down to the wiring of individual neurons: decoding this complexity through neuroscientific and computing advances benefits both fields.

Hosted by the organising committee of Katrin Amunts, Scientific Research Director of the HBP; Thomas Lippert, Leader of EBRAINS Computing Services at the Jülich Supercomputing Centre; and Lucio Grandinetti from the University of Calabria, the sessions covered a variety of topics over four days.