



An international team of scientists has developed an algorithm that represents a major step toward simulating neural connections in the entire human brain.

The new algorithm, described in an open-access paper published in Frontiers in Neuroinformatics, is intended to allow simulation of the human brain’s 100 billion interconnected neurons on supercomputers. The work involves researchers at the Jülich Research Centre, the Norwegian University of Life Sciences, Aachen University, RIKEN, and the KTH Royal Institute of Technology.

An open-source neural simulation tool. The algorithm was developed using NEST (“neural simulation tool”) — open-source simulation software in widespread use by the neuroscientific community and a core simulator of the European Human Brain Project. With NEST, the behavior of each neuron in the network is represented by a small number of mathematical equations, the researchers explain in an announcement.
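The idea that each neuron reduces to a small number of equations can be illustrated outside of NEST with a minimal leaky integrate-and-fire model in plain Python. This is a generic textbook sketch, not code from the paper, and the parameter values are illustrative:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, forward-Euler integration:
#   dV/dt = (-(V - V_rest) + R * I) / tau
# Illustrative parameters only; real NEST neuron models are more detailed.

def simulate_lif(current=2.0, t_max=100.0, dt=0.1):
    v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0  # membrane potentials (mV)
    tau, resistance = 10.0, 10.0                     # time constant (ms), resistance (MOhm)
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        v += (-(v - v_rest) + resistance * current) * (dt / tau)
        if v >= v_thresh:            # threshold crossing: emit a spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

spike_times = simulate_lif()
print(f"{len(spike_times)} spikes in 100 ms")
```

A whole-brain simulation is, at its core, billions of such small update rules evaluated in parallel; the algorithmic challenge the paper addresses is storing and routing the connections between them, not the per-neuron mathematics.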

An international group of researchers has taken a decisive step towards creating the technology to achieve simulations of brain-scale networks on future supercomputers of the exascale class. The breakthrough, published in Frontiers in Neuroinformatics, allows larger parts of the human brain to be represented using the same amount of computer memory. At the same time, the new algorithm significantly speeds up brain simulations on existing supercomputers.

The human brain is an organ of incredible complexity, composed of 100 billion interconnected nerve cells. However, even with the help of the most powerful supercomputers available, it is currently impossible to simulate the exchange of neuronal signals in networks of this size.

“Since 2014, our software has been able to simulate about one percent of the neurons in the human brain with all their connections,” says Markus Diesmann, Director at the Jülich Institute of Neuroscience and Medicine (INM-6). To achieve this impressive feat, the software requires the entire main memory of petascale supercomputers, such as the K computer in Kobe and JUQUEEN in Jülich.


How far should we integrate human physiology with technology? What do we do with self-aware androids—like Blade Runner’s replicants—and self-aware supercomputers? Or the merging of our brains with them? If Ray Kurzweil’s famous singularity—a future in which the exponential growth of technology turns into a runaway train—becomes a reality, does religion have something to offer in response?


Yes, not only is A.I. potentially taking all of our jobs, but it’s also changing religion.

Brandon Withrow


Scientists and philosophers had long assumed that the world worked by physical laws, and that if you could measure initial conditions accurately enough, those laws would let you predict the future indefinitely. As James Gleick described it in his book Chaos: Making a New Science, this view was very wrong.

“There was always one small compromise, so small that working scientists usually forgot it was there, lurking in a corner of their philosophies like an unpaid bill. Measurements could never be perfect,” he wrote. “Scientists marching under Newton’s banner actually waved another flag that said something like this: Given an approximate knowledge of a system’s initial conditions and an understanding of natural law, one can calculate the approximate behaviour of the system. This assumption lay at the philosophical heart of science.”

“Today we know how wrong this assumption was. The Three Body Problem is now recognized as a classic example of a chaotic system. Like the butterfly that causes a hurricane by flapping its wings, it is exquisitely sensitive to initial conditions. The tiniest tweak can have massive consequences down the line.”


Like the endlessly repeating patterns of chaos theory, the new solutions discovered by the Chinese researchers make for elaborate and weirdly beautiful images when they are plotted in two dimensions. They are unlikely to have ever existed in reality, however. Because of how solar systems form, planets, moons and stars tend to settle into regular orbits on a single plane.

The Chinese researchers credit their discoveries to advancements in computer science and their novel technique called clean numerical simulation, which is a strategy for modelling chaotic systems, in which solutions are reached indirectly by continuous refinement, rather than directly by brute calculation.
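The clean numerical simulation technique itself is not reproduced here, but the sensitivity to initial conditions that makes it necessary is easy to demonstrate with an ordinary integrator. The sketch below is a generic illustration, not the researchers' method: it integrates the planar equal-mass three-body equations twice from a Lagrange-triangle configuration (chosen because it is known to be unstable for equal masses), differing only by a 1e-9 nudge to one coordinate, and measures how far the two runs drift apart.

```python
import math

# Planar three-body problem, equal unit masses, G = 1 (illustrative units).
# Integrated with velocity Verlet (a standard symplectic scheme).

def accelerations(pos):
    acc = [[0.0, 0.0] for _ in range(3)]
    for i in range(3):
        for j in range(3):
            if i == j:
                continue
            dx, dy = pos[j][0] - pos[i][0], pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += dx / r3
            acc[i][1] += dy / r3
    return acc

def integrate(pos, vel, dt=0.001, steps=20000):
    pos = [p[:] for p in pos]
    vel = [v[:] for v in vel]
    acc = accelerations(pos)
    for _ in range(steps):
        for i in range(3):
            pos[i][0] += vel[i][0] * dt + 0.5 * acc[i][0] * dt * dt
            pos[i][1] += vel[i][1] * dt + 0.5 * acc[i][1] * dt * dt
        new_acc = accelerations(pos)
        for i in range(3):
            vel[i][0] += 0.5 * (acc[i][0] + new_acc[i][0]) * dt
            vel[i][1] += 0.5 * (acc[i][1] + new_acc[i][1]) * dt
        acc = new_acc
    return pos

# Equal masses on a unit circle (equilateral triangle), tangential velocities
# chosen for a circular Lagrange orbit -- linearly unstable for equal masses.
v = 3 ** -0.25
angles = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]
start = [[math.cos(a), math.sin(a)] for a in angles]
vels = [[-v * math.sin(a), v * math.cos(a)] for a in angles]

run_a = integrate(start, vels)
nudged = [p[:] for p in start]
nudged[0][0] += 1e-9  # the tiniest tweak to one coordinate
run_b = integrate(nudged, vels)

sep = max(math.hypot(run_a[i][0] - run_b[i][0], run_a[i][1] - run_b[i][1])
          for i in range(3))
print(f"separation after integration: {sep:.3e}")
```

The separation grows far beyond the 1e-9 perturbation within a few orbital periods, which is exactly why brute-force calculation loses track of chaotic orbits and why the refinement strategy described above is needed.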

“In this paper, we numerically obtain 695 families of Newtonian periodic planar collisionless orbits of three-body system with equal mass and zero angular momentum in case of initial conditions with isosceles collinear configuration, including the well-known figure-eight family found by Moore in 1993, the 11 families found by Šuvakov and Dmitrašinović in 2013, and more than 600 new families that have never been reported, to the best of our knowledge,” reads the report by XiaoMing Li and ShiJun Liao.


Aleksandr Noy has big plans for a very small tool. A senior research scientist at Lawrence Livermore National Laboratory, Noy has devoted a significant part of his career to perfecting the liquid alchemy known as desalination—removing salt from seawater. His stock-in-trade is the carbon nanotube. In 2006, Noy had the audacity to embrace a radical theory: Maybe nanotubes—cylinders so tiny, they can be seen only with an electron microscope—could act as desalination filters. It depended on just how wide the tubes were. The opening needed to be big enough to let water molecules flow through but small enough to block the larger salt particles that make seawater undrinkable. Put enough carbon nanotubes together and you potentially have the world’s most efficient machine for making clean water.


When one of the first personal computers, the Altair 8800, came along in 1975, Microsoft was ready with a programming language, Altair BASIC. It wants to be equally prepared when quantum computers go mainstream, so it has unveiled a new programming language and other tools for the futuristic tech at its Ignite conference. You’ll still need to understand qubits and other weird concepts, but by integrating traditional languages like C# and Python, Microsoft will make it easier to do mainstream computing on the complex machines.

Quantum computing is famously difficult to grasp — even IBM’s “Beginner’s Guide” is laughably opaque. In discussing Microsoft’s new initiatives, Bill Gates called the physics “hieroglyphics,” and when asked if he could describe it in one sentence, Satya Nadella said “I don’t think so. I wish I could.”

So, let’s just talk about what it can do, then. By taking advantage of the principles of superposition and entanglement, quantum computers can solve certain types of problems exponentially faster than the best supercomputers. “It would allow scientists to do computations in minutes or hours that would take the lifetime of the universe on even the most advanced classical computers,” Microsoft explains. “That, in turn, would mean that people could find answers to scientific questions previously thought unanswerable.”
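Microsoft’s quantum language itself isn’t shown here, but the superposition and entanglement mentioned above can be sketched with a tiny two-qubit state-vector simulation in plain Python. The gate names and amplitude ordering follow standard textbook conventions, not Microsoft’s API:

```python
import math

# Two-qubit state vector: amplitudes for |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_on_qubit0(s):
    # H on the first qubit puts it into an equal superposition of 0 and 1.
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot_control0(s):
    # CNOT with qubit 0 as control: flips qubit 1 whenever qubit 0 is 1,
    # i.e. swaps the |10> and |11> amplitudes.
    return [s[0], s[1], s[3], s[2]]

bell = cnot_control0(hadamard_on_qubit0(state))
probs = [round(abs(a) ** 2, 3) for a in bell]
print(probs)  # [0.5, 0.0, 0.0, 0.5] -> the entangled Bell state
```

The measurement outcomes 00 and 11 each occur with probability one half and the two qubits are perfectly correlated. The catch, and the reason quantum hardware matters, is that simulating n qubits this way needs 2^n amplitudes: classical memory runs out around a few dozen qubits, while a quantum computer manipulates the superposition directly.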
