
Johns Hopkins University researchers are the first to glimpse the human brain making a purely voluntary decision to act.

Unlike most studies, in which scientists watch as people respond to cues or commands, Johns Hopkins researchers found a way to observe people’s brains as they made choices entirely on their own. The findings, which pinpoint the parts of the brain involved in choice and action, are now online and due to appear in a special October issue of the journal Attention, Perception, & Psychophysics.

“How do we peek into people’s brains and find out how we make choices entirely on our own?” asked Susan Courtney, a professor of psychological and brain sciences. “What parts of the brain are involved in free choice?”

Read more

I like this feature on QC.


If you have trouble wrapping your mind around quantum physics, don’t worry — it’s even hard for supercomputers. The solution, according to researchers from Google, Harvard, Lawrence Berkeley National Laboratory and others? Why, use a quantum computer, of course. The team accurately predicted chemical reaction rates using a supercooled quantum circuit, a result that could lead to improved solar cells, batteries, flexible electronics and much more.

Chemical reactions are inherently quantum themselves — the team actually used a quote from Richard Feynman saying “nature isn’t classical, dammit.” The problem is that “molecular systems form highly entangled quantum superposition states, which require exponentially many classical computing resources in order to represent to sufficiently high precision,” according to the Google Research blog. Computing the lowest energy state for propane, a relatively simple molecule, takes around ten days, for instance. That energy is required in order to compute the reaction rate.
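The “exponentially many classical computing resources” point can be made concrete with a little arithmetic: representing an n-qubit state exactly means storing 2^n complex amplitudes. A quick back-of-the-envelope sketch (16 bytes per double-precision complex amplitude is an assumption about the representation):

```python
# An n-qubit state vector holds 2**n complex amplitudes; at 16 bytes per
# complex128 amplitude, the memory needed grows exponentially with size.
for n in (10, 30, 50):
    amps = 2 ** n
    gib = amps * 16 / 2 ** 30
    print(f"{n} qubits: {amps:.3e} amplitudes, {gib:.3e} GiB")
```

Thirty qubits already demand 16 GiB just to hold the state, and fifty qubits push past the petabyte scale — which is why even modestly sized molecules strain classical machines.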

That’s where the “Xmon” supercooled qubit quantum computing circuit (shown above) comes in. The device, known as a “variational quantum eigensolver” (VQE), is the quantum equivalent of a classical neural network. The difference is that you train a classical neural network (like Google’s DeepMind AI) to model classical data, and train the VQE to model quantum data. “The quantum advantage of VQE is that quantum bits can efficiently represent the molecular wave function, whereas exponentially many classical bits would be required.”
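To make the neural-network analogy concrete, here is a minimal classical sketch of the variational idea behind VQE: a classical optimizer tunes the parameters of a trial state to minimize the energy expectation ⟨ψ(θ)|H|ψ(θ)⟩. The 2×2 Hamiltonian and one-parameter ansatz below are toy assumptions of mine; on real hardware the energy would be estimated by measuring a qubit circuit, not by matrix algebra.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy Hermitian matrix standing in for a molecular Hamiltonian
# (real VQE runs use many-qubit operators).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> for a one-parameter ansatz."""
    psi = np.array([np.cos(theta), np.sin(theta)])  # normalized trial state
    return psi @ H @ psi

# The classical optimizer proposes parameters; on quantum hardware, the
# energy evaluation (not the optimization) is what runs on the chip.
result = minimize_scalar(energy, bounds=(0, np.pi), method="bounded")

exact_ground = np.linalg.eigvalsh(H)[0]
print(f"VQE estimate: {result.fun:.4f}")   # the two values should agree
print(f"Exact ground: {exact_ground:.4f}")  # to about four decimals
```

For this toy problem the ansatz can represent the exact ground state, so the variational estimate converges to the true lowest eigenvalue; the hard part at molecular scale is that only a quantum device can hold the trial wave function efficiently.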

Read more

Although BMI (brain–machine interface) technology is nothing new, I never get tired of highlighting it.


Now the group has come up with a way for one person to control multiple robots.

The system works using one controller who watches the drones while a computer reads his thoughts.

The controller wears a skull cap fitted with 128 electrodes wired to a computer. The device records electrical brain activity. If the controller moves a hand or thinks of something, certain areas light up.

Read more

In the campy 1966 science fiction movie “Fantastic Voyage,” scientists miniaturize a submarine with themselves inside and travel through the body of a colleague to break up a potentially fatal blood clot. Right. Micro-humans aside, imagine the inflammation that metal sub would cause.

Ideally, injectable or implantable medical devices should not only be small and electrically functional, they should be soft, like the body tissues with which they interact. Scientists from two UChicago labs set out to see if they could design a material with all three of those properties.

The material they came up with, published online June 27, 2016, in Nature Materials, forms the basis of an ingenious light-activated injectable device that could eventually be used to stimulate nerve cells and manipulate the behavior of muscles and organs.

Read more

Inspired by the large-scale sky surveys with which astronomers explore the cosmos, neuroscientists in Seattle, Washington, have spent four years systematically surveying the neural activity of the mouse visual cortex. The Allen Brain Observatory’s first data release, on 13 July, provides a publicly accessible data set of unprecedented size and scope, designed to help scientists to model and understand the human brain.

The project is part of an ambitious ten-year brain-research plan announced in 2012 by the Allen Institute for Brain Science. Designed to catalogue neurons and their electrical characteristics in minute detail, the initiative aims to enable new insights into how perception and cognition arise.

To compile the brain observatory’s first data set, researchers used a specialized microscope to record calcium waves that occur when neurons fire, sampling activity in 25 mice over 360 experimental sessions, while the animals viewed a battery of visual stimuli such as moving patterns of lines, images of natural scenes and short movies. The data set so far includes 18,000 cells in 4 areas of the visual cortex, making it one of the largest and most comprehensive of its kind. The set also includes information about each neuron’s location and its expression of certain genetic markers. At 30 terabytes, the raw data are too large to share easily, but users can download a more manageable processed data set, or explore it online.
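The “calcium waves” the microscope records arrive as raw fluorescence traces, which are conventionally normalized as ΔF/F₀ (change in fluorescence relative to a resting baseline) before analysis. Here is a minimal sketch of that normalization on a synthetic trace; the low-percentile baseline and the made-up numbers are illustrative assumptions, not the observatory’s actual pipeline.

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=10):
    """Normalize a raw fluorescence trace to dF/F0, estimating the
    resting baseline F0 as a low percentile of the trace."""
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0

# Synthetic trace: resting fluorescence ~100 a.u. plus noise, with one
# transient "calcium event" riding on top of the baseline.
rng = np.random.default_rng(0)
trace = 100 + rng.normal(0, 1, 1000)
trace[400:420] += 50

dff = delta_f_over_f(trace)
print(f"peak dF/F: {dff.max():.2f}")  # the transient stands out clearly
```

Normalizing this way makes traces comparable across cells with different resting brightness, which matters when the data set spans thousands of neurons recorded in separate sessions.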

Read more

About five years ago, a friend of mine at Microsoft (Mitch S.) had a vision of building a new security model around drone swarms and a form of BMI technology. Glad to see the vision come true.


Scientists have discovered how to control multiple robotic drones using the human brain, an advance that can help develop swarms of search and rescue drones that are controlled just by thought.

A controller wears a skull cap outfitted with 128 electrodes wired to a computer. The device records electrical brain activity. If the controller moves a hand or thinks of something, certain areas light up. “I can see that activity from outside. Our goal is to decode that activity to control variables for the robots,” said Panagiotis Artemiadis, from Arizona State University in the US. If the user is thinking about spreading the drones out, we know what part of the brain controls that thought, Artemiadis said.

A wireless system sends the thought to the robots. “We have a motion-capture system that knows where the quads are, and we change their distance,” he said. Up to four small robots, some of which fly, can be controlled with brain interfaces. To make them move, the controller watches on a monitor and thinks and pictures the drones performing various tasks.
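As a rough illustration of what “decode that activity to control variables” can mean, here is a toy sketch that maps the power of a single synthetic EEG channel in the alpha band (8–12 Hz) to a discrete drone command. The sampling rate, frequency band, threshold, and command names are all illustrative assumptions of mine, not details of the ASU system, and real decoders use far richer features across many channels.

```python
import numpy as np

FS = 256            # sampling rate in Hz (an assumption, not from the article)
N_SAMPLES = FS * 2  # two-second decoding window

def bandpower(x, fs, lo, hi):
    """Summed FFT power of signal x within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd[(freqs >= lo) & (freqs <= hi)].sum()

def decode(channel, threshold=1e5):
    """Map one channel's alpha-band power to a hypothetical drone command."""
    return "SPREAD" if bandpower(channel, FS, 8, 12) > threshold else "HOLD"

# Two synthetic trials: background noise alone, and the same noise with a
# strong 10 Hz rhythm standing in for the "spread the drones" brain state.
t = np.arange(N_SAMPLES) / FS
rng = np.random.default_rng(1)
rest = rng.normal(0, 1, N_SAMPLES)
imagery = rest + 10 * np.sin(2 * np.pi * 10 * t)

print(decode(rest), decode(imagery))  # HOLD SPREAD
```

The decoded command would then go over the wireless link, with the motion-capture system closing the loop on where the quadrotors actually are.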

Read more

Researchers at Case Western Reserve University have combined tissues from a sea slug with flexible 3D printed components to build “biohybrid” robots that crawl like sea turtles on the beach.

A muscle from the slug’s mouth provides the movement, which is currently controlled by an external electrical field. However, future iterations of the device will use ganglia (bundles of neurons and nerves that normally conduct signals to the muscle as the slug feeds) as an organic controller.

The researchers also manipulated collagen from the slug’s skin to build an organic scaffold to be tested in new versions of the robot.

Read more