
Interesting research paper on motor cortex-based brain-computer interface (BCI) research conducted by researchers from UW. Sharing with fellow partners and researchers trying to advance BMI, as well as those researching and/or re-creating brain/neuro patterns in systems.


The neurons in the human brain are densely interlaced, sharing upwards of 100 trillion physical connections. It is widely theorized that this tremendous connectivity is one of the facets of our nervous system that enables human intelligence. In this study, over the course of a week, human subjects learned to use electrical activity recorded directly from the surface of their brain to control a computer cursor. This provided us an opportunity to investigate patterns of interactivity that occur in the brain during the development of a new skill. We demonstrated two fundamentally different forms of interactions, one spanning only neighboring populations of neurons and the other covering much longer distances across the brain. The short-distance interaction type was notably stronger during early phases of learning, lessening with time, whereas the other was not. These findings point to evidence of multiple different forms of task-relevant communication taking place between regions in the human brain, and serve as a building block in our efforts to better understand human intelligence.

Citation: Wander JD, Sarma D, Johnson LA, Fetz EE, Rao RPN, Ojemann JG, et al. (2016) Cortico-Cortical Interactions during Acquisition and Use of a Neuroprosthetic Skill. PLoS Comput Biol 12: e1004931. doi:10.1371/journal.pcbi.1004931

Editor: Olaf Sporns, Indiana University, UNITED STATES

Read more

Luv this.


Smart devices implanted in the body have thus far not been able to communicate via Wi-Fi due to the power requirements of such communications. Surgery is required when the battery in a brain stimulator or a pacemaker needs to be replaced. Not only is this expensive, but any surgery has inherent risks and could lead to complications. It is therefore critically important that the battery life in implanted medical devices be preserved for as long as possible.

Other constraints limiting how much power an implanted device can use include its location in the body and its size. Emerging devices that could one day reanimate limbs or stimulate organs, along with brain implants that treat Parkinson’s disease, are limited by the same factors.

Smartwatches, smartphones and other similar Bluetooth-enabled devices continuously transmit communication signals. A team of computer scientists and electrical engineers from the University of Washington (UW) has developed a method that takes these signals and converts them to Wi-Fi signals. The new method uses ten thousand times less energy than traditional methods do. Another huge advantage of this method is that it does not need any specialized equipment.
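
To get a feel for what a ten-thousand-fold energy reduction could mean for an implant, here is a rough back-of-the-envelope sketch. All of the numbers below (battery capacity, radio power draw, duty cycle) are hypothetical placeholders, not figures from the UW work; only the 10,000x ratio comes from the article.

```python
# Back-of-the-envelope sketch (illustrative numbers only, not from the article):
# how a 10,000x reduction in transmit energy could stretch an implant's battery.

ACTIVE_WIFI_TX_MW = 100.0                         # assumed power draw of a conventional Wi-Fi radio, mW
BACKSCATTER_TX_MW = ACTIVE_WIFI_TX_MW / 10_000    # the reported ~10,000x reduction
BATTERY_CAPACITY_MWH = 1000.0                     # assumed small implant battery, milliwatt-hours
TX_DUTY_HOURS_PER_DAY = 1.0                       # assumed hours of transmission per day

def battery_days(tx_power_mw: float) -> float:
    """Days until the battery is drained by transmission alone."""
    mwh_per_day = tx_power_mw * TX_DUTY_HOURS_PER_DAY
    return BATTERY_CAPACITY_MWH / mwh_per_day

print(f"Active Wi-Fi radio: {battery_days(ACTIVE_WIFI_TX_MW):,.1f} days")
print(f"Backscatter method: {battery_days(BACKSCATTER_TX_MW):,.1f} days")
```

Under these toy assumptions the same battery goes from roughly ten days to many years of transmission, which is why the approach matters for devices that would otherwise need replacement surgery.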

Read more

Another great example where scientists are bridging bio and technology together.


Fluorescent proteins from jellyfish that were grown in bacteria have been used to create a laser for the first time, according to a new study.

The breakthrough represents a major advance in so-called polariton lasers, the researchers said. These lasers have the potential to be far more efficient and compact than conventional ones and could open up research avenues in quantum physics and optical computing, the researchers said.

Traditional polariton lasers using inorganic semiconductors need to be cooled to incredibly low temperatures. More recent designs based on organic electronics materials, like those used in organic light-emitting diode (OLED) displays, operate at room temperature but need to be powered by picosecond (one-trillionth of a second) pulses of light.

Read more

Quantum computing 101 — lesson 1: quantum models


Before reviewing in more detail the most promising experimental realisations of quantum information processors, I think it is useful to recap the basic concepts and most used models of quantum computing. The models matter in particular: the physical realisations mentioned in a previous post use different but equivalent computational models, and these need to be understood to make sense of their implementations.
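
As a concrete illustration of the most common of these, the gate (circuit) model, here is a minimal Python/NumPy sketch: a qubit is a normalized two-component complex vector, gates are unitary matrices, and measurement probabilities are the squared magnitudes of the amplitudes. This is an illustrative toy of my own, not code from the lesson.

```python
import numpy as np

# Minimal sketch of the gate (circuit) model: a qubit is a normalized
# 2-component complex vector, gates are unitary matrices, and measurement
# probabilities follow the Born rule (squared amplitude magnitudes).

ket0 = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
X = np.array([[0, 1], [1, 0]], dtype=complex)                 # Pauli-X (quantum NOT)

state = H @ ket0                    # put the qubit into an equal superposition
probs = np.abs(state) ** 2          # Born rule: P(measure 0), P(measure 1)
print("After H on |0>:", state, "-> measurement probabilities", probs)

# Other models (measurement-based, adiabatic, etc.) can realize the same
# computations; the circuit picture above is simply the most widely used one.
```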

Read more

Since QUESS has been online, China has been able to deliver the first set of programmable code, transmit communications back and forth with the satellite, and now expand quantum memory capacity up to 100 qubits. These are pretty big steps given that the satellite has only been in orbit since Tuesday.

BTW — the first two events are directly a result of QUESS; the third advancement isn’t a result of QUESS, it simply came after QUESS’ launch.


Although Chinese scientists said there is still a long way to go before any ultrapowerful machine can be developed, progress has been made in quantum memory technology, which is a key component of quantum computing and quantum communication.

On Tuesday, China launched the world’s first quantum experimental satellite in an attempt to build a space-based quantum communication network.

Zhou Zongquan, a scientist in the field, told China Daily that following the breakthrough in 2011 when scientists at the University of Science and Technology of China developed the world’s first quantum memory of 1 quantum bit, or qubit, they have now developed a memory of 100 qubits.

Read more

Imagine this scenario: Annual physical examinations are supplemented by an affordable home diagnostic chip, allowing you to regularly monitor your baseline health with just a simple urine sample. Though outwardly you appear to be in good health, the device reveals a fluctuation in your biomarker profile, indicating possible early-stage cancer or the presence of a virus.

Diagnostic devices like the home pregnancy test have been around since the 1970s. That test revolutionized a woman’s ability to find out whether she was pregnant without having to wait for a doctor’s appointment to confirm her suspicions. It relies on detecting a hormone, human chorionic gonadotropin, present in urine. But could detecting cancer, or a deadly virus, from a similar kind of sample and device be as simple and non-invasive?

Read more

Experiments confirm the existence of 1-micrometer-sized molecules made of two cesium atoms by showing that their binding energies agree with predictions.

Strongly bound diatomic molecules such as H₂ or O₂ are less than a nanometer across. Surprisingly, scientists have been able to create two-atom molecules more than a thousand times larger by using exotic atoms that attract one another only very weakly. Now, a pair of physicists have calculated what makes these “macrodimers” stable, and they have verified their predictions by creating micrometer-sized molecules containing two cesium atoms. The macrodimers could have applications in quantum computing.

Read more

The blog take away: How is the key frequency of beta oxidation made in a mitochondrion? Most people believe fat burning via beta oxidation is a fuel-mediated mechanism, but Dr. Doug Wallace’s data strongly suggests it is linked to the vibration state of the inner mitochondrial membrane. If so, how are the sun’s photoelectric abilities critical to this mechanism in mitochondria? Watch the video in the hyperlink closely from 50:00 – 59:00 for the clue.

Hyperlink

Water surrounds each mitochondrion in a cell with its MINOS layer, which sits adjacent to the cytochrome 1 complex. Water has a high dielectric constant: 78 in bulk water, to be exact. Why is that critical? Well, cytochrome 1 has a redox Fe-S couple that acts like a semiconductor for electrons, and electrons behave differently in a semiconductor than they do when they are not captured by one. How much do you know about semiconductor integrated circuits? In a typical network in an integrated circuit, each network will include at least one driver, which must contain a source or drain diffusion, and at least one receiver. This setup will consist of a gate electrode over a thin gate dielectric (look up a view of a MOS transistor online if you’re unsure of this arrangement to get a visual).

Read more

(Phys.org)—When you hear a sound, only some of the neurons in the auditory cortex of your brain are activated. This is because every auditory neuron is tuned to a certain range of sound, so that each neuron is more sensitive to particular types and levels of sound than others. In a new study, researchers have designed a neuromorphic (“brain-inspired”) computing system that mimics this neural selectivity by using artificial level-tuned neurons that preferentially respond to specific types of stimuli.
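
To make the idea of level tuning concrete, here is a toy Python sketch of my own: a leaky integrate-and-fire neuron that only accumulates inputs whose amplitude falls within its preferred band. This is a deliberate simplification for illustration, not the phase-change-memory implementation described in the IBM paper.

```python
# Toy sketch (my own simplification, not IBM's phase-change implementation):
# a "level-tuned" integrate-and-fire neuron that only accumulates input whose
# amplitude falls within its preferred level band, and spikes past a threshold.

class LevelTunedNeuron:
    def __init__(self, level_low, level_high, threshold=1.0, leak=0.9):
        self.level_low = level_low      # lower edge of the preferred input band
        self.level_high = level_high    # upper edge of the preferred input band
        self.threshold = threshold      # membrane potential needed to fire
        self.leak = leak                # per-step decay of the membrane potential
        self.potential = 0.0

    def step(self, x):
        """Integrate one input sample; return True if the neuron spikes."""
        self.potential *= self.leak
        if self.level_low <= x <= self.level_high:   # selectivity: ignore off-band input
            self.potential += x
        if self.potential >= self.threshold:
            self.potential = 0.0
            return True
        return False

# Two neurons tuned to different input levels respond to different stimuli.
quiet = LevelTunedNeuron(0.1, 0.4)
loud = LevelTunedNeuron(0.6, 1.0)
for sample in [0.2, 0.3, 0.8, 0.9, 0.25, 0.7]:
    print(sample, "quiet spikes:", quiet.step(sample), "loud spikes:", loud.step(sample))
```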

In the future, level-tuned neurons may help enable systems to perform tasks that traditional computers cannot, such as learning from their environment, pattern recognition, and knowledge extraction from big data sources.

The researchers, Angeliki Pantazi et al., at IBM Research-Zurich and École Polytechnique Fédérale de Lausanne, both in Switzerland, have published a paper on the new neuromorphic architecture in a recent issue of Nanotechnology.

Read more