Virtual and augmented reality are taking giant leaps every day, both in the mainstream and in research labs. In a recent TechEmergence interview, biomedical engineer and g.tec Medical Engineering founder Christopher Guger said the next big steps will be in brain-computer interfaces (BCIs) and embodiment.

Image credit: HCI International

If you’re unfamiliar with the term, embodiment is the moment when a person truly “feels” at one with a device controlled by their thoughts, sensing that device as a part of, or an extension of, themselves. While researchers are taking strides toward that concept, Guger believes today’s systems are only baby steps toward what is to come.

While augmented or virtual reality can take us away for a brief period, Guger said true embodiment will require far more BCI development. There has been a lot of work recently in robotic embodiment using BCI.

“We have the robotic system, which is learning certain tasks. You can train the robotic system to pick up objects, to play a musical instrument and, after the robotic system has learned, you’re just giving the high-level command for the robotic system to do it for you,” he said. “This is like a human being, where you train yourself for a certain task and you have to learn it. You need your cortex and a lot of neurons to do the task. Sometimes, it’s pre-programmed and (sometimes) you’re just making the high-level decision to do it.”

Another tool at work in the study of embodiment is what Guger called “virtual avatars.” These avatars let researchers experiment with embodiment, revealing how an avatar needs to behave while also helping humans grow more comfortable with being embodied inside one. Being at ease inside the avatar, he said, makes it easier for a person to learn tasks and train, or re-train, for specific functions.

As an example, Guger cited a stroke patient working to regain movement in his hand. Placed inside a virtual avatar, the patient can “see” the avatar’s hand moving in the same manner that he wants his own hand to move. This connection activates mirror neurons in the patient’s brain, which helps the brain rewire itself to regain a sense of the hand.

“We also do functional electrical stimulation (where) the hand is electrically stimulated, so you also get the same type of movement. This, altogether, has a very positive effect on the remobilization of the patient,” Guger said. “Your movement and the virtual movement, that’s all feeding back to the artificial systems in the cortex again and is affecting brain plasticity. This helps people learn to recover faster.”

One hurdle that researchers are still working to overcome is the concept of “break in presence”: the moment where one’s immersion in a virtual reality world is interrupted by an outside influence, leading to the loss of embodiment. Avoiding that loss of embodiment, he said, is what researchers are striving for to make virtual reality a more effective technology.

Though Guger believes mainstream BCI use and true embodiment are still a ways off, other applications of BCI and embodiment are already happening in the medical field. In addition to helping stroke patients regain their mobility, there are BCI systems that allow doctors to assess brain activity in coma patients, which provides some level of communication for both the patient and the family. Further, ALS patients are able to take advantage of BCI technology to improve their quality of life through virtual movement and communication.

“For the average person on the street, it’s very important that the BCI system is cheap and working, and it has to be faster or better compared to other devices that you might have,” he said. “The embodiment work shows that you can really be embodied in another device; this is only working if you are controlling it mentally, like the body is your own, because you don’t have to steer the keyboard or the mouse. It’s just your body and it’s doing what you want it to do. And then you gain something.”

Brain-to-brain communication is becoming a reality, says Andrea Stocco, who sees a future where minds meet to share ideas.

You are working on brain-to-brain communication. Can one person’s thoughts ever truly be experienced by another person?

Each brain is different. And while differences in anatomy are relatively easy to account for, differences in function are difficult to characterise. And then we have differences in experience – my idea of flying could be completely unlike your idea of flying, for example. When you think about flying, a bunch of associated experiences come into your mind, competing for your attention. We somehow need to strip away the individual differences to grasp the basic, shared factors.

But it seems possible. Other researchers have been able to use information collected from a group of people to make surprisingly successful, if basic, predictions about what another individual is thinking.

What do you need to transmit information between brains?

A cognitive neuroscientist and his team at HRL Laboratories in Malibu, California, seem to have achieved the impossible.

According to a press release, the team “measured the brain activity patterns of six commercial and military pilots, and then transmitted these patterns into novice subjects as they learned to pilot an airplane in a realistic flight simulator.”


Mainland Chinese start-up Horizon Robotics, founded by the former head of online search giant Baidu’s Institute of Deep Learning, claims it is on pace to bring chips with built-in artificial intelligence (AI) technology to market. The company expects to finish designing its first AI chip for smart home appliances by June and make it commercially available by early 2017.

“General processors are too slow for AI functions. A dedicated chip will dramatically increase the speed of these functions,” Yu Kai, the founder and chief executive of Horizon Robotics told the South China Morning Post.

Founded in Beijing in July, Horizon Robotics is developing chips and software that attempt to mimic how the human brain solves abstract tasks, such as voice and image recognition, that are difficult for regular computer programmes. It also makes sensors for smart devices.
