BMW Concept
BMW believes this is the future of innovation and mobility: autonomous, shape-shifting, and guided by augmented reality.
Posted in augmented reality, robotics/AI
Virtual and augmented reality are taking giant leaps every day, both in the mainstream and in research labs. In a recent TechEmergence interview, Biomedical Engineer and Founder of g.tec Medical Engineering Christopher Guger said the next big steps will be in brain-computer interfaces (BCIs) and embodiment.
If you’re unfamiliar with the term, embodiment is the moment when a person truly “feels” at one with a device controlled by their thoughts, while sensing that device as a part of, or an extension of, themselves. While researchers are taking big strides toward that concept, Guger believes those are only baby steps toward what is to come.
While augmented or virtual reality can take us away for a brief period, Guger said true embodiment will require far more BCI development. There has been a lot of work recently in robotic embodiment using BCI.
“We have the robotic system, which is learning certain tasks. You can train the robotic system to pick up objects, to play a musical instrument and, after the robotic system has learned, you’re just giving the high-level command for the robotic system to do it for you,” he said. “This is like a human being, where you train yourself for a certain task and you have to learn it. You need your cortex and a lot of neurons to do the task. Sometimes, it’s pre-programmed and (sometimes) you’re just making the high-level decision to do it.”
Another tool at work in the study of embodiment is what Guger called “virtual avatars.” These virtual avatars allow researchers to experiment with embodiment, learning how avatars need to behave while also helping humans grow more comfortable with the concept of embodiment inside an avatar. Being at ease inside the avatar, he said, makes it easier for one to learn tasks and train, or re-train, for specific functions.
As an example, Guger cited a stroke patient working to regain movement in his hand. Placed inside a virtual avatar, the patient can “see” the avatar’s hand moving in the same manner that he wants his own hand to move. This connection activates mirror neurons in the patient’s brain, which helps the brain rewire itself to regain a sense of the hand.
“We also do functional electrical stimulation (where) the hand is electrically stimulated, so you also get the same type of movement. This, altogether, has a very positive effect on the remobilization of the patient,” Guger said. “Your movement and the virtual movement, that’s all feeding back to the artificial systems in the cortex again and is affecting brain plasticity. This helps people learn to recover faster.”
One hurdle that researchers are still working to overcome is the concept of “break in presence” (discussed in the article under the sub-heading ‘head-tracking module’). Basically, this is the moment where one’s immersion in a virtual reality world is interrupted by an outside influence, leading to the loss of embodiment. Avoiding that loss of embodiment, he said, is what researchers are striving to attain to make virtual reality a more effective technology.
Though Guger believes mainstream BCI use and true embodiment are still a ways off, other applications of BCI and embodiment are already happening in the medical field. In addition to helping stroke patients regain their mobility, there are BCI systems that allow doctors to assess the brain activity of coma patients, which provides some level of communication for both the patient and the family. Further, ALS patients are able to take advantage of BCI technology to improve their quality of life through virtual movement and communication.
“For the average person on the street, it’s very important that the BCI system is cheap and working, and it has to be faster or better compared to other devices that you might have,” he said. “The embodiment work shows that you can really be embodied in another device; this is only working if you are controlling it mentally, like the body is your own, because you don’t have to steer the keyboard or the mouse. It’s just your body and it’s doing what you want it to do. And then you gain something.”
At Camp Pendleton, Marines are testing a new, cutting-edge form of live-fire training using robotic targets.
Stationary and on-rails targets are all well and good, but enemy combatants haven’t behaved like that since the formal battle lines of the Revolutionary War. It’s long past time for a training program that provides a more accurate simulation of today’s combat situations, and the Marines of Camp Pendleton are taking steps toward just that.
The “Autonomous Robotic Human Type Targets” sport wigs and zip around at up to 8 miles per hour on two wheels. Rather than simply popping up and down or riding along a prescribed track, these targets attempt to be as unpredictable and erratic as their real-life counterparts. They’ll change speeds, swerve, and respond to one another when hit. The robots will even advance on Marines in an aggressive response.
I find this all amusing. However, widespread adoption of AI is a hurdle that has to be addressed first; at its core is a lack of trust by consumers and businesses in technology that still has not eradicated and blocked cyber hacking and attacks.
Martine Rothblatt takes on the notion that AI is dangerous to humanity.
Don’t let the title mislead you: quantum computing is not going to require AI to operate or to develop its computing capabilities. However, what is well known across the quantum community is that AI will greatly benefit from the processing capabilities and performance of quantum computing, and there has been strong interest in marrying the two. Until recently, though, the maturity gap and timing of quantum technology made that impossible; the various discoveries in microchip development, programming language development (Quipper), quantum-dot silicon wafers, and related areas are now changing that.
Researchers at the University of Vienna have created an algorithm that helps plan experiments in this mind-boggling field.
“Watch DeepMind’s program AlphaGo take on the legendary Lee Sedol (9-dan pro), the top Go player of the past decade, in a $1M 5-game challenge match in Seoul.”
“Yet, it’s our emotions and imperfections that makes us human.” –Clyde DeSouza, Memories With Maya.
IMMORTALITY or OBLIVION? I hope everyone would agree that there are only two possible outcomes for us after creating Artificial General Intelligence (AGI): immortality or oblivion. The importance of ensuring a beneficial outcome from the coming intelligence explosion cannot be overstated.
AI can already beat humans in many games, but can AI beat humans in the most important game, the game of life?
Posted in robotics/AI, sex
Comedy.
Director: Michael Polish
Writer: Mark Polish
Hot Bot is the hilarious journey of two sexually repressed and unpopular teenage geeks who accidentally discover a life-like super-model sex bot (Bardot).
Cast: Zack Pearlman, Doug Haley, Cynthia Kirchner, Anthony Anderson, Donald Faison, Danny Masterson.