
A study conducted in a virtual reality environment found that action video game players have better implicit timing skills than non-gamers: they are better at preparing to time their responses in tasks that demand quick reactions, and they do so automatically, without conscious effort. The paper was published in Communications Biology.

Many research studies have shown that playing video games enhances cognition, including an increased ability to learn on the fly and improved control of attention. The extent of these improvements is still unclear, however, and appears to depend on the type of gameplay.

Success in action video games depends on a player’s skill in making precise responses at just the right time. Players benefit from practice during which they refine their temporal expectations of in-game events, even when they are unaware of doing so. This largely unconscious process of tracking time and preparing to react at the right moment, based on expectations of how a situation will unfold, is called incidental temporal processing.

Neuralink’s invasive brain implant vs. Phantom Neuro’s minimally invasive muscle implant: a deep dive on brain-computer interfaces, Phantom Neuro, and the future of restoring lost functions.

Connor Glass.
Phantom is creating a human-machine interfacing system for lifelike control of technology. We are currently hiring skilled and forward-thinking electrical, mechanical, UI, AR/VR, and AI/ML engineers. Looking to get in touch with us? Send us an email at [email protected].

Phantom Neuro.
Phantom is a neurotechnology company, spun out of the lab at The Johns Hopkins University School of Medicine, that is enabling lifelike control of robotic orthopedic technologies, such as prosthetic limbs and exoskeletons. Phantom’s solution, the Phantom X, consists of low-risk implantable sensors, AI, and enabling software. By providing superior control of robotic orthopedic mechanisms, the Phantom X will drastically improve the lives of individuals with limb difference who have yet to see a tangible improvement in quality of life despite significant advancements in the field of robotics.

Links:

Deep Learning AI Specialization: https://imp.i384100.net/GET-STARTED
AI News Timestamps:
0:00 New AI Robot Dog Beats Human Soccer Skills.
2:34 Breakthrough Humanoid Robotics & AI Tech.
5:21 Google AI Makes HD Video From Text.
8:41 New OpenAI DALL-E Robotics.
11:31 Elon Musk Reveals Tesla Optimus AI Robot.
16:49 Machine Learning Driven Exoskeleton.
19:33 Google AI Makes Video Game Objects From Text.
22:12 Breakthrough Tesla AI Supercomputer.
25:32 Underwater Drone Humanoid Robot.
29:19 Breakthrough Google AI Edits Images With Text.
31:43 New Deep Learning Tech With Light Waves.
34:50 Nvidia General Robot Manipulation AI.
36:31 Quantum Computer Breakthrough.
38:00 In-Vitro Neural Network Plays Video Games.
39:56 Google DeepMind AI Discovers New Matrices Algorithms.
45:07 New Meta Text To Video AI.
48:00 Bionic Tech Feels In Virtual Reality.
53:06 Quantum Physics AI.
56:40 Soft Robotics Gripper Learns.
58:13 New Google NLP Powered Robotics.
59:48 Ionic Chips For AI Neural Networks.
1:02:43 Machine Learning Interprets Brain Waves & Reads Mind.

The stimulations were critical for learning: in separate experiments, DishBrain cultures that received no electrical feedback performed far worse.

Game On

The study is a proof of concept that neurons in a dish can be a sophisticated learning machine, and even exhibit signs of sentience and intelligence, said Kagan. That’s not to say they’re conscious—rather, they have the ability to adapt to a goal when “embodied” into a virtual environment.

The designer has equipped the headset with explosive charges.

Palmer Luckey, the guy who co-founded the virtual reality (VR) headset-making company Oculus, has now made another VR headset that can kill you if you die in an online game. Luckey’s company was acquired by Facebook, now Meta, and his product is now a critical component of the metaverse that Mark Zuckerberg plans to build the company around.

At first glance, it might seem that Zuckerberg did the right thing by acquiring Oculus; otherwise, we would not really know what sort of products the company would have brought to market.


Palmer Luckey.

Researchers from Tokyo Metropolitan University have engineered a virtual reality (VR) remote collaboration system which lets users on Segways share not only what they see but also the feeling of acceleration as they move. Riders equipped with cameras and accelerometers can feed back their sensations to a remote user on a modified wheelchair wearing a VR headset. User surveys showed a significant reduction in VR sickness, promising a better experience for remote collaboration activities.

Virtual reality (VR) technology is making rapid headway, letting users experience and share an immersive, 3D environment. In the field of remote work, one of the major advances it offers is a chance for workers in different locations to share what they see and hear in real-time.

An example is users on personal mobility devices in large warehouse facilities, factories, and construction sites. Riders can cover large areas with ease while highlighting issues in real-time to a remote co-worker. However, one major drawback can ruin the whole experience: VR sickness, a form of motion sickness that arises when users see “motion” through their headsets without actually moving. Symptoms include headaches, nausea, and sometimes vomiting. The problem is particularly acute for the example above, where the person sharing the experience is moving about.

Since I won’t be posting on Facebook much in the future, I will leave you with this post, and I also hope to see you there, as with Twitter.

Neal Stephenson invented the metaverse. At least from an imagination standpoint. Though other science fiction writers had similar ideas—and the pioneers of VR were already building artificial worlds—Stephenson’s 1992 novel Snow Crash not only fleshed out the vision of escaping to a place where digital displaced the physical, it also gave it a name. That book cemented him as a major writer, and since then he’s had huge success.



Virtual reality (VR) and augmented reality (AR) headsets are becoming increasingly advanced, enabling increasingly engaging and immersive digital experiences. To make VR and AR experiences even more realistic, engineers have been trying to create better systems that produce tactile and haptic feedback matching virtual content.

Researchers at the University of Hong Kong, City University of Hong Kong, University of Electronic Science and Technology of China (UESTC) and other institutes in China have recently created WeTac, a miniaturized, soft and ultrathin wireless electrotactile system that produces tactile feedback on a user’s skin. This system, introduced in Nature Machine Intelligence, works by delivering mild electrical stimulation through a user’s skin.

“As the tactile sensitivity among individuals and different parts of the hand within a person varies widely, a universal method to encode tactile information into faithful feedback in hands according to sensitivity features is urgently needed,” Kuanming Yao and his colleagues wrote in their paper. “In addition, existing haptic interfaces worn on the hand are usually bulky, rigid and tethered by cables, which is a hurdle for accurately and naturally providing haptic feedback.”

Gallery QI — Becoming: An Interactive Music Journey in VR — Opening Night.
November 3rd, 2022 — Atkinson Hall auditorium.
UC San Diego — La Jolla, CA

By Shahrokh Yadegari, John Burnett, Eito Murakami and Louis Pisha.

“Becoming” is the result of a collaborative work that was initiated at the Opera Hack organized by San Diego Opera. It is an operatic VR experience based on a Persian poem by Mowlana Rumi depicting the evolution of human spirit. The audience experiences visual, auditory and tactile impressions which are partly curated and partly generated interactively in response to the player’s actions.

“Becoming” incorporates fluid and reactive graphical material which embodies the process of transformation depicted in the Rumi poem. Worlds seamlessly morph between organic and synthetic environments such as oceans, mountains and cities and are populated by continuously evolving life forms. The music is a union of classical Persian music fused with electronic music where the human voice becomes the beacon of spirit across the different stages of the evolution. The various worlds are constructed by the real-time manipulation of particle systems, flocking algorithms and terrain generation methods—all of which can be touched and influenced by the viewer. Audience members can be connected through the network and haptic feedback technology provides human interaction cues as well as an experiential stimulus.
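The flocking behavior described above is typically built on Reynolds-style “boids” rules. The actual implementation used in “Becoming” is not public; the sketch below is only an illustration of the three classic rules (separation, alignment, cohesion) that such systems are usually based on, with the function name and weights chosen for the example:

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, r_sep=1.0, r_near=5.0,
               w_sep=0.05, w_align=0.05, w_coh=0.005):
    """One update of a simple flocking ("boids") model.

    pos, vel: (n, 2) arrays of positions and velocities.
    Each boid reacts to neighbours within r_near using three rules:
    separation, alignment, and cohesion.
    """
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]                 # vectors to every boid
        dist = np.linalg.norm(offsets, axis=1)
        near = (dist < r_near) & (dist > 0)    # neighbours, excluding self
        if not near.any():
            continue
        # Separation: steer away from boids that are too close.
        close = near & (dist < r_sep)
        if close.any():
            new_vel[i] -= w_sep * offsets[close].sum(axis=0)
        # Alignment: match the neighbours' average velocity.
        new_vel[i] += w_align * (vel[near].mean(axis=0) - vel[i])
        # Cohesion: drift toward the neighbours' centre of mass.
        new_vel[i] += w_coh * offsets[near].mean(axis=0)
    return pos + new_vel * dt, new_vel
```

In an interactive piece, per-frame steering forces like these can also take the viewer’s hand position as an extra attractor or repulsor, which is one plausible way the flocks become “touchable.”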