

The world of technology is rapidly shifting from flat media viewed in the third person to immersive media experienced in the first person. Recently dubbed “the metaverse,” this major transition in mainstream computing has ignited a new wave of excitement over the core technologies of virtual and augmented reality. But a third technology area, telepresence, is often overlooked and will become an important part of the metaverse.

While virtual reality brings users into simulated worlds, telepresence (also called telerobotics) uses remote robots to bring users to distant places, giving them the ability to look around and perform complex tasks. This concept goes back to science fiction of the 1940s and a seminal short story by Robert A. Heinlein entitled Waldo. If we combine that concept with another classic sci-fi tale, Fantastic Voyage (1966), we can imagine tiny robotic vessels that go inside the body and swim around under the control of doctors who diagnose patients from the inside, and even perform surgical tasks.

The existence of ethical concerns is precisely why it’s important for business owners to understand the different technologies driving the Metaverse forward and the impact they may have on users, the environment, and society at large. By understanding these technologies, businesses can find constructive uses of virtual connectivity that enrich our world and keep the digital economy booming.

In addition, understanding these technologies matters because, as more advanced techniques are developed for Metaverse projects, the average cost of app design in the US (currently around US$48,000) will undoubtedly rise. Business owners need to know where to focus when planning their next move.

Businesses also need to understand that as the landscape of the Metaverse evolves, the nature of the content will change as well. Creating quality content marketing strategies with these immersive, virtual environments in mind is essential as the industry moves forward.


Can virtual reality become indistinguishable from actual reality? Swave Photonics, a spinoff of Imec and Vrije Universiteit Brussel, has designed holographic chips on a proprietary diffractive optics technology to “bring the metaverse to life.” The Leuven, Belgium–based startup has raised €7 million in seed funding to accelerate the development of its multi-patented Holographic eXtended Reality (HXR) technology.

“Our vision is to empower people to visualize the impossible, collaborate, and accomplish more,” Théodore Marescaux, CEO and founder of Swave Photonics, told EE Times Europe. “With our HXR technology, we want to make that extended reality practically indistinguishable from the real world.”

What does it mean to project images that are indistinguishable from reality? “It means a very wide field of view, colors, high dynamic range, the ability to move your head around an object and see it from different angles, and the ability to focus,” he said.
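The field of view Marescaux mentions is governed by basic diffraction physics: the finer a display’s pixel pitch, the wider the angle over which it can steer light. The sketch below uses the standard grating relation sin(θ) = λ / (2p) to show why sub-micron diffractive pitches matter; the function name and the example pitches are illustrative assumptions, not Swave’s actual specifications.

```python
import math

def holographic_fov_deg(wavelength_nm: float, pixel_pitch_um: float) -> float:
    """Approximate maximum field of view (degrees) of a diffractive display.

    Uses the standard grating/Nyquist relation sin(theta) = lambda / (2 * pitch),
    where theta is the maximum diffraction half-angle; FOV = 2 * theta.
    """
    ratio = (wavelength_nm * 1e-9) / (2 * pixel_pitch_um * 1e-6)
    if ratio >= 1.0:
        # Pitch at or below half the wavelength: light can be steered
        # over the full hemisphere in front of the display.
        return 180.0
    return 2 * math.degrees(math.asin(ratio))

# A typical liquid-crystal spatial light modulator (~8 um pitch) steers
# green light (532 nm) over only a few degrees...
print(f"8 um pitch:   {holographic_fov_deg(532, 8.0):.1f} deg")
# ...while a sub-micron diffractive pitch approaches a natural, wide field of view.
print(f"0.5 um pitch: {holographic_fov_deg(532, 0.5):.1f} deg")
```

This is why holographic-display startups pursue pixel pitches near or below the wavelength of visible light: an 8 µm pitch yields a field of view under 4°, while a 0.5 µm pitch exceeds 60°.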

Wearable displacement sensors, which attach to the human body, detect movements in real time, and convert them into electrical signals, are an active area of study. However, research on stretchable displacement sensors faces major limitations, such as low stretchability and complex manufacturing processes.

If a sensor that is both easily manufactured and highly stretchable can be developed, it could be attached to the human body, allowing large movements of joints or fingers to be used in applications such as AR and VR. A research team led by Sung-Hoon Ahn, mechanical engineering professor at Seoul National University, has developed a piezoelectric strain sensor with high sensitivity and high stretchability based on a kirigami cutting design.

In this research, a stretchable piezoelectric displacement sensor was fabricated by applying a kirigami structure to a film-type piezoelectric material, and its performance was evaluated. Sensing characteristics varied with the kirigami pattern, and the sensor showed higher sensitivity and stretchability than existing technologies. The team built wireless VR haptic gloves using the developed sensor and successfully played a piano with them.
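To turn such a sensor into a VR input like the piano glove described above, the raw electrical signal must be mapped to a joint angle and then to a discrete event such as a key press. The sketch below assumes a simple linear calibration between sensor output and bend angle; the calibration slope, thresholds, and function names are hypothetical illustrations, not values from the paper (real kirigami sensors would need a per-pattern calibration curve).

```python
def bend_angle_deg(reading_mv: float, flat_mv: float = 0.0,
                   mv_per_deg: float = 1.5) -> float:
    """Estimate finger bend angle from a strain-sensor reading.

    Assumes a linear calibration (mv_per_deg) obtained by recording the
    sensor output at known joint angles; the result is clamped to a
    plausible finger-joint range of 0-90 degrees.
    """
    if mv_per_deg <= 0:
        raise ValueError("calibration slope must be positive")
    angle = (reading_mv - flat_mv) / mv_per_deg
    return max(0.0, min(90.0, angle))

def key_pressed(angle_deg: float, threshold_deg: float = 45.0) -> bool:
    """Register a virtual piano key press once the finger bends far enough."""
    return angle_deg >= threshold_deg

# A 75 mV reading maps to a 50-degree bend, enough to trigger a key press;
# a 30 mV reading (20 degrees) does not.
print(key_pressed(bend_angle_deg(75.0)))
print(key_pressed(bend_angle_deg(30.0)))
```

In practice a piezoelectric sensor responds most strongly to changes in strain, so a production glove would filter and integrate the signal rather than apply a single static mapping, but the event pipeline (reading, angle, gesture) would look much like this.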

Buoyed by research that says inhabiting someone else’s body can change how you perceive your own, researchers have started to investigate the vast, erotic potential of virtual sex.


Despite the fact that sex is a basic instinct and a near-universal experience, we know remarkably little about it. And so, this week, we’re teaming up with our friends at Futurism, oracles of all things science, technology and medicine, to look at the past, present and future of pleasure from a completely scientific perspective.

Sexual psychologist Cathline Smoos spent most of last year having freaky, virtual sex with her two then-boyfriends. Usually, they fooled around together in the trippy, immersive game VRChat, but one day, they decided to explore what it would be like to have sex as objects instead. In their respective bedrooms on different corners of the globe, they strapped on their VR headsets, and embodied two non-human avatars; Smoos chose a chest of drawers, her partner a TV.

Meta’s AI translation work could provide a killer app for AR.


Social media conglomerate Meta has created a single AI model capable of translating across 200 different languages, including many not supported by current commercial tools. The company is open-sourcing the project in the hopes that others will build on its work.

The AI model is part of an ambitious R&D project by Meta to create a so-called “universal speech translator,” which the company sees as important for growth across its many platforms — from Facebook and Instagram, to developing domains like VR and AR. Machine translation not only allows Meta to better understand its users (and so improve the advertising systems that generate 97 percent of its revenue) but could also be the foundation of a killer app for future projects like its augmented reality glasses.

Meet Daniela de Paulis, the newest member of the SETI Institute’s Artist in Residence program. Daniela is a media artist who is also a licensed radio operator and radio telescope operator at the Dwingeloo radio telescope in the Netherlands. Fusing radio technologies, neuroscience, and space research, Daniela creates pioneering art-science projects that include live performances, virtual reality (VR), electroencephalograms (EEG), and audience participation. During this SETI Live chat with SETI AIR Director Bettina Forget, Daniela will discuss her recent works COGITO in Space and OPTICKS, and give us a sneak peek of the project she has planned to complete during her time at the AIR program.