It’s an everyday scenario: you’re driving down the highway when out of the corner of your eye you spot a car merging into your lane without signaling. How fast can your eyes react to that visual stimulus? Would it make a difference if the offending car were blue instead of green? And if the color green shortened that split-second delay between the initial appearance of the stimulus and the moment the eye begins moving toward it (known to scientists as saccadic latency), could drivers benefit from an augmented reality overlay that made every merging vehicle green?

Qi Sun, a joint professor in Tandon’s Department of Computer Science and Engineering and the Center for Urban Science and Progress (CUSP), is collaborating with neuroscientists to find out.

He and his Ph.D. student Budmonde Duinkharjav—along with colleagues from Princeton, the University of North Carolina, and NVIDIA Research—recently authored the paper “Image Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency,” presenting a model that can be used to predict temporal gaze behavior, particularly saccadic latency, as a function of the statistics of a displayed image. Inspired by neuroscience, the model could ultimately have great implications for telemedicine, e-sports, and any other arena in which AR and VR are leveraged.
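The paper itself isn’t reproduced here, but the core idea can be sketched: treat saccadic latency as a random variable whose distribution is conditioned on statistics of the displayed image. The Swift toy model below is a minimal illustration of that idea only; the features, coefficients, and log-normal form are placeholder assumptions, not the authors’ learned model.

```swift
import Foundation

// Hypothetical image-feature summary; the real model conditions on learned
// statistics of the displayed image (e.g., contrast and frequency content).
struct ImageFeatures {
    let contrast: Double        // normalized RMS contrast, 0...1
    let eccentricityDeg: Double // target's distance from current gaze, degrees
}

// Toy probabilistic latency model: latency is log-normal, with a mean that
// shrinks as the target becomes more salient. All coefficients below are
// illustrative placeholders, not fitted values from the paper.
struct SaccadeLatencyModel {
    let baseMs = 200.0       // asymptotic latency for a highly salient target
    let contrastGain = 80.0  // extra delay as contrast falls
    let eccGain = 2.0        // extra delay per degree of eccentricity
    let sigma = 0.15         // spread in log space

    func meanLatencyMs(_ f: ImageFeatures) -> Double {
        baseMs + contrastGain * (1.0 - f.contrast) + eccGain * f.eccentricityDeg
    }

    // Draw one latency sample (ms) via a Box-Muller standard normal.
    func sample(_ f: ImageFeatures) -> Double {
        let mu = log(meanLatencyMs(f))
        let u1 = Double.random(in: .ulpOfOne...1)
        let u2 = Double.random(in: 0...1)
        let z = (-2 * log(u1)).squareRoot() * cos(2 * .pi * u2)
        return exp(mu + sigma * z)
    }
}

let model = SaccadeLatencyModel()
let salient = ImageFeatures(contrast: 0.9, eccentricityDeg: 10)
let dull    = ImageFeatures(contrast: 0.3, eccentricityDeg: 10)
print("high-contrast target:", model.sample(salient), "ms")
print("low-contrast target: ", model.sample(dull), "ms")
```

In the actual work, the distribution's parameters are learned from human eye-tracking data rather than hand-set as they are here.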

“Augmented books, or a-books, can be the future of many book genres, from travel and tourism to education. This technology exists to assist the reader in a deeper understanding of the written topic and get more through digital means without ruining the experience of reading a paper book.”

Power efficiency and pre-printed conductive paper are some of the new features that now allow Surrey’s augmented books to be manufactured on a semi-industrial scale. With no wiring visible to the reader, Surrey’s augmented reality books allow users to trigger digital content with a simple gesture (such as a swipe of a finger or the turn of a page), which is then displayed on a nearby device.
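As a rough illustration of that interaction model, the hypothetical sketch below maps a sensed gesture to the media a nearby device should display. Every name in it is an assumption for illustration; the article does not describe Surrey’s actual software interface or pairing transport.

```swift
import Foundation

// Hypothetical event flow for an augmented book: a gesture sensed by the
// pre-printed conductive paper is mapped to media for a nearby device to
// display. All names here are illustrative assumptions.
enum PageGesture {
    case swipe(Int)    // finger swipe on a given page
    case pageTurn(Int) // turn to a given page
}

struct AugmentedBook {
    // Media keyed by page number, shipped with the book's companion app.
    let media: [Int: URL]

    func content(for gesture: PageGesture) -> URL? {
        switch gesture {
        case .swipe(let page), .pageTurn(let page):
            return media[page]
        }
    }
}

let travelGuide = AugmentedBook(media: [
    12: URL(string: "https://example.com/venice-walkthrough.mp4")!
])
if let url = travelGuide.content(for: .swipe(12)) {
    print("display on nearby device:", url)
}
```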

In a paper distributed via arXiv, titled “Exploring the Unprecedented Privacy Risks of the Metaverse,” boffins at UC Berkeley in the US and the Technical University of Munich in Germany play-tested an “escape room” virtual reality (VR) game to better understand just how much data a potential attacker could access. Through a 30-person study of VR usage, the researchers – Vivek Nair (UCB), Gonzalo Munilla Garrido (TUM), and Dawn Song (UCB) – created a framework for assessing and analyzing potential privacy threats. They identified more than 25 examples of private data attributes available to potential attackers, some of which would be difficult or impossible to obtain from traditional mobile or web applications.

The metaverse that is rapidly becoming a part of our world has long been an essential part of the gaming community: interaction-based games like Second Life, Pokémon Go, and Minecraft have served as virtual social platforms for years. The founder of Second Life, Philip Rosedale, and many other security experts have lately been vocal about Meta’s impact on data privacy. Since the core concept is similar, those platforms offer a preview of the data privacy issues likely to surface within Meta’s metaverse.
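To make the risk concrete, consider one of the simplest attributes in that category: a user’s height, which can be estimated from headset telemetry alone. The sketch below is a hypothetical illustration of the general technique, not code from the researchers’ framework; the eye-to-crown offset and the percentile heuristic are illustrative assumptions.

```swift
import Foundation

// One frame of headset telemetry: a timestamp plus head height in meters.
struct HeadsetFrame {
    let time: TimeInterval
    let y: Double // headset height above the floor, meters
}

// Toy inference: a standing user's height is roughly the headset's stable
// maximum y plus an eye-to-crown offset. The ~11 cm offset and the
// percentile heuristic are illustrative assumptions, not the study's method.
func estimateHeight(from frames: [HeadsetFrame]) -> Double? {
    guard !frames.isEmpty else { return nil }
    let ys = frames.map(\.y).sorted()
    let p95 = ys[Int(Double(ys.count - 1) * 0.95)] // robust to brief jumps
    return p95 + 0.11
}

// A simulated ten-second trace of a user standing still:
let trace = stride(from: 0.0, to: 10.0, by: 0.1).map {
    HeadsetFrame(time: $0, y: 1.62 + Double.random(in: -0.02...0.02))
}
print(estimateHeight(from: trace) ?? 0) // ≈ 1.73 m
```

The point of the study is that dozens of such attributes, from anthropometrics to reaction times, leak from motion data that VR applications collect as a matter of course.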

There has been a buzz going around the tech market that by the end of 2022 the metaverse could revive AR/VR device shipments and take them as high as 14.19 million units, compared to 9.86 million in 2021, a year-over-year increase of roughly 44%. The AR/VR device market is expected to boom despite component shortages and the difficulty of developing new technologies. The growth momentum will also be driven by the increased demand for remote interactivity stemming from the pandemic. But what will happen when these VR or metaverse headsets start stealing your precious data? Not just headsets but smart glasses too are prime suspects when it comes to privacy concerns.

Several weeks ago, Facebook introduced a new line of smart glasses called Ray-Ban Stories, which can take photos, shoot 30-second videos, and post them on the owner’s Facebook feed. Priced at US$299 and powered by Facebook’s virtual assistant, the web-connected shades can also take phone calls and play music or podcasts.

Microsoft’s big defense contract to supply the US Army with modified HoloLens AR headsets isn’t going so well. As first reported by Bloomberg, the Senate panel that oversees defense spending announced significant cuts to the Army’s fiscal 2023 procurement request for the device.

Microsoft announced last year it had won a US Army defense contract worth up to $22 billion to develop an Integrated Visual Augmentation System (IVAS), a tactical AR headset for soldiers based on HoloLens 2 technology.

Now the Appropriations Defense Subcommittee has announced it cut $350 million from the Army’s procurement plans for IVAS, leaving around $50 million for the device. The subcommittee cites concerns about the program’s overall effectiveness.

The world of technology is rapidly shifting from flat media viewed in the third person to immersive media experienced in the first person. Recently dubbed “the metaverse,” this major transition in mainstream computing has ignited a new wave of excitement over the core technologies of virtual and augmented reality. But there is a third technology area, telepresence, that is often overlooked yet will become an important part of the metaverse.

While virtual reality brings users into simulated worlds, telepresence (also called telerobotics) uses remote robots to bring users to distant places, giving them the ability to look around and perform complex tasks. This concept goes back to science fiction of the 1940s and a seminal short story by Robert A. Heinlein entitled Waldo. If we combine that concept with another classic sci-fi tale, Fantastic Voyage (1966), we can imagine tiny robotic vessels that go inside the body and swim around under the control of doctors who diagnose patients from the inside, and even perform surgical tasks.

Wearable displacement sensors—which are attached to a human body, detect movements in real time, and convert them into electrical signals—are being actively studied. However, existing stretchable displacement sensors have many limitations, such as low tensile properties and complex manufacturing processes.

If a sensor that can be easily manufactured and has both high sensitivity and good tensile properties is developed, it can be attached to a human body, allowing large movements of joints or fingers to be used in various applications such as AR and VR. A research team led by Sung-Hoon Ahn, mechanical engineering professor at Seoul National University, has developed a piezoelectric strain sensor with high sensitivity and high stretchability based on a kirigami cutting design.

In this research, a stretchable piezoelectric displacement sensor was manufactured by applying a kirigami structure to a film-type piezoelectric material, and its performance was evaluated. The sensing characteristics varied with the kirigami pattern, and the sensor showed higher sensitivity and tensile properties than existing technologies. Wireless haptic gloves for VR were produced using the developed sensor, and a piano could be played successfully while wearing them.
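As a rough sketch of how such a sensor’s output might be used in a VR glove, the following maps a piezoelectric voltage to strain and then to a finger-joint angle. A simple linear response and all the constants are illustrative assumptions, not values from the paper; a real sensor would be calibrated against a reference displacement stage.

```swift
import Foundation

// Toy signal chain for a kirigami piezoelectric strain sensor in a VR
// glove: voltage -> percent strain -> finger-joint angle. All constants
// here are illustrative assumptions, not values from the paper.
struct KirigamiSensor {
    let voltsPerPercentStrain = 0.02 // assumed sensitivity
    let maxStrainPercent = 300.0     // kirigami cuts permit large stretch

    func strain(fromVolts v: Double) -> Double {
        min(max(v / voltsPerPercentStrain, 0), maxStrainPercent)
    }
}

// Map strain to a joint angle (0° straight to 90° fully bent).
func jointAngle(strainPercent: Double, fullBendStrain: Double = 40) -> Double {
    min(strainPercent / fullBendStrain, 1) * 90
}

let sensor = KirigamiSensor()
let s = sensor.strain(fromVolts: 0.5) // 25% strain under this calibration
print("strain:", s, "% -> joint angle:", jointAngle(strainPercent: s), "°")
```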

Canada-based ecommerce company Shopify has demonstrated an unexpected use of Apple’s RoomPlan API to clear a room before filling it with furniture via Apple AR.

Apple announced RoomPlan at WWDC 2022, but by itself this AR technology will not appear in any Apple app. Instead, RoomPlan is an API that developers can tap into in order to provide its features as part of their own apps.

It’s been anticipated that retailers could use the RoomPlan API in order to show customers what particular items of, say, furniture would look like in their home. Ikea has already been doing this since shortly after Apple announced ARKit in 2017, but RoomPlan leverages the newer LiDAR technology.
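For developers curious what “tapping into” RoomPlan looks like, here is a minimal sketch using Apple’s documented RoomPlan API (iOS 16+, LiDAR-equipped devices). It is not Shopify’s implementation; UI and error handling are abbreviated.

```swift
import Foundation
import RoomPlan

// Minimal sketch of using RoomPlan: run a capture session, then inspect
// the detected surfaces and objects once scanning ends.
final class RoomScanner: NSObject, RoomCaptureSessionDelegate {
    private let session = RoomCaptureSession()

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    func stop() { session.stop() }

    // Called when scanning ends; build the final parametric room model.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData, error: Error?) {
        guard error == nil else { return }
        Task {
            let room = try await RoomBuilder(options: [.beautifyObjects])
                .capturedRoom(from: data)
            // Each detected object has a category, size, and transform:
            // enough to mask real furniture and place virtual pieces instead.
            for object in room.objects {
                print(object.category, object.dimensions)
            }
            // Export as USDZ for use in other 3D and AR tooling.
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("room.usdz")
            try room.export(to: url)
        }
    }
}
```

Shopify’s demo takes the idea a step further, using the captured room geometry to visually erase the existing furniture before placing virtual pieces in the emptied space.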