
Spatial has raised $25 million as it pivots away from augmented reality and virtual reality collaboration to nonfungible token (NFT) art exhibitions and metaverse events.

Spatial started out by providing AR/VR meeting places that people could access with AR glasses, VR headsets, and smartphones. But it found with the NFT art boom that it could provide a way for people to easily view digital art in virtual galleries, said Jake Steinerman, head of community at Spatial, in an interview with GamesBeat.

“We’re changing our direction,” said Steinerman.

Swiss technology company WayRay has unveiled what it says is the world’s first car to incorporate holographic, augmented reality glazing – the Holograktor. The innovation is designed around the company’s True AR technology and intended to operate with WayRay’s new ride-hailing business model. The system’s USP is its ability to render augmented reality scenes around the vehicle in real time, displayed via holographic projections.

Backed by early investments from companies like Porsche, Hyundai and Alibaba, WayRay says it is using the car to emerge from its ‘deep tech’ automotive supplier status to become a player in the world of new mobility models.

The three-seat vehicle has been conceived specifically for ride hailing and can be driven conventionally or by remote control, in the latter case via a 5G and satellite connection to a qualified driver. Its unusual single rear seat ‘throne’ layout was inspired by data showing that more than 80% of Uber trips were for one person only. “The idea is that you can choose Uber Black, Uber SUV or Uber Holograktor. And if you choose the Holograktor, your ride will be subsidized by sponsored content so that the price will be much lower,” said WayRay founder and CEO, Vitaly Ponomarev.

In the novel-turned-movie Ready Player One by Ernest Cline, the protagonist escapes to an online realm aptly called OASIS. Instrumental to the OASIS experience is his haptic (relating to sense of touch) bodysuit, which enables him to move through and interact with the virtual world with his body. He can even activate tactile sensations to feel every gut punch, or a kiss from a badass online girl.

While no such technology is commercially available yet, Meta, the company formerly known as Facebook, is in the early stages of creating haptic gloves to bring the virtual world to our fingertips. These gloves have been in the works for the past seven years, the company recently said, and there are still a few more years to go.

These gloves would allow the wearer to not only interact with and control the virtual world, but experience it in a way similar to how one experiences the physical world. The wearer would use the gloves in tandem with a headset for AR or VR. A video posted by Meta in a blog shows two users having a remote thumb-wrestling match. In their VR headsets, they see a pair of disembodied hands reflecting the motions that their own hands are making. In their gloves, they feel every squeeze and twitch of their partner’s hand—at least that’s the idea.

Gesture interface company Leap Motion is announcing an ambitious, but still very early, plan for an augmented reality platform based on its hand tracking system. The system is called Project North Star, and it includes a design for a headset that Leap Motion claims costs less than $100 at large-scale production. The headset would be equipped with a Leap Motion sensor, so users could precisely manipulate objects with their hands — something the company has previously offered for desktop and VR displays.

Project North Star isn’t a new consumer headset, nor will Leap Motion be selling a version to developers at this point. Instead, the company is releasing the necessary hardware specifications and software under an open source license next week. “We hope that these designs will inspire a new generation of experimental AR systems that will shift the conversation from what an AR system should look like, to what an AR experience should feel like,” the company writes.

The headset design uses two fast-refreshing 3.5-inch LCD displays with a resolution of 1600×1440 per eye. The displays reflect their light onto a visor that the user perceives as a transparent overlay. Leap Motion says this offers a field of view that’s 95 degrees high and 70 degrees wide, larger than most AR systems that exist today. The Leap Motion sensor fits above the eyes and tracks hand motion across a far wider field of view, around 180 degrees horizontal and vertical.
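From the panel resolution and field of view quoted above, one can estimate the design's angular resolution (pixels per degree). The sketch below assumes the pixels are spread uniformly across the stated field of view, which is a simplification; real reflective optics distort the mapping somewhat:

```python
# Rough angular resolution of the Project North Star design,
# using only the figures quoted in the text.
h_pixels, v_pixels = 1600, 1440      # per-eye panel resolution (wide x high)
h_fov_deg, v_fov_deg = 70, 95        # quoted field of view (wide x high)

h_ppd = h_pixels / h_fov_deg         # horizontal pixels per degree
v_ppd = v_pixels / v_fov_deg         # vertical pixels per degree
print(f"~{h_ppd:.1f} px/deg horizontal, ~{v_ppd:.1f} px/deg vertical")
```

Roughly 23 pixels per degree horizontally, well below the ~60 px/deg often cited as the limit of human visual acuity, which illustrates the resolution-versus-field-of-view trade-off wide-FOV AR designs face.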

Over the past several decades, researchers have moved from using electric currents to manipulating light waves in the near-infrared range for telecommunications applications such as high-speed 5G networks, biosensors on a chip, and driverless cars. This research area, known as integrated photonics, is fast evolving and investigators are now exploring the shorter—visible—wavelength range to develop a broad variety of emerging applications. These include chip-scale LIDAR (light detection and ranging), AR/VR/MR (augmented/virtual/mixed reality) goggles, holographic displays, quantum information processing chips, and implantable optogenetic probes in the brain.

The one device critical to all these applications is an optical phase modulator, which controls the phase of a light wave, similar to how the phase of radio waves is modulated in wireless computer networks. With a phase modulator, researchers can build an on-chip optical switch that channels light into different waveguide ports. With a large network of these optical switches, researchers could create sophisticated integrated optical systems that could control light propagating on a tiny chip or light emission from the chip.
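The phase-to-switching relationship described above can be illustrated with the standard textbook model of a 2x2 Mach-Zehnder switch (a phase shifter between two 50/50 couplers); this is a generic sketch of the principle, not the specific devices the researchers built. For a lossless device, the power leaving each output port depends only on the phase shift applied in one arm:

```python
import numpy as np

def mzi_ports(phi):
    """Output powers of an ideal, lossless 2x2 Mach-Zehnder switch
    for a phase shift phi (radians) in one arm.

    Standard result: bar port = sin^2(phi/2), cross port = cos^2(phi/2),
    so phi = 0 routes all light to the cross port and phi = pi
    routes it all to the bar port.
    """
    bar = np.sin(phi / 2) ** 2
    cross = np.cos(phi / 2) ** 2
    return bar, cross

for phi in (0.0, np.pi / 2, np.pi):
    bar, cross = mzi_ports(phi)
    print(f"phi = {phi:.3f} rad -> bar = {bar:.2f}, cross = {cross:.2f}")
```

A network of such elements, each steered by its own phase modulator, is exactly the kind of programmable light-routing fabric the passage describes; the difficulty in the visible range is making each phase shifter small and low-power.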

But phase modulators in the visible range are very hard to make: there are no materials that are transparent enough in the visible spectrum while also providing large tunability, either through thermo-optical or electro-optical effects. Currently, the two most suitable materials are silicon nitride and lithium niobate. While both are highly transparent in the visible range, neither one provides very much tunability. Visible-spectrum phase modulators based on these materials are thus not only large but also power-hungry: the length of individual waveguide-based modulators ranges from hundreds of microns to several millimeters, and a single modulator consumes tens of milliwatts for phase tuning. Researchers trying to achieve large-scale integration—embedding thousands of devices on a single microchip—have, up to now, been stymied by these bulky, energy-consuming devices.

Working at the intersection of hardware and software engineering, researchers are developing new techniques for improving 3D displays for virtual and augmented reality technologies.

Virtual and augmented reality headsets are designed to place wearers directly into other environments, worlds and experiences.

While the technology is already popular among consumers for its immersive quality, there could be a future where the holographic displays look even more like real life. In its own pursuit of these better displays, the Stanford Computational Imaging Lab has combined its expertise in optics and artificial intelligence. The lab's most recent advances in this area are detailed in a paper published in Science Advances and in work that will be presented at SIGGRAPH ASIA 2021 in December.

In the wee morning hours of Tuesday (Nov. 16), the seven-person crew of the International Space Station (ISS) awoke in alarm. A Russian missile test had just blasted a decommissioned Kosmos spy satellite into more than 1,500 pieces of space debris — some of which were close enough to the ISS to warrant emergency collision preparations.

The four Americans, one German and two Russian cosmonauts aboard the station were told to shelter in the transport capsules that brought them to the ISS, while the station passed by the debris cloud several times over the following hours, according to NASA.

Ultimately, Tuesday ended without any reported damage or injury aboard the ISS, but the crew’s precautions — and the NASA administrator’s stern response to Russia — were far from an overreaction. Space debris like the kind created in the Kosmos break-up can travel at more than 17,500 mph (28,000 km/h), NASA says — and even a scrap of metal the size of a pea can become a potentially deadly missile in low-Earth orbit. (For comparison, a typical bullet discharged from an AR-15 rifle travels at just over 2,200 mph, or 3,500 km/h).
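A back-of-the-envelope kinetic-energy comparison makes the bullet analogy concrete. The masses below are assumptions for illustration (roughly 1 g for a pea-sized metal scrap and 4 g for a typical 5.56 mm rifle bullet); the speeds are the ones quoted above:

```python
# Kinetic energy comparison: pea-sized orbital debris vs. a rifle bullet.
# Masses are illustrative assumptions; speeds come from the article.
def kinetic_energy_j(mass_kg, speed_kmh):
    """Kinetic energy in joules: E = 1/2 * m * v^2."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return 0.5 * mass_kg * v ** 2

debris = kinetic_energy_j(0.001, 28000)  # ~1 g scrap at orbital speed
bullet = kinetic_energy_j(0.004, 3500)   # ~4 g bullet at muzzle velocity
print(f"debris: {debris:.0f} J, bullet: {bullet:.0f} J, "
      f"ratio: {debris / bullet:.1f}x")
```

Even at a quarter of the bullet's mass, the debris fragment carries on the order of fifteen times the bullet's energy, because kinetic energy scales with the square of speed.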


Facebook’s vision of the metaverse has been criticized by both consumers and other companies for its dystopian outlook. But Niantic, one of the most prominent augmented reality companies in the world, has shown a far more appealing vision of the metaverse: one in which the real world is augmented rather than completely replaced, as in Meta’s version. Niantic’s Lightship platform and future augmented reality glasses are meant to offer a look into a future where privacy and social interaction are of utmost importance, avoiding the dystopian nightmare scenario. Let’s see what companies such as Apple or Niantic make of this.

TIMESTAMPS:
00:00 The unfortunate fate of the Metaverse.
02:01 What is this future going to look like?
03:59 Facebook’s Creepy Vision of the Workplace.
06:29 A possible solution by Niantic.
08:35 Last Words.


Apple and Meta are on a collision course around wearables, AR/VR headsets, and home devices. Also: Netflix and Apple mend fences around billing, Tim Cook talks cryptocurrency, and a new Apple Store is coming to Los Angeles. Finally, the App Store is dealt a loss in court.

For the past decade or so, Apple Inc.’s chief rival was considered to be Google. The two have gone toe-to-toe in smartphones, mobile operating systems, web services and home devices.

The next decade, however, could be defined by Apple’s rivalry with another Silicon Valley giant: Meta Platforms Inc.—the company known to everyone other than its own brand consultants as Facebook.