
On Monday, Apple is more than likely going to reveal its long-awaited mixed reality headset, reportedly called the Reality Pro, during the keynote of its annual WWDC developer conference in California. It’s an announcement that has been tipped and teased for years, and reporting suggests the project has at various times been subject to delays, internal skepticism and debate, technical challenges and more. Apple’s own efforts aside, the world’s overall attitude toward AR and VR has shifted considerably, from optimism to skepticism.

Part of that trajectory is just the natural progression of any major tech hype cycle, and you could easily argue that the time to make the most significant impact in any such cycle is after the spike of undue optimism and energy has subsided. But in the case of AR and VR, we’ve actually already seen some of the tech giants with the deepest pockets take their best shots and come up wanting — not for lack of trying, but because of limitations in what’s possible even at the bleeding edge of available tech. Some of those limits might be endemic to AR and VR, too, because of natural variation in human physiology, the other half of the equation required to make mixed reality magic happen.

The virtual elephant in the room is, of course, Meta. The name itself pretty much sums up the situation: Facebook founder Mark Zuckerberg read a bad book and decided that VR was the inevitable end state of human endeavor — the mobile moment he essentially missed out on, but even bigger and better. Zuckerberg grew enamored of this vision, first acquiring crowdfunded VR darling Oculus, then eventually commandeering the sobriquet for a shared virtual universe from the dystopian predictions of a better book and renaming all of Facebook after it.

Alright, Apple. Pack it up, show’s over. Mark Zuckerberg just dropped a surprise announcement, unveiling the Meta Quest 3 virtual reality headset. The reveal comes just *checks calendar* four days before Apple is set to officially announce its high-end mixed reality headset. So much for Apple’s headset hype, I guess.

All kidding aside, Meta bought its way into virtual reality years ago through Oculus when Zuck’s company was still trading under the name Facebook. Meta certainly has more experience selling VR headsets and throwing VR parties than Apple – for now, at least.

Like the Meta Quest 2, the newly unveiled Meta Quest 3 is in another universe when it comes to pricing. Apple’s “Reality Pro” headset is expected to carry a price tag of around $3,000. Meta’s higher-end Quest Pro goes for $999, and the new Meta Quest 3 is priced from $499.


3DFY.ai announced the launch of 3DFY Prompt, a generative AI that lets developers and creators build 3D models based on text prompts.

The Tel Aviv, Israel-based company said the tech democratizes professional-quality 3D model creation, enabling anyone to use text prompts to create high-quality models that can be used in gaming, design or virtual environments.

Sight and sound aren’t the only senses VR can engage. Multiple companies are working on haptic devices, like gloves or vests, to add a sense of touch to virtual experiences. And now, researchers are aiming to integrate a fourth sense: smell.

How much more real might that peaceful meadow feel if you could smell the wildflowers and the damp earth around you? How might the scent of an ocean breeze amplify a VR experience that takes place on a boat or a beach?

Scents have a powerful effect on the brain, eliciting emotions, memories, and sometimes even fight-or-flight responses. You may feel nostalgic at the scent of the cologne or perfume a favorite grandparent wore, comforted by a whiff of a favorite food, or extra-alert to your surroundings if it smells like something’s burning.

Meta’s new models could pave the way for speech apps in many more languages than exist now.

Meta has built AI models that can recognize and produce speech for more than 1,000 languages — a tenfold increase over what’s currently available. It’s a significant step toward preserving languages that are at risk of disappearing, the company says.

Meta is releasing its models to the public via the code hosting service GitHub. It claims that making them open source will help developers working in different languages to build new speech applications—like messaging services that understand everyone, or virtual-reality systems that can be used in any language.

Apple Inc.’s annual WWDC developers’ conference is fast approaching — and this one promises to be a bit more eventful than most, as the consumer-electronics giant is expected to debut its long-awaited mixed-reality headset.

The company hasn’t made a major product introduction since it rolled out the Apple Watch in 2015, but with an expensive headset supporting augmented and virtual reality, Apple will be entering a market that has so far failed to catch on in a mainstream way. Meta Platforms Inc. has bet big on virtual reality with its Oculus headsets, with only limited traction.

Artificial intelligence (AI) and the metaverse are some of the most captivating technologies of the 21st century so far. Both are believed to have the potential to change many aspects of our lives, disrupt different industries, and enhance the efficiency of traditional workflows. While these two technologies are often looked at separately, they’re more connected than we may think. Before we explore the relationship between AI and the metaverse, let’s start by defining both terms.

The metaverse is a concept describing a hypothetical future design of the internet. It features an immersive, 3D online world where users are represented by custom avatars and access information with the help of virtual reality (VR), augmented reality (AR), and similar technologies. Instead of reaching the internet through a screen alone, users enter the metaverse through a blend of physical and digital experience. The metaverse will enable people to socialize, play, and work alongside others in different 3D virtual spaces.

A similar arrangement was described in Neal Stephenson’s 1992 science-fiction novel Snow Crash. While it was perceived as pure fantasy a mere three decades ago, it seems like it could become a reality sooner rather than later. Although the metaverse isn’t fully in existence yet, some online platforms incorporate elements of it. For example, video games like Fortnite and Horizon Worlds port multiple elements of our day-to-day lives into the online world.

The demo is clever, questionably real, and prompts a lot of questions about how this device will actually work.

Buzz has been building around the secretive tech startup Humane for over a year, and now the company is finally offering a look at what it’s been building. At TED last month, Humane co-founder Imran Chaudhri gave a demonstration of the AI-powered wearable the company is building as a replacement for smartphones. Bits of the video leaked online after the event, but the full video is now available to watch.

The device appears to be a small black puck that slips into your breast pocket, with a camera, projector, and speaker sticking out the top. Throughout the 13-minute presentation, Chaudhri walks through a handful of use cases for Humane’s gadget:

* The device rings when Chaudhri receives a phone call. He holds his hand up, and the device projects the caller’s name along with icons to answer or ignore the call. He then has a brief conversation. (Around 1:48 in the video)
* He presses and holds one finger on the device, then asks a question about where he can buy a gift. The device responds with the name of a shopping district. (Around 6:20)
* He taps two fingers on the device, says a sentence, and the device translates the sentence into another language, stating it back using an AI-generated clone of his voice. (Around 6:55)
* He presses and holds one finger on the device, says, “Catch me up,” and it reads out a summary of recent emails, calendar events, and messages. (At 9:45)
* He holds a chocolate bar in front of the device, then presses and holds one finger on it while asking, “Can I eat this?” The device recommends he does not because of a food allergy he has. He presses down one finger again and tells the device he’s ignoring its advice. (Around 10:55)

Chaudhri, who previously worked on design at Apple for more than two decades, pitched the device as a salve for a world covered in screens. “Some believe AR / VR glasses like these are the answer,” he said, an image of VR headsets behind him. He argued those devices — like smartphones — put “a further barrier between you and the world.”

Humane’s device, whatever it’s called, is designed to be more natural by eschewing the screen. The gadget operates on its own. “You don’t need a smartphone or any other device to pair with it,” he said.

From 2021

A new method called tensor holography could enable the creation of holograms for virtual reality, 3D printing, medical imaging, and more — and it can run on a smartphone.
