
Peter Ness: Why is he so eager for Biden’s approval? Doesn’t he have parents?

Eric Klien, Admin.

Peter Ness: Biden keeps saying that GM is the U.S. leader in EVs, and this ticks off Elon. In fact, the old GM plant in California that Tesla bought for a mere $42 million produced more cars than any other U.S. factory last year. (All EVs, of course.)

Genevieve Klien shared a link.


While the second-largest cryptocurrency gets ready for its highly anticipated upgrade, platforms keep migrating to other networks.


Surgeries require a lot of planning, practice, and precision. Doctors cannot afford to get distracted or lose focus when operating on a person. The use of AI in surgery aims to support doctors and supply them with the necessary information and surgical tools without disturbing them at any point.

Mixed reality makes it possible to use technology to assist doctors during surgeries and minimize risks.

Paul Milgram and Fumio Kishino first introduced the term mixed reality in 1994 in their paper titled A Taxonomy of Mixed Reality Visual Displays. MR combines computer vision, cloud computing, graphical processing, etc., to blend the physical and virtual worlds. Many companies have been developing MR applications that can be used in various industries.

Meta has announced the first phase of a new AI supercomputer. Once the AI Research SuperCluster (RSC) is fully built out later this year, the company believes it will be the fastest AI supercomputer on the planet, capable of “performing at nearly 5 exaflops of mixed precision compute.”

The company says RSC will help researchers develop better AI models that can learn from trillions of examples. Among other things, the models will be able to build better augmented reality tools and “seamlessly analyze text, images and video together,” according to Meta. Much of this work is in service of its vision for the metaverse, in which it says AI-powered apps and products will have a key role.

“We hope RSC will help us build entirely new AI systems that can, for example, power real-time voice translations to large groups of people, each speaking a different language, so they can seamlessly collaborate on a research project or play an AR game together,” technical program manager Kevin Lee and software engineer Shubho Sengupta wrote.

Google’s AR headsets, internally codenamed Project Iris, are expected to be released in 2024. Its device uses “outward-facing cameras to blend computer graphics with a video feed of the real world, creating a more immersive, mixed reality experience than existing AR glasses.” The hardware is “powered by a custom Google processor, like its newest Google Pixel smartphone, and runs on Android, though recent job listings indicate that a unique OS is in the works.”
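The passthrough approach described above, where computer graphics are blended with a video feed of the real world, can be sketched as a simple per-pixel alpha blend. This is an illustrative simplification (the function and variable names are mine, not Google's); real headsets also handle depth occlusion, latency compensation, and lens distortion.

```python
import numpy as np

def composite_passthrough(camera_frame, virtual_layer, alpha_mask):
    """Blend a rendered virtual layer over a camera frame.

    camera_frame, virtual_layer: float arrays of shape (H, W, 3) in [0, 1].
    alpha_mask: float array of shape (H, W, 1); 1.0 = fully virtual pixel,
    0.0 = pure camera feed.
    """
    return alpha_mask * virtual_layer + (1.0 - alpha_mask) * camera_frame

# Toy example: a 2x2 camera frame with a virtual object covering the left column.
frame = np.zeros((2, 2, 3))       # black camera feed
virtual = np.ones((2, 2, 3))      # white virtual object
alpha = np.zeros((2, 2, 1))
alpha[:, 0] = 1.0                 # object occupies the left column

out = composite_passthrough(frame, virtual, alpha)
```

The left column of `out` comes entirely from the virtual layer and the right column entirely from the camera feed, which is the basic idea behind video-passthrough mixed reality.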

Google Glass, the company’s prior foray into AR, didn’t gain widespread consumer interest or adoption. According to The Verge, work on the project has recently begun to pick up speed, though there isn’t yet a “clearly defined go-to-market strategy.” Google is keeping the project secret, requiring “special keycard access” and “non-disclosure agreements.”

Facebook said it would hire around 10,000 workers around the world to build the metaverse and related products. A search on LinkedIn’s job board for “metaverse” shows thousands of listings. If you’re looking for fast-growing opportunities, you may want to consider pivoting into virtual and augmented reality roles and related opportunities in the metaverse.

Major brands are also getting into the NFT mix, including Dolce & Gabbana, Coca-Cola, Adidas, and Nike. In the future, when you buy a physical world item from a company, you might also gain ownership of a linked NFT in the metaverse.

For example, when you buy that coveted name-brand outfit to wear to the real-world dance club, you might also become the owner of the crypto version of the outfit that your avatar can wear to the virtual Ariana Grande concert. And just as you could sell the physical outfit secondhand, you could also sell the NFT version for someone else’s avatar to wear.

These are a few of the many ways that metaverse business models will likely overlap with the physical world. Such examples will get more complex as augmented reality technologies increasingly come into play, further merging aspects of the metaverse and physical world. Although the metaverse proper isn’t here yet, technological foundations like blockchain and crypto assets are steadily being developed, setting the stage for a seemingly ubiquitous virtual future that is coming soon to a ‘verse near you.

Bernard Kress, principal optical architect on Microsoft’s HoloLens team, has left the company to take on the role of Director of XR Engineering at the recently formed Google Labs. A report by The Verge maintains Google is also now gearing up to produce an AR headset that could directly compete with similar offerings from the likes of Apple and Meta.

Before joining Microsoft in 2015, Kress worked as principal optical architect behind Google Glass, the company’s smartglasses that found marked success in the enterprise sector after a rocky reception by consumers in 2013.

At Microsoft, Kress continued his work—principally focused on micro-optics, wafer scale optics, holography and nanophotonics—as partner optical architect on the HoloLens team, overseeing the release of both HoloLens and HoloLens 2.

Prager Metis has become the first CPA firm to open up a metaverse headquarters. The firm, which in real life is based in New York, is setting up shop in Decentraland, a 3D virtual world, as part of a joint venture with Banquet LLC, a metaverse studio.

On Dec. 28, the firm purchased its piece of virtual real estate, a three-story digital structure. The first floor features an open plan that doubles as a gallery space for nonfungible tokens from Prager Metis clients, along with a large entertainment area. The second floor will provide working space with meeting rooms and conference capabilities. The third floor will serve as a rooftop space where Prager Metis intends to host events and even live entertainment.

The metaverse has been attracting attention ever since Facebook’s parent company announced a name change last October to Meta to highlight its interest in developing technology for virtual reality and augmented reality. More businesses have followed suit in setting up shop in the metaverse.

Prager Metis isn’t the first firm to dip its toes in the waters: PricewaterhouseCoopers’ Hong Kong firm announced last month that it had bought virtual land on another metaverse platform, the Sandbox, but Prager Metis is going further by setting up an actual headquarters in Decentraland. It plans to focus on advisory services for clients and potentially for other accounting firms as well. The firm already has clients who have entered the rapidly growing market for nonfungible tokens, or NFTs, which use blockchain technology to create collectibles and artwork that people bid on to buy and trade.

Fittingbox’s Frame Removal uses diminished reality to help people pick out new eyeglasses — but the tech’s potential extends far beyond the bridge of your nose.


French company Fittingbox has just unveiled an app that uses a technology called “diminished reality” — the opposite of augmented reality (AR).

The challenge: Many eyeglass sellers now let you try on specs virtually — just pick a pair off a website, look into the camera on your phone or computer, and thanks to the magic of augmented reality (AR), you can see what the frames look like on your face.
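Diminished reality inverts the AR try-on trick: instead of adding virtual content, it removes real content (your existing glasses) from the frame so the virtual pair can be shown cleanly. A toy sketch of the idea, assuming a grayscale image and a naive neighbor-averaging fill (production systems like Fittingbox's presumably use far more sophisticated, temporally coherent video inpainting):

```python
import numpy as np

def erase_region(image, mask, iters=50):
    """Naively 'diminish' a masked region by repeatedly averaging neighbors.

    image: float array (H, W); mask: bool array (H, W), True where the
    object to remove sits. Masked pixels are iteratively replaced with
    the mean of their 4-neighborhood, diffusing surrounding texture in.
    """
    img = image.copy()
    img[mask] = 0.0
    for _ in range(iters):
        p = np.pad(img, 1, mode="edge")
        avg = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
        img[mask] = avg[mask]
    return img

# A flat grey 'face' with one bright 'frame' pixel we want to remove.
img = np.full((5, 5), 0.5)
img[2, 2] = 1.0
mask = np.zeros((5, 5), dtype=bool)
mask[2, 2] = True
restored = erase_region(img, mask)
```

After a few iterations the masked pixel converges to the surrounding gray, i.e. the "removed" object blends into its background.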

Nreal is a China-based startup behind the Nreal Light AR glasses, which aim for a sunglasses-like design. Hooked up to your (Android) phone, they can project virtual objects into your real environment and even let you walk around with position tracking. While we’re not quite there yet, I think the Nreal Light is definitely getting us closer to fully fledged AR glasses.

Cas & Chary Present

Cas and Chary VR is a YouTube channel hosted by Netherlands-based duo Casandra Vuong and Chary Keijzer, who have been documenting their VR journeys since 2016. They share a curated selection of their content with extra insights for the Road to VR audience.

Of course, a minimum level of fidelity is required, but what’s far more important is perceptual consistency. By this, I mean that all sensory signals (i.e. sight, sound, touch, and motion) feed a single mental model of the world within your brain. With augmented reality, this can be achieved with relatively low visual fidelity, as long as virtual elements are spatially and temporally registered to your surroundings in a convincing way. And because our sense of distance (i.e. depth perception) is relatively coarse, it’s not hard for this to be convincing.
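The spatial registration described above boils down to projecting each virtual 3D point into the camera image using the tracked head pose and the camera intrinsics, every frame, with low latency. A minimal sketch of that projection step (standard pinhole model; the pose and intrinsics values here are made up for illustration):

```python
import numpy as np

def project_point(p_world, R, t, K):
    """Project a virtual 3D point into the camera image.

    p_world: 3-vector in world coordinates.
    R (3x3), t (3,): camera pose (world -> camera transform).
    K (3x3): pinhole intrinsics. Returns (u, v) pixel coordinates.
    """
    p_cam = R @ p_world + t          # world point in camera coordinates
    uvw = K @ p_cam                  # homogeneous image coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Identity pose, 500 px focal length, principal point at (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)

u, v = project_point(np.array([0.0, 0.0, 2.0]), R, t, K)  # point 2 m ahead
```

If `R` and `t` are updated from head tracking each frame, the virtual point stays pinned to the same real-world location, which is exactly the spatial registration that makes low-fidelity AR content feel convincing.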

But for virtual reality, providing a unified sensory model of the world is much harder. This might sound surprising because it’s far easier for VR hardware to provide high-fidelity visuals without lag or distortion. But unless you’re using elaborate and impractical hardware, your body will be sitting or standing still while most virtual experiences involve motion. This inconsistency forces your brain to build and maintain two separate models of your world — one for your real surroundings and one for the virtual world that is presented in your headset.

When I tell people this, they often push back, forgetting that regardless of what’s happening in their headset, their brain still maintains a model of their body sitting on their chair, facing a particular direction in a particular room, with their feet touching the floor (etc.). Because of this perceptual inconsistency, your brain is forced to maintain two mental models. There are ways to reduce the effect, but it’s only when you merge real and virtual worlds into a single consistent experience (i.e. foster a unified mental model) that this truly gets solved.