
Microsoft is entering the race to build a metaverse inside Teams, just days after Facebook rebranded to Meta in a push to build virtual spaces for both consumers and businesses. Microsoft is bringing Mesh, a collaborative platform for virtual experiences, directly into Microsoft Teams next year. It’s part of a big effort to combine the company’s mixed reality and HoloLens work with meetings and video calls that anyone can participate in thanks to animated avatars.

With today’s announcement, Microsoft and Meta seem to be on a collision course to compete heavily in the metaverse, particularly for the future of work.

Microsoft Mesh always felt like the future of Microsoft Teams meetings, and now it’s starting to come to life in the first half of 2022. Microsoft is building on efforts like Together Mode and other experiments for making meetings more interactive, after months of people working from home and adjusting to hybrid work.

Apple is looking into changing how you view AR (augmented reality) altogether… literally. Instead of projecting an image onto a lens in front of the eye, as current AR headsets and glasses do, Apple envisions beaming the image directly onto the user’s eyeball.

Apple recently unveiled its upcoming lineup of new products. What it did not showcase, however, was revealed in a recent patent: Apple is researching how it could change how we see AR, and the future of its “Apple Glass” product, if one ever comes to exist. The patent describes moving away from the traditional approach of projecting an image onto a lens and instead projecting the image directly onto the wearer’s retina, using micro projectors.

The issue Apple is trying to avoid is the nausea and headaches some people experience while viewing AR and VR (virtual reality). The patent describes the problem as “accommodation-convergence mismatch,” which causes eyestrain for some users. Apple hopes that its “Direct Retinal Projector” can alleviate those symptoms and make AR and VR accessible to more users.
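The patent does not include worked numbers, but the mismatch is easy to quantify: the eyes converge on a virtual object’s apparent distance while the lens must keep focusing on the display’s fixed focal plane. Below is a minimal Python sketch, assuming an illustrative interpupillary distance and a 2 m focal plane (both are assumptions, not figures from the patent), showing the conflict a retinal projector aims to remove:

```python
# Minimal sketch (not Apple's method): quantifying the accommodation-convergence
# mismatch for a conventional fixed-focus headset. All numbers are illustrative.
import math

IPD_M = 0.063          # assumed interpupillary distance in meters
FOCAL_PLANE_M = 2.0    # assumed fixed optical focal distance of the display

def vergence_deg(distance_m: float) -> float:
    """Angle the two eyes converge through to fixate a point at the given distance."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_diopters(distance_m: float) -> float:
    """Focusing demand on the eye's lens, in diopters (1 / meters)."""
    return 1.0 / distance_m

for virtual_m in (0.5, 1.0, 2.0, 4.0):
    mismatch = accommodation_diopters(virtual_m) - accommodation_diopters(FOCAL_PLANE_M)
    print(f"virtual object at {virtual_m} m: "
          f"eyes converge {vergence_deg(virtual_m):.1f} deg, "
          f"focus stays at {FOCAL_PLANE_M} m -> mismatch {mismatch:+.2f} D")
```

The larger the diopter mismatch, the more the visual system receives conflicting depth cues, which is the eyestrain mechanism the patent targets.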

DigiLens has raised funding from Samsung Electronics in a round that values the augmented reality smart glasses maker at more than $500 million.

Sunnyvale, California-based DigiLens did not say exactly how much it raised for the development of its extended reality (XR) glasses, which will offer AR features such as overlaying digital images on what you see.

DigiLens CEO Chris Pickett said in a previous interview with VentureBeat that the latest smart glasses are more advanced than models the company showed in 2019.

For people with motor impairments or physical disabilities, completing daily tasks and house chores can be incredibly challenging. Recent advancements in robotics, such as brain-controlled robotic limbs, have the potential to significantly improve their quality of life.

Researchers at Hebei University of Technology and other institutes in China have developed an innovative system for controlling robotic arms that is based on augmented reality (AR) and a brain-computer interface. This system, presented in a paper published in the Journal of Neural Engineering, could enable the development of bionic or prosthetic arms that are easier for users to control.

“In recent years, with the development of robotic arms, brain science and information decoding technology, brain-controlled robotic arms have attained increasing achievements,” Zhiguo Luo, one of the researchers who carried out the study, told TechXplore. “However, disadvantages like poor flexibility restrict their widespread application. We aim to promote the lightweight and practicality of brain-controlled robotic arms.”


Timestamps:
00:00 Intro
01:04 Brief History of Facebook
04:47 VR & AR Today
14:21 Mark Zuckerberg’s Master Plan
23:19 Support Perhaps?

Links (In order of appearance):

October 28, 2021: Mark Zuckerberg talking at Facebook Connect: https://fb.watch/8X18pssy2q/

In Optica, The Optical Society’s (OSA) journal for high impact research, Qiu and colleagues describe a new approach for digitizing color. It can be applied to cameras and displays — including ones used for computers, televisions and mobile devices — and used to fine-tune the color of LED lighting.

“Our new approach can improve today’s commercially available displays or enhance the sense of reality for new technologies such as near-eye-displays for virtual reality and augmented reality glasses,” said Jiyong Wang, a member of the PAINT research team. “It can also be used to produce LED lighting for hospitals, tunnels, submarines and airplanes that precisely mimics natural sunlight. This can help regulate circadian rhythm in people who are lacking sun exposure, for example.”
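The report does not spell out the PAINT team’s encoding scheme, but for context, conventional color digitization reduces a measured spectrum to tristimulus values by weighting it with color-matching functions. The sketch below is a generic illustration of that standard pipeline, not the new approach; the Gaussian curves standing in for the CIE matching functions are rough illustrative assumptions, not calibrated data.

```python
# Generic illustration of spectral-to-tristimulus color digitization;
# NOT the PAINT method described in the Optica paper.
import numpy as np

wavelengths = np.arange(380, 781, 5)  # visible range, nm

def gaussian(lam, center, width):
    return np.exp(-0.5 * ((lam - center) / width) ** 2)

# Rough Gaussian stand-ins for the CIE x, y, z color-matching functions
# (illustrative shapes only, not the standardized tables).
x_bar = gaussian(wavelengths, 600, 40) + 0.35 * gaussian(wavelengths, 445, 20)
y_bar = gaussian(wavelengths, 555, 45)
z_bar = 1.7 * gaussian(wavelengths, 450, 25)

def to_chromaticity(spectrum):
    """Integrate a spectral power distribution against the matching functions."""
    X = np.trapz(spectrum * x_bar, wavelengths)
    Y = np.trapz(spectrum * y_bar, wavelengths)
    Z = np.trapz(spectrum * z_bar, wavelengths)
    total = X + Y + Z
    return X / total, Y / total  # normalized chromaticity-like coordinates

# Example: a narrowband LED centered at 530 nm (green-ish)
led_spectrum = gaussian(wavelengths, 530, 15)
print(to_chromaticity(led_spectrum))
```

Finer-grained or more perceptually faithful encodings at this stage are what let displays and LED lighting reproduce color more precisely.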

The discovery demonstrates a practical method to overcome current challenges in the manufacture of indium gallium nitride (InGaN) LEDs with considerably higher indium concentration, through the formation of quantum dots that emit long-wavelength light.


Indium gallium nitride (InGaN) LEDs, a type of group-III nitride-based light-emitting diode (LED), were first fabricated over two decades ago in the 1990s and have since become ever smaller while growing increasingly powerful, efficient, and durable. Today, InGaN LEDs can be found across a myriad of industrial and consumer use cases, including signals, optical communication, and data storage, and are critical in high-demand consumer applications such as solid-state lighting, television sets, laptops, mobile devices, and augmented (AR) and virtual reality (VR) solutions.

Ever-growing demand for such electronic devices has driven over two decades of research into achieving higher optical output, reliability, longevity and versatility from semiconductors – leading to the need for LEDs that can emit different colors of light. Traditionally, InGaN material has been used in modern LEDs to generate purple and blue light, with aluminum gallium indium phosphide (AlGaInP) – a different type of semiconductor – used to generate red, orange, and yellow light. This is because InGaN performs poorly in the red and amber spectrum: its efficiency drops as the indium content required for longer wavelengths increases.
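The link between indium content and emission color follows from the alloy’s bandgap, which can be estimated with a standard Vegard-type interpolation plus a bowing correction. The sketch below uses commonly quoted room-temperature values; the exact bandgaps and bowing parameter vary across the literature and are assumptions here, not figures from the article.

```python
# Rough estimate of InGaN emission wavelength vs. indium fraction, using a
# Vegard's-law interpolation with a bowing term. Parameter values are
# commonly cited approximations, not data from the research described above.
EG_GAN_EV = 3.4   # approximate GaN bandgap at room temperature (eV)
EG_INN_EV = 0.7   # approximate InN bandgap at room temperature (eV)
BOWING_EV = 1.4   # assumed bowing parameter for the InGaN alloy (eV)

def ingan_bandgap_ev(x: float) -> float:
    """Bandgap of In(x)Ga(1-x)N via linear interpolation minus a bowing term."""
    return x * EG_INN_EV + (1 - x) * EG_GAN_EV - BOWING_EV * x * (1 - x)

def emission_wavelength_nm(x: float) -> float:
    """Approximate emission wavelength from the bandgap (lambda ~ 1240 / Eg)."""
    return 1240.0 / ingan_bandgap_ev(x)

for x in (0.10, 0.20, 0.30, 0.40):
    print(f"In fraction {x:.2f}: ~{emission_wavelength_nm(x):.0f} nm")
```

Under this approximation, reaching red and amber wavelengths requires indium fractions of very roughly 30–40%, which is where the growth and efficiency problems described above set in.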

In addition, such InGaN LEDs with considerably high indium concentrations remain difficult to manufacture using conventional semiconductor structures. As such, the realization of fully solid-state white-light-emitting devices – which require all three primary colors of light – remains an unattained goal.


Progress is accelerating due to the compounding effect of these technologies, which will in turn enable countless more: 3D printing, autonomous vehicles, blockchain, batteries, remote surgeries, virtual and augmented reality, robotics – the list goes on and on.

These devices will in turn lead to sweeping changes in society, from energy generation and monetary systems to space colonization and much more! All these topics and then some will be covered in videos of their own in the future.

In this video we will be discussing automation, which is often mistaken for the ‘technological revolution’ in and of itself because it is what the mainstream focuses on – and for good reason, as how we handle automation will determine the trajectory our collective future takes.

Well, it’s official. After 17 years of being called Facebook, the social networking parent company behind Facebook, Instagram, WhatsApp, and Oculus has a new name.

Facebook’s corporate entity is now **Meta**.

Facebook creator Mark Zuckerberg announced the change at the company’s AR/VR-focused Connect event, sharing that the new title captured more of the company’s core ambition: to build the metaverse.

“To reflect who we are and what we hope to build, I am proud to announce that starting today, our company is now Meta. Our mission remains the same — it’s still about bringing people together. Our apps and our brands — they’re not changing either,” Zuckerberg said. “From now on, we’re going to be metaverse-first, not Facebook-first.”
