
In the wee morning hours of Tuesday (Nov. 16), the seven-person crew of the International Space Station (ISS) awoke in alarm. A Russian missile test had just blasted a decommissioned Kosmos spy satellite into more than 1,500 pieces of space debris — some of which were close enough to the ISS to warrant emergency collision preparations.

The four American astronauts, one German astronaut and two Russian cosmonauts aboard the station were told to shelter in the transport capsules that brought them to the ISS while the station passed by the debris cloud several times over the following hours, according to NASA.

Ultimately, Tuesday ended without any reported damage or injury aboard the ISS, but the crew’s precautions — and the NASA administrator’s stern response to Russia — were far from an overreaction. Space debris like the kind created in the Kosmos break-up can travel at more than 17,500 mph (28,000 km/h), NASA says — and even a scrap of metal the size of a pea can become a potentially deadly missile in low-Earth orbit. (For comparison, a typical bullet discharged from an AR-15 rifle travels at just over 2,200 mph, or 3,500 km/h).


Facebook’s vision of the metaverse has been criticized by both consumers and other companies for its plainly dystopian outlook. But Niantic, one of the most prominent augmented reality companies in the world, has shown a much better-looking vision of the metaverse: one in which the real world is merely augmented rather than completely replaced, as in Meta’s version. Niantic’s Lightship platform and future augmented reality glasses are meant to be a glimpse of a future where privacy and social interaction are of utmost importance and the dystopian nightmare scenario is avoided. Let’s see what companies such as Apple or Niantic think of this.

TIMESTAMPS:
00:00 The unfortunate fate of the Metaverse.
02:01 What is this future going to look like?
03:59 Facebook’s Creepy Vision of the Workplace.
06:29 A possible solution by Niantic.
08:35 Last Words.

#facebook #meta #metaverse

Apple and Meta are on a collision course around wearables, AR/VR headsets and home devices. Also: Netflix and Apple mend fences around billing, Tim Cook talks cryptocurrency, and a new Apple Store is coming to Los Angeles. Finally, the App Store is dealt a loss in court.

For the past decade or so, Apple Inc.’s chief rival was considered to be Google. The two have gone toe-to-toe in smartphones, mobile operating systems, web services and home devices.

The next decade, however, could be defined by Apple’s rivalry with another Silicon Valley giant: Meta Platforms Inc.—the company known to everyone other than its own brand consultants as Facebook.

WIRED sat down with West to sift fantasy from reality and pin down what XR is actually good at. And it may come as a surprise that a lot of it relies on collecting a lot of data. The following interview is a transcript of our conversation, lightly edited for clarity and length.

WIRED: So let’s start with sort of an ontological question. There’s been this idea that we’ll be in or go to the metaverse, or several metaverses, which tech companies posit will exist in VR or AR. Do you see VR and AR as being more of a tool or a destination?

Timoni West: That’s a great question. I would actually say neither. I see XR as one of the many different mediums you could choose to work in. For example, we actually have an AR mobile companion app [in beta] that allows you to scan a space and gray box it out, put down objects, automatically tag things. So I’m using AR to do the things that AR is best for. I’ll use VR to do the things that VR is best for, like presence, being able to meet together, sculpt, or do anything that’s, you know, sort of intrinsically 3D.

Smartphones have become old technology and will soon be replaced by the next big thing. Vuzix has created the first lightweight smart AR glasses that can project high-contrast holograms while still looking like regular glasses.

The Vuzix next-generation smart glasses use futuristic micro-LED display technology to project augmented reality images onto the lenses. You can interact with virtual objects and more. Companies like Apple and Facebook will soon follow with their own variations of AR glasses, which are expected to replace smartphones as the main medium of interaction as phones become obsolete.

If you enjoyed this video, please consider rating it and subscribing to our channel for more frequent uploads. Thank you!

#vuzix #smartphones #augmentedreality

An innovator in early AR systems has a dire prediction: the metaverse could change the fabric of reality as we know it.

Louis Rosenberg, a computer scientist and developer of the first functional AR system at the Air Force Research Laboratory, penned an op-ed in Big Think this weekend warning that the metaverse — an immersive VR and AR world currently being developed by The Company Formerly Known as Facebook — could create what sounds like a real-life cyberpunk dystopia.

“I am concerned about the legitimate uses of AR by the powerful platform providers that will control the infrastructure,” Rosenberg wrote in the essay.

Varjo’s XR-3 headset has perhaps the best passthrough view of any MR headset on the market thanks to color cameras that offer a fairly high resolution and a wide field-of-view. But rather than just using the passthrough view for AR (bringing virtual objects into the real world) Varjo has developed a new tool to do the reverse (bringing real objects into the virtual world).

At AWE 2021 this week I got my first glimpse at ‘Varjo Lab Tools’, a soon-to-be released software suite that will work with the company’s XR-3 mixed reality headset. The tool allows users to trace arbitrary shapes that then become windows into the real world, while the rest of the view remains virtual.

Nvidia’s Omniverse, billed as a “metaverse for engineers,” has grown to more than 700 companies and 70,000 individual creators that are working on projects to simulate digital twins that replicate real-world environments in a virtual space.

The Omniverse is Nvidia’s simulation and collaboration platform delivering the foundation of the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. Omniverse is now moving from beta to general availability, and it has been extended to software ecosystems that put it within reach of 40 million 3D designers.

And today, during Nvidia CEO Jensen Huang’s keynote at the Nvidia GTC online conference, Nvidia said it has added features such as Omniverse Replicator, which makes it easier to train AI deep learning neural networks, and Omniverse Avatar, which makes it simple to create virtual characters that can be used in the Omniverse or other worlds.

French startup Lynx launched a Kickstarter campaign for Lynx R-1 in October, a standalone MR headset which is capable of both VR and passthrough AR. Starting at €530 (or $500 if you’re not subject to European sales tax), the MR headset attracted a strong response from backers as it passed its initial funding goal in under 15 hours, going on to garner over $800,000 throughout the month-long campaign.

Update (November 10th, 2021): The Lynx R-1 Kickstarter is now over, having attracted €725,281 (~$835,000) from 1,216 backers. In its final hours the campaign managed to pass its first stretch goal at $700,000: a free facial interface pad.

If you missed out, the company is now offering direct preorders for both its Standard Edition for $600 and Enterprise Edition for $1,100. It’s also selling a few accessories including compatible 6DOF controllers, facial interfaces, and a travel case.

By Jeremy Batterson 11-09-2021

The equivalent of cheap 100-inch binoculars will soon be possible. This memo is a quick update on seven rapidly converging technologies that augur well for astronomy enthusiasts of the near future. All these technologies already exist in either fully developed or nascent form, and all are being rapidly improved due to the gigantic global cell phone market and the retinal projection market that will soon replace it. Listed here are the multiple technologies, after which they are brought together into a single system.

1) Tracking.
2) Single-photon image sensing.
3) Large effective exit pupils via large sensors.
4) Long exposure non-photographic function.
5) Flat optics (metamaterials).
6) Off-axis function of flat optics.
7) Retinal projection.

1) TRACKING: This is already widely used in so-called “go-to” telescopes, where the instrument finds any object and tracks it so that Earth’s rotation does not carry the object out of the field of view. The viewer doesn’t have to find the object and doesn’t have to set up a clock drive to track it. Tracking is also partly used in image-stabilization software for cameras and smartphones, to prevent motion blurring of images.
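As a rough illustration of why tracking matters, the rate a mount must follow can be computed directly from the length of a sidereal day; the values below are standard astronomical constants, not figures from this memo:

```python
# Earth rotates 360 degrees per sidereal day (~86,164 s), so an
# equatorial mount must turn at roughly 15 arcseconds per second
# of time to keep a star centered in the field of view.
SIDEREAL_DAY_S = 86164.1  # length of one sidereal day, in seconds

def sidereal_rate_arcsec_per_s():
    """Angular rate of the sky's apparent motion, in arcsec/s."""
    return 360 * 3600 / SIDEREAL_DAY_S

def drift_arcsec(exposure_s):
    """Apparent drift of an untracked star near the celestial equator."""
    return sidereal_rate_arcsec_per_s() * exposure_s

print(f"Sidereal rate: {sidereal_rate_arcsec_per_s():.2f} arcsec/s")
print(f"Drift during a 30 s untracked exposure: {drift_arcsec(30):.0f} arcsec")
```

Even a 30-second untracked exposure smears a star by hundreds of arcseconds, which is why both go-to mounts and software stabilization are prerequisites for the long-exposure functions discussed below.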

2) SINGLE-PHOTON IMAGE SENSORS, whether of the single-photon avalanche diode type or the type developed by Dr. Fossum, will allow passive imaging in nearly totally dark environments, without the use of IR or other illumination. This new type of image sensor will replace monochromatic analogue “night-vision” devices, allowing color imaging at higher resolution than they can produce. Unlike those current devices, such sensors will not be destroyed by exposure to normal or high lighting. In effect, these sensors increase the light-gathering power of a telescope by at least an order of magnitude, allowing small telescopes to see what observatory telescopes see now.
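The advantage of a photon-counting sensor in the dark can be sketched with the standard shot-noise model: a sensor with effectively zero read noise reaches SNR = sqrt(signal), while a conventional sensor adds its read noise in quadrature. The photon and read-noise numbers below are illustrative, not measurements of any specific sensor:

```python
import math

# Shot-noise-limited signal-to-noise ratio for a single pixel.
# A photon-counting sensor has ~zero read noise; a conventional
# CMOS sensor adds read noise R (electrons) in quadrature.
def snr(photons, read_noise=0.0):
    return photons / math.sqrt(photons + read_noise**2)

faint = 4  # photons per pixel in a short, very dark exposure (illustrative)
print(f"Photon-counting sensor SNR:  {snr(faint):.2f}")
print(f"Conventional sensor (R=5 e-): {snr(faint, 5):.2f}")
```

At a few photons per pixel the read-noise-free sensor already produces a usable signal, while the conventional sensor's output is buried, which is the regime where these new sensors behave like a large jump in effective aperture.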

3) EXIT PUPIL: The pupil of the dark-adapted human eye is around 7mm, which means the beam of light exiting a telescope must not be wider than this, or a portion of the light captured by the objective lens or mirror will be lost. Lowering a system’s magnification to give brighter images runs into this roadblock, a well-known problem for visual astronomers. Astro-photographers get around it with two tricks. The first is to use a photographic sensor wider than 7mm, allowing a larger exit pupil and thus brighter images. A 1-inch sensor or photographic plate, for example, already allows an image thirteen times brighter than what a 7mm human pupil can see.
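The thirteen-fold figure above can be checked directly: the light collected scales with area, i.e. with the square of the diameter ratio (1 inch = 25.4 mm against a ~7 mm dark-adapted pupil). The aperture and magnification in the second example are made-up illustrative values:

```python
# Brightness gain of a wide sensor over the 7 mm dark-adapted pupil:
# collected light scales with area, so with the diameter ratio squared.
def brightness_ratio(sensor_mm, pupil_mm=7.0):
    return (sensor_mm / pupil_mm) ** 2

print(f"1-inch sensor vs 7 mm pupil: {brightness_ratio(25.4):.1f}x")

# The exit pupil itself is the aperture diameter divided by the
# magnification; if it exceeds ~7 mm, the eye wastes the excess light.
def exit_pupil_mm(aperture_mm, magnification):
    return aperture_mm / magnification

# e.g. a 200 mm telescope at 25x (hypothetical numbers):
print(f"Exit pupil: {exit_pupil_mm(200, 25):.1f} mm  (wider than the eye accepts)")
```

(25.4 / 7)² ≈ 13.2, matching the memo's "thirteen times brighter" claim for a 1-inch sensor.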

4) LONG EXPOSURE: The other trick astro-photographers use is to keep the shutter of their cameras open for longer periods, thus capturing more light and allowing a bright image of a faint object to build up over time. As a telescope tracks the stars, so that they appear motionless in the telescopic view, this can be done for hours. The Hubble Space Telescope took a roughly 100-hour long exposure to produce the famous “deep field” image of ultra-faint distant galaxies. An example of a visual use of the same principle is the Sionyx Pro camera, which keeps the shutter open for a fraction of a second; if the exposures are short enough, a video can be produced that appears brighter than what the unaided eye sees. Sionyx adds to this with its black-silicon sensors, which are better at retaining all the light that hits them. For astronomy, where stellar objects are motionless when tracked and so cause no blurring, longer exposures can be accumulated, with the image rapidly brightening as the viewer watches. Unistellar’s eVscope and Vaonis’s Stellina telescope already use this function, but without an eyepiece; instead, their images are projected onto people’s cell phones or other viewing devices. However, most astronomers want to see an object directly with their eyes, which is a limiting point for such telescopes.
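The "image rapidly brightening as the viewer watches" behavior is live stacking: averaging N tracked short exposures leaves the signal unchanged while random noise falls as 1/sqrt(N). A minimal sketch, with made-up signal and noise levels standing in for real sensor frames:

```python
import numpy as np

# Live stacking: average N short tracked exposures. The per-pixel
# signal is constant across frames, so it survives the average,
# while frame noise shrinks as 1/sqrt(N).
rng = np.random.default_rng(0)

signal = 10.0        # true per-pixel brightness (arbitrary units, illustrative)
noise_sigma = 20.0   # per-frame noise that swamps the signal
n_frames = 400       # number of short exposures to stack

frames = signal + rng.normal(0.0, noise_sigma, size=(n_frames, 64, 64))
stack = frames.mean(axis=0)  # the "brightening" running average

print(f"Single-frame noise: {frames[0].std():.1f}")
print(f"Stacked noise:      {stack.std():.2f}")  # ~ noise_sigma / sqrt(n_frames)
```

After 400 frames the residual noise is about 20/√400 = 1, so a signal invisible in any single frame stands out clearly in the stack, which is exactly how the eVscope-style instruments build a bright view in near real time.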