Archive for the ‘virtual reality’ category: Page 23

Aug 25, 2022

Deep Dive: Why 3D reconstruction may be the next tech disruptor

Posted in categories: augmented reality, robotics/AI, virtual reality

Artificial intelligence (AI) systems must understand visual scenes in three dimensions to interpret the world around them. For that reason, images play an essential role in computer vision, significantly affecting quality and performance. Unlike the widely available 2D data, 3D data is rich in scale and geometry information, offering machines a deeper understanding of their environment.

Data-driven 3D modeling, or 3D reconstruction, is a growing computer vision domain increasingly in demand from industries including augmented reality (AR) and virtual reality (VR). Rapid advances in implicit neural representation are also opening up exciting new possibilities for virtual reality experiences.
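
The jump from 2D pixels to 3D structure can be made concrete with the simplest reconstruction setup, a rectified stereo camera pair, where depth follows directly from how far a point shifts between the two views. A minimal sketch (the function name and all numeric values are illustrative, not taken from the article):

```python
# Minimal sketch: recovering depth from a calibrated, rectified stereo
# pair, the simplest form of data-driven 3D reconstruction. All values
# below are illustrative.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A point that shifts 35 px between two cameras 0.12 m apart, imaged
# with a 700 px focal length, lies 2.4 m away.
z = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=35.0)
print(round(z, 2))  # 2.4
```

Repeating this per pixel over a dense disparity map yields a full depth image, the raw material for the 3D reconstruction pipelines the article describes.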

Aug 22, 2022

Does AI need a body? | John Carmack and Lex Fridman

Posted in categories: robotics/AI, virtual reality

Lex Fridman Podcast full episode: https://www.youtube.com/watch?v=I845O57ZSy4

GUEST BIO:
John Carmack is a legendary programmer, co-founder of id Software, and lead programmer of many revolutionary video games including Wolfenstein 3D, Doom, Quake, and the Commander Keen series. He is also the founder of Armadillo Aerospace, and for many years the CTO of Oculus VR.

Aug 19, 2022

Artemis 1 virtual reality experience aims to bring epic NASA moon launch to you

Posted in categories: space, virtual reality

An immersive, virtual reality experience will put viewers next to the launch pad as Artemis 1 lifts off for the moon. The mission is scheduled for Aug. 29.

Aug 17, 2022

The Power of Brain-Computer Interfaces | TVS

Posted in categories: biotech/medical, computing, cyborgs, neuroscience, virtual reality

A Brain-Computer Interface (BCI) is a promising technology that has received increasing attention in recent years. BCIs create a direct link from your brain to a computer, a capability with applications across many industries and areas of life. BCIs are redefining how we approach medical treatment and communication for individuals with various conditions or injuries, and they also have applications in entertainment, particularly video games and VR. From controlling a prosthetic limb with your mind to playing a video game with your mind, the potential of BCIs is endless.

Aug 16, 2022

Researchers find way to shrink a VR headset down to normal glasses size

Posted in categories: innovation, virtual reality

Ignore the ribbons: this is a very promising breakthrough for VR.

Researchers from Stanford University and Nvidia have teamed up to develop VR glasses that look a lot more like regular spectacles. Okay, they look rather silly thanks to the ribbons extending from either eye, but they're much, much flatter and more compact than today's usual goggle-like virtual reality headsets.

Aug 16, 2022

The Holographic Principle, Quantum Mechanics, and Simulated Reality (SR)

Posted in categories: alien life, bitcoin, cryptocurrencies, Elon Musk, existential risks, holograms, information science, quantum physics, robotics/AI, singularity, virtual reality

https://www.youtube.com/watch?v=M9fyZvxkpz4

Aug 16, 2022

The Future of AI-Generated Art Is Here: An Interview With Jordan Tanner

Posted in categories: blockchains, robotics/AI, virtual reality

Unlike many of the so-called “artists” strewing junk around the halls of modern art museums, Jordan Tanner is actually pushing the frontiers of his craft. His eclectic portfolio includes vaporwave-inspired VR experiences, NFTs & 3D-printed figurines for Popular Front, and animated art for this very magazine. His recent AI-generated art made using OpenAI’s DALL-E software was called “STUNNING” by Lee Unkrich, the director of Coco and Toy Story 3.

We interviewed the UK-born, Israel-based artist about the imminent AI-generated art revolution and why all is not lost when it comes to the future of art. In Tanner’s eyes, AI-generated art is similar to having the latest, flashiest Nikon camera—it doesn’t automatically make you a professional photographer. Tanner also created a series of unique, AI-generated pieces for this interview which can be enjoyed below.

Aug 9, 2022

How image features influence reaction times

Posted in categories: augmented reality, biotech/medical, neuroscience, virtual reality

It’s an everyday scenario: you’re driving down the highway when, out of the corner of your eye, you spot a car merging into your lane without signaling. How fast can your eyes react to that visual stimulus? Would it make a difference if the offending car were blue instead of green? And if the color green shortened that split-second interval between the stimulus first appearing and the eye beginning to move toward it (known to scientists as saccadic latency), could drivers benefit from an augmented reality overlay that made every merging vehicle green?

Qi Sun, a joint professor in Tandon’s Department of Computer Science and Engineering and the Center for Urban Science and Progress (CUSP), is collaborating with neuroscientists to find out.

He and his Ph.D. student Budmonde Duinkharjav, along with colleagues from Princeton, the University of North Carolina, and NVIDIA Research, recently authored the paper “Image Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency,” presenting a model that predicts temporal gaze behavior, particularly saccadic latency, as a function of the statistics of a displayed image. Inspired by neuroscience, the model could ultimately have great implications for telemedicine, e-sports, and any other arena in which AR and VR are leveraged.
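
The authors' learned model is not reproduced in this excerpt, but the qualitative idea that low-level stimulus properties modulate reaction time has a classic antecedent in Piéron's law, under which reaction time falls as stimulus intensity rises. A toy sketch of that relation (the parameter values are invented for illustration and are not the paper's model):

```python
# Toy illustration (NOT the paper's learned model): Pieron's law, a
# classic psychophysics relation where reaction time decreases with
# stimulus intensity: RT = R0 + k * I**(-beta).
# All parameter values are made up for illustration.

def pieron_latency_ms(intensity: float, r0: float = 180.0,
                      k: float = 60.0, beta: float = 0.5) -> float:
    """Predicted reaction latency (ms) for a stimulus of given intensity."""
    return r0 + k * intensity ** (-beta)

# A brighter (higher-intensity) target elicits a faster reaction,
# which is the kind of effect an AR overlay could try to exploit.
assert pieron_latency_ms(4.0) < pieron_latency_ms(1.0)
```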

Aug 4, 2022

New chip-based beam steering device lays groundwork for smaller, cheaper lidar

Posted in categories: biotech/medical, robotics/AI, virtual reality

Researchers have developed a new chip-based beam steering technology that provides a promising route to small, cost-effective and high-performance lidar (or light detection and ranging) systems. Lidar, which uses laser pulses to acquire 3D information about a scene or object, is used in a wide range of applications such as autonomous driving, free-space optical communications, 3D holography, biomedical sensing and virtual reality.

“Optical beam steering is a key technology for lidar systems, but conventional mechanical-based beam steering systems are bulky, expensive, sensitive to vibration and limited in speed,” said research team leader Hao Hu from the Technical University of Denmark. “Although devices known as chip-based optical phased arrays (OPAs) can quickly and precisely steer light in a non-mechanical way, so far, these devices have had poor beam quality and a field of view typically below 100 degrees.”

In Optica, Hu and co-author Yong Liu describe their new chip-based OPA that solves many of the problems that have plagued OPAs. They show that the device can eliminate a key optical artifact known as aliasing, achieving beam steering over a large field of view while maintaining high beam quality, a combination that could greatly improve lidar systems.
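
For intuition about how an OPA steers light without moving parts: a constant phase increment between adjacent emitters tilts the far-field beam, and the grating-lobe aliasing mentioned above appears once the emitter pitch exceeds half a wavelength. A back-of-the-envelope sketch (wavelength, pitch, and phase values are illustrative, not taken from Hu and Liu's device):

```python
import math

# Sketch of optical phased array (OPA) beam steering: a constant phase
# increment dphi between adjacent emitters spaced d apart steers the
# beam to sin(theta) = lambda * dphi / (2 * pi * d).

def steering_angle_deg(wavelength_m: float, pitch_m: float,
                       dphi_rad: float) -> float:
    s = wavelength_m * dphi_rad / (2 * math.pi * pitch_m)
    return math.degrees(math.asin(s))

def has_grating_lobes(wavelength_m: float, pitch_m: float) -> bool:
    """Aliasing (grating lobes) appears when the pitch exceeds lambda/2."""
    return pitch_m > wavelength_m / 2

wl, pitch = 1.55e-6, 2.0e-6          # 1550 nm light, 2 um emitter pitch
print(round(steering_angle_deg(wl, pitch, math.pi / 4), 2))
print(has_grating_lobes(wl, pitch))  # True: 2 um > lambda/2, so aliasing
```

The second check is why sub-wavelength emitter spacing is so hard and why suppressing aliasing at a practical pitch, as Hu and Liu report, matters for wide-field-of-view lidar.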

Aug 4, 2022

Planetary Debris Disks Discovered with Citizen Scientists and Virtual Reality

Posted in categories: space, virtual reality

Members of the public are helping professional astronomers identify nascent planetary systems.
