
Lex Fridman Podcast full episode: https://www.youtube.com/watch?v=I845O57ZSy4
Please support this podcast by checking out our sponsors:
- InsideTracker: https://insidetracker.com/lex to get 20% off.
- Indeed: https://indeed.com/lex to get $75 credit.
- Blinkist: https://blinkist.com/lex and use code LEX to get 25% off premium.
- Eight Sleep: https://www.eightsleep.com/lex and use code LEX to get special savings.
- Athletic Greens: https://athleticgreens.com/lex and use code LEX to get 1 month of fish oil.

GUEST BIO:
John Carmack is a legendary programmer, co-founder of id Software, and lead programmer of many revolutionary video games including Wolfenstein 3D, Doom, Quake, and the Commander Keen series. He is also the founder of Armadillo Aerospace, and for many years the CTO of Oculus VR.

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast.
Apple Podcasts: https://apple.co/2lwqZIr.
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
Full episodes playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
Clips playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41

A Brain-Computer Interface (BCI) is a promising technology that has received increasing attention in recent years. BCIs create a direct link between your brain and a computer. The technology has applications across many industries and sectors of our lives: BCIs are redefining how we approach medical treatment and communication for individuals with various conditions or injuries, and they also have applications in entertainment, particularly video games and VR. From controlling a prosthetic limb with your mind to playing a video game with your mind, the potential of BCIs is endless.

What are your thoughts on Brain-Computer Interfaces? Let us know!
Any disruptive technologies you would like us to cover? Dm us on our Instagram (@toyvirtualstructures).
—————–
Check out our curated playlists:
https://www.youtube.com/channel/UCr5Akn6LhGDin7coWM7dfUg/playlists.
—————–
Media Used:

Tom Oxley | TED

—————–

Ignore the ribbons: this is a very promising breakthrough for VR.


Researchers from Stanford University and Nvidia have teamed up to develop VR glasses that look a lot more like regular spectacles. Okay, they are rather silly looking due to the ribbons extending from either eye, but they’re much, much flatter and more compact than the usual goggle-like virtual reality headsets of today.

“A major barrier to widespread adoption of VR technology, however, is the bulky form factor of existing VR displays and the discomfort associated with that,” the research paper published at Siggraph 2022 says.

TWITTER https://twitter.com/Transhumanian.
PATREON https://www.patreon.com/transhumania.
BITCOIN 14ZMLNppEdZCN4bu8FB1BwDaxbWteQKs8i.
BITCOIN CASH 1LhXJjN4FrfJh8LywR3dLG2uGXSaZjey9f.
ETHEREUM 0x1f89b261562C8D4C14aA01590EB42b2378572164
LITECOIN LdB94n8sTUXBto5ZKt82YhEsEmxomFGz3j.
CHAINLINK 0xDF560E12fF416eC2D4BAECC66E323C56af2f6666.


Unlike many of the so-called “artists” strewing junk around the halls of modern art museums, Jordan Tanner is actually pushing the frontiers of his craft. His eclectic portfolio includes vaporwave-inspired VR experiences, NFTs & 3D-printed figurines for Popular Front, and animated art for this very magazine. His recent AI-generated art made using OpenAI’s DALL-E software was called “STUNNING” by Lee Unkrich, the director of Coco and Toy Story 3.

We interviewed the UK-born, Israel-based artist about the imminent AI-generated art revolution and why all is not lost when it comes to the future of art. In Tanner’s eyes, AI-generated art is similar to having the latest, flashiest Nikon camera—it doesn’t automatically make you a professional photographer. Tanner also created a series of unique, AI-generated pieces for this interview which can be enjoyed below.

Thanks for talking to Countere, Jordan. Can you tell us a little about your background as an artist?

It’s an everyday scenario: you’re driving down the highway when, out of the corner of your eye, you spot a car merging into your lane without signaling. How fast can your eyes react to that visual stimulus? Would it make a difference if the offending car were blue instead of green? And if the color green shortened that split-second delay between the initial appearance of the stimulus and the moment the eye begins moving toward it (known to scientists as saccadic latency), could drivers benefit from an augmented reality overlay that made every merging vehicle green?

Qi Sun, a joint professor in Tandon’s Department of Computer Science and Engineering and the Center for Urban Science and Progress (CUSP), is collaborating with neuroscientists to find out.

He and his Ph.D. student Budmonde Duinkharjav—along with colleagues from Princeton, the University of North Carolina, and NVIDIA Research—recently authored the paper “Image Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency,” presenting a model that can be used to predict temporal gaze behavior, particularly saccadic latency, as a function of the statistics of a displayed image. Inspired by neuroscience, the model could ultimately have great implications for telemedicine, e-sports, and any other arena in which AR and VR are leveraged.
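The paper itself learns its model from perception data; purely as an illustrative sketch of the idea, the toy function below draws saccadic-latency samples whose mean shifts with two image statistics, contrast and eccentricity. The distribution shape and every constant are hypothetical, not the authors' fitted values.

```python
import random

def saccade_latency_sample(contrast, eccentricity_deg, rng):
    """Draw one saccadic-latency sample, in seconds.

    Hypothetical stand-in for a learned perceptual model: mean latency
    shrinks with stimulus contrast and grows with eccentricity, plus
    Gaussian noise and an exponential right tail (an ex-Gaussian shape,
    common in reaction-time modeling). Every constant is illustrative.
    """
    mu = 0.180 + 0.004 * eccentricity_deg - 0.050 * min(contrast, 1.0)
    gauss = rng.gauss(0.0, 0.015)      # symmetric trial-to-trial noise
    tail = rng.expovariate(1 / 0.020)  # occasional slow responses
    return max(0.08, mu + gauss + tail)

# A high-contrast central stimulus should, on average, trigger a
# faster saccade than a low-contrast peripheral one.
rng = random.Random(0)
fast = sum(saccade_latency_sample(1.0, 2.0, rng) for _ in range(2000)) / 2000
slow = sum(saccade_latency_sample(0.1, 15.0, rng) for _ in range(2000)) / 2000
print(fast < slow)
```

A practical model of this kind outputs a full latency distribution rather than a point estimate, which is what lets a renderer reason probabilistically about whether a user will have reacted by a given frame.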

Researchers have developed a new chip-based beam steering technology that provides a promising route to small, cost-effective and high-performance lidar (or light detection and ranging) systems. Lidar, which uses laser pulses to acquire 3D information about a scene or object, is used in a wide range of applications such as autonomous driving, free-space optical communications, 3D holography, biomedical sensing and virtual reality.
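At its core, lidar ranging is a time-of-flight calculation: the pulse's round-trip time, multiplied by the speed of light and halved, gives the distance to the target. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_time_of_flight(round_trip_s):
    """Distance to a lidar target from a pulse's round-trip time.

    The pulse travels to the target and back, so halve the path length.
    """
    return C * round_trip_s / 2.0

# A roughly 667 ns round trip puts the target about 100 m away.
print(round(range_from_time_of_flight(667e-9), 1))  # 100.0
```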

“Optical beam steering is a key technology for lidar systems, but conventional mechanical-based beam steering systems are bulky, expensive, sensitive to vibration and limited in speed,” said research team leader Hao Hu from the Technical University of Denmark. “Although devices known as chip-based optical phased arrays (OPAs) can quickly and precisely steer light in a non-mechanical way, so far, these devices have had poor beam quality and a field of view typically below 100 degrees.”

In Optica, Hu and co-author Yong Liu describe their new chip-based OPA that solves many of the problems that have plagued OPAs. They show that the device can eliminate a key optical artifact known as aliasing, achieving beam steering over a large field of view while maintaining high beam quality, a combination that could greatly improve lidar systems.
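Aliasing in an OPA is the optical analogue of grating lobes in a radio phased array: when the emitter pitch exceeds half a wavelength, steered copies of the main beam re-enter the field of view. The sketch below is a generic uniform-array model, not the authors' design; it computes the far-field pattern at half-wavelength and two-wavelength pitch to show the effect.

```python
import numpy as np

def array_factor(n_emitters, pitch_wavelengths, steer_deg, angles_deg):
    """Far-field intensity of a uniform linear phased array.

    Pitch is in units of wavelength; element phases are set to steer the
    main lobe to steer_deg. Grating lobes (aliasing) appear whenever the
    pitch exceeds half a wavelength.
    """
    k = 2 * np.pi  # wavenumber in units of 1/wavelength
    theta = np.radians(angles_deg)
    steer = np.radians(steer_deg)
    n = np.arange(n_emitters)[:, None]
    phase = k * pitch_wavelengths * n * (np.sin(theta) - np.sin(steer))
    af = np.abs(np.exp(1j * phase).sum(axis=0)) / n_emitters
    return af ** 2

angles = np.linspace(-90, 90, 3601)
dense = array_factor(64, 0.5, 30, angles)   # half-wavelength pitch
sparse = array_factor(64, 2.0, 30, angles)  # two-wavelength pitch

def strong_lobes(af):
    """Count lobes within 3 dB of the peak (rising edges of the mask)."""
    mask = af > 0.5 * af.max()
    return int(np.count_nonzero(mask[1:] & ~mask[:-1]))

# The dense array has one strong lobe; the sparse one grows aliased copies.
print(strong_lobes(dense), strong_lobes(sparse))
```

Eliminating those aliased lobes without shrinking the pitch to half a wavelength is precisely the combination of wide field of view and high beam quality the paper claims.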



In a paper distributed via ArXiv, titled “Exploring the Unprecedented Privacy Risks of the Metaverse,” boffins at UC Berkeley in the US and the Technical University of Munich in Germany play-tested an “escape room” virtual reality (VR) game to better understand just how much data a potential attacker could access. Through a 30-person study of VR usage, the researchers – Vivek Nair (UCB), Gonzalo Munilla Garrido (TUM), and Dawn Song (UCB) – created a framework for assessing and analyzing potential privacy threats. They identified more than 25 examples of private data attributes available to potential attackers, some of which would be difficult or impossible to obtain from traditional mobile or web applications. The metaverse now rapidly becoming part of our world has long been familiar territory for the gaming community: interaction-based games like Second Life, Pokemon Go, and Minecraft have served as virtual social platforms for years. Philip Rosedale, the founder of Second Life, and many other security experts have lately been vocal about Meta’s impact on data privacy, and since the core concept is similar, these platforms offer a preview of the privacy issues Meta’s metaverse is likely to face.
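To make the flavor of such attacks concrete: one of the simplest attributes recoverable from raw VR telemetry is a user's height, read off the headset's vertical position. The sketch below is purely illustrative; the percentile and the eye-to-crown offset are guesses for demonstration, not values from the paper.

```python
def infer_height_cm(headset_y_samples_m, offset_cm=8.0):
    """Estimate a player's standing height from headset Y-position telemetry.

    Illustrative only: take a high percentile of the headset's vertical
    position (to ignore crouching) and add a nominal eye-to-crown offset.
    A real study would fit this mapping empirically.
    """
    ys = sorted(headset_y_samples_m)
    standing_eye_m = ys[int(0.9 * (len(ys) - 1))]  # ~90th percentile
    return standing_eye_m * 100 + offset_cm

# Simulated session: mostly standing at ~1.65 m eye height,
# with occasional crouches that the percentile filters out.
samples = [1.65] * 80 + [1.2] * 15 + [1.66] * 5
print(round(infer_height_cm(samples), 1))  # 173.0
```

Height is just one of the 25+ attributes the study catalogs; the same telemetry stream leaks handedness, reaction time, and even room dimensions in similar one-liner fashion.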

There has been a buzz going around the tech market that by the end of 2022 the metaverse could revive AR/VR device shipments, taking them as high as 14.19 million units compared with 9.86 million in 2021, a year-over-year increase of roughly 44%. The AR/VR device market is expected to boom despite component shortages and the difficulty of developing new technologies, with growth also driven by the increased demand for remote interactivity stemming from the pandemic. But what happens when these VR or metaverse headsets start stealing your precious data? Not just headsets but smart glasses too are prime suspects when it comes to privacy concerns.
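The growth rate follows directly from the two shipment figures:

```python
# Year-over-year growth implied by the shipment figures above.
units_2021 = 9.86   # million AR/VR devices shipped in 2021
units_2022 = 14.19  # forecast for 2022
growth = (units_2022 - units_2021) / units_2021
print(f"{growth:.1%}")  # prints 43.9%
```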

Several weeks ago, Facebook introduced a new line of smart glasses called Ray-Ban Stories, which can take photos, shoot 30-second videos, and post them on the owner’s Facebook feed. Priced at US$299 and powered by Facebook’s virtual assistant, the web-connected shades can also take phone calls and play music or podcasts.