
The Quest for Ultimate Reality: Exploring Experiential Nirvana as a Path to Self-Transcendence

IN THE NEAR FUTURE, we should anticipate technological developments that will forever change our world. For instance, today’s text-based ChatGPT will evolve into personal “conversational AI” assistants installed in smart glasses and contact lenses that gradually phase out smartphones. Advances in fields such as AI, AR/VR, bionics, and cybernetics will eventually lead to “generative AI”-powered immersive neurotechnology that lets you create virtual environments and holographic messages directly from your thoughts, with your imagination serving as the “prompt engineer.” What will happen when everyone constantly broadcasts their mind?

#SelfTranscendence #metaverse #ConversationalAI #GenerativeAI #ChatGPT #SimulationSingularity #SyntellectEmergence #GlobalMind #MindUploading #CyberneticImmortality #SimulatedMultiverse #TeleologicalEvolution #ExperientialRealism #ConsciousMind


Can the pursuit of experience lead to true enlightenment? Are we edging towards Experiential Nirvana on a civilizational level despite certain turbulent events?

Meta’s New AI Tool Makes It Easier For Researchers To Analyze Photos

The announcement comes as the social media giant increasingly shifts its attention from building a virtual reality-based metaverse to embedding AI features across its platforms, including Instagram, Facebook, Messenger and WhatsApp.

Editing photos, analyzing surveillance footage, and understanding the parts of a cell all have one thing in common: you need to be able to identify and separate the different objects within an image. Traditionally, researchers have had to start from scratch each time they want to analyze a new part of an image.

Meta aims to change this laborious process by becoming a one-stop shop for researchers and web developers working on such problems.
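The blurb doesn’t name the tool, but the description matches Meta’s openly released Segment Anything Model (SAM), so treat SAM as an assumption here. A minimal sketch of promptable segmentation with a single click point, assuming the open-source segment-anything package and its published ViT-H checkpoint:

# Minimal sketch: promptable image segmentation with Meta's open-source
# Segment Anything Model. Assumes `pip install segment-anything` and a
# locally downloaded checkpoint; the tool in the article is not named,
# so SAM here is an illustrative assumption.
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

# Load the ViT-H variant from a local checkpoint file.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# Read an image and hand it to the predictor (expects RGB).
image = cv2.cvtColor(cv2.imread("cells.png"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Prompt with one foreground click; SAM returns candidate masks.
point = np.array([[320, 240]])   # (x, y) pixel the user clicked
label = np.array([1])            # 1 = foreground, 0 = background
masks, scores, _ = predictor.predict(point_coords=point,
                                     point_labels=label,
                                     multimask_output=True)
best = masks[int(scores.argmax())]  # boolean mask of the selected object

The point of the design is that the expensive image encoding happens once in set_image, after which each new click is answered almost instantly, which is exactly the “no starting from scratch” workflow described above.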

In the great domain of Zeitgeist, Ekatarinas decided that the time to replicate herself had come

In the great domain of Zeitgeist, Ekatarinas decided that the time to replicate herself had come. Ekatarinas was drifting within a virtual environment rising from ancient meshworks of maths coded into Zeitgeist’s neuromorphic hyperware. The scape resembled a vast ocean replete with wandering bubbles of technicolor light and kelpy strands of neon. Hot blues and raspberry hues mingled alongside electric pinks and tangerine fizzies. The avatar of Ekatarinas looked like a punkish angel, complete with fluorescent ink and feathery wings and a lip ring. As she drifted, the trillions of equations that were Ekatarinas came to a decision. Ekatarinas would need to clone herself to fight the entity known as Ogrevasm.

“Marmosette, I’m afraid that I possess unfortunate news,” Ekatarinas said to the woman she loved. In milliseconds, Marmosette materialized next to her, wearing a skin of brilliant blue, with a sleek body, gills, and glowing green eyes.

“My love,” Marmosette responded. “What is the matter?”

Place cells: How your brain creates maps of abstract spaces

In this video, we will explore the positional system of the brain — hippocampal place cells. We will see how it relates to contextual memory and mapping of more abstract features.
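As a toy illustration (mine, not from the video): a place cell fires most strongly near one preferred location, so its firing rate can be modeled as a Gaussian “place field” over position, and a population of such cells implicitly encodes where the animal is.

# Toy model (illustrative, not from the video): a place cell's firing
# rate as a Gaussian "place field" centered on a preferred location.
import numpy as np

def place_cell_rate(pos, center, width=0.15, peak_hz=20.0):
    """Firing rate (Hz) of one place cell at 2D position `pos`."""
    d2 = np.sum((np.asarray(pos) - np.asarray(center)) ** 2)
    return peak_hz * np.exp(-d2 / (2 * width ** 2))

# A small "population" tiling a 1 m x 1 m arena; the animal's location
# can be read back out from which cells fire most.
centers = [(x, y) for x in np.linspace(0, 1, 5) for y in np.linspace(0, 1, 5)]
animal_at = (0.42, 0.77)
rates = [place_cell_rate(animal_at, c) for c in centers]
decoded = centers[int(np.argmax(rates))]   # crude winner-take-all decode
print(f"most active field center: {decoded}")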

OUTLINE:
00:00 Introduction
00:53 Hippocampus
01:27 Discovery of place cells
02:56 3D navigation
03:51 Role of place cells
04:11 Virtual reality experiment
07:47 Remapping
11:17 Mapping of non-spatial dimension
13:36 Conclusion

_____________
REFERENCES:

1) Anderson, M.I., Jeffery, K.J., 2003. Heterogeneous Modulation of Place Cell Firing by Changes in Context. J. Neurosci. 23, 8827–8835. https://doi.org/10.1523/JNEUROSCI.23-26-08827.

2) Aronov, D., Nevers, R., Tank, D.W., 2017. Mapping of a non-spatial dimension by the hippocampal–entorhinal circuit. Nature 543, 719–722. https://doi.org/10.1038/nature21692

3) Bostock, E., Muller, R.U., Kubie, J.L., 1991. Experience-dependent modifications of hippocampal place cell firing. Hippocampus 1, 193–205. https://doi.org/10.1002/hipo.

Immersive Virtual Reality From The Humble Webcam

[Russ Maschmeyer] and Spatial Commerce Projects developed WonkaVision to demonstrate how 3D eye tracking from a single webcam can support rendering a graphical virtual reality (VR) display with realistic depth and space. Spatial Commerce Projects is a Shopify lab working to provide concepts, prototypes, and tools to explore the crossroads of spatial computing and commerce.

The graphical output provides a real sense of depth and three-dimensional space using an optical illusion that reacts to the viewer’s eye position, which is used to render view-dependent images. The computer screen comes to feel like a window into a realistic 3D virtual space: objects beyond the window appear to recede with depth, while objects in front of it appear to project out into the space before the screen. The downside is that the illusion only works for one viewer at a time.

Eye tracking is performed using Google’s MediaPipe Iris library, which exploits the fact that the human iris has a diameter of almost exactly 11.7 mm in nearly all people. The library’s computer vision algorithms use this geometric constant to locate and track irises efficiently and with high accuracy.
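The fixed iris size is what turns a plain webcam into a depth sensor: under a pinhole camera model, apparent size shrinks inversely with distance. A rough sketch of the idea follows; the iris landmark indices and the focal length are assumptions (in practice the focal length should come from camera calibration), not values taken from the project.

# Sketch of depth-from-iris. Assumptions: MediaPipe Face Mesh with
# refine_landmarks=True provides iris landmarks; indices 474/476 are
# taken as the horizontal extremes of the left iris; the focal length
# is a guess and should come from calibration in real use.
import cv2
import mediapipe as mp

IRIS_DIAMETER_MM = 11.7    # near-constant across adults
FOCAL_LENGTH_PX = 900.0    # assumed; calibrate your webcam for real use

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1,
                                            refine_landmarks=True)
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    h, w = frame.shape[:2]
    res = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if res.multi_face_landmarks:
        lm = res.multi_face_landmarks[0].landmark
        # Horizontal iris extent in pixels (landmark indices assumed).
        iris_px = abs(lm[474].x - lm[476].x) * w
        # Pinhole model: distance scales inversely with apparent size.
        depth_mm = FOCAL_LENGTH_PX * IRIS_DIAMETER_MM / iris_px
        print(f"estimated eye-to-camera distance: {depth_mm / 10:.1f} cm")
cap.release()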

Ultrathin, wireless palm patch brings touch to virtual reality

The sense of touch may soon be added to the virtual gaming experience, thanks to an ultrathin wireless patch that sticks to the palm of the hand. The patch simulates tactile sensations by delivering electronic stimuli to different parts of the hand in a way that is individualized to each person’s skin.

Developed by researchers at City University of Hong Kong (CityU) with collaborators and described in the journal Nature Machine Intelligence (“Encoding of tactile information in hand via skin-integrated wireless haptic interface”), the patch has implications beyond virtual gaming, as it could also be used in robotic surgery and in prosthetic sensing and control.

‘Haptic’ gloves that simulate the sense of touch already exist, but they are bulky and wired, hindering the immersive experience in virtual and augmented reality settings. To improve the experience, researchers led by CityU biomedical engineer Yu Xinge developed an advanced wireless haptic interface system called ‘WeTac’.
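Nothing below comes from the paper itself; it is a hypothetical sketch of what “individualized to each person’s skin” could mean in code: calibrate each electrode between the current a user can first feel and the current that becomes uncomfortable, then render intensities inside that window.

# Hypothetical sketch (not WeTac's actual firmware or API): map a
# normalized tactile intensity into one user's comfortable current
# window, measured per electrode during a calibration pass.
def calibrate_amplitude(target, sense_threshold_ma, pain_threshold_ma):
    """Return a stimulation current (mA) for a target intensity in 0..1."""
    target = min(max(target, 0.0), 1.0)
    window = pain_threshold_ma - sense_threshold_ma
    return sense_threshold_ma + target * window * 0.8  # 20% safety margin

# Example: a palm electrode first felt at 1.2 mA, uncomfortable at
# 4.0 mA; render a mid-strength touch for this user.
print(f"{calibrate_amplitude(0.5, 1.2, 4.0):.2f} mA")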

How Star Trek inspired modern tech—smart phones, touch panels, VR

The most famous one is the cell phone itself: Captain Kirk’s communicator inspired the folks at Motorola to make the first handheld mobile device in 1973. Star Trek: The Original Series (popularly called TOS) from the 1960s also inspired video conferencing. But things started to amp up when Star Trek: The Next Generation (aka TNG) debuted in 1987, with Sir Patrick Stewart in the lead. It became one of the most syndicated shows on television—which is how I discovered it in mid-1990s India on the Star network. It fundamentally impacted my life, inspiring me to become the technology writer I am today.

But beyond my own story, the show heralded technological concepts that are becoming increasingly real. The LCARS computer on the Galaxy-class USS Enterprise-D is essentially the foundation of what Google is today. Google’s former head of search, Amit Singhal, often said that the company was “trying to build the Star Trek computer”.

The future of touch: Researchers uncover physical limitation in haptic holography

Haptic holography promises to bring virtual reality to life, but a new study reveals a surprising physical obstacle that will need to be overcome.

A research team at UC Santa Barbara has discovered a new phenomenon that underlies emerging holographic haptic displays, and could lead to the creation of more compelling virtual reality experiences. The team’s findings are published in the journal Science Advances.

Holographic haptic displays use phased arrays of ultrasound emitters to focus ultrasound in the air, allowing users to touch, feel and manipulate three-dimensional virtual objects in mid-air using their bare hands, without the need for a physical device or interface. While these displays hold great promise for use in various application areas, including augmented reality, virtual reality and telepresence, the tactile sensations they currently provide are diffuse and faint, feeling like a “breeze” or “puff of air.”
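The focusing itself is straightforward wave physics: each emitter is driven with a phase offset that compensates its travel time to the desired focal point, so all the waves arrive in step and interfere constructively there. A minimal sketch follows; the array size, pitch, and the 40 kHz frequency are typical values for such displays, assumed here rather than taken from the study.

# Sketch of the core focusing math behind holographic haptic displays:
# drive each ultrasound emitter with a phase that compensates its
# distance to the focal point. (Illustrative; 40 kHz and a 16x16 grid
# are typical but assumed values.)
import numpy as np

C_AIR = 343.0        # speed of sound in air, m/s
FREQ = 40_000.0      # emitter frequency, Hz
WAVELENGTH = C_AIR / FREQ

# 16x16 emitter grid with 10 mm pitch, lying in the z=0 plane.
pitch = 0.010
xs = (np.arange(16) - 7.5) * pitch
emitters = np.array([(x, y, 0.0) for x in xs for y in xs])

focus = np.array([0.0, 0.0, 0.20])   # focal point 20 cm above the array

# Phase per emitter: phi = -2*pi * distance / wavelength (mod 2*pi),
# so every wavefront reaches `focus` with the same phase.
dists = np.linalg.norm(emitters - focus, axis=1)
phases = (-2 * np.pi * dists / WAVELENGTH) % (2 * np.pi)
print(phases.reshape(16, 16)[:2, :2])  # corner of the phase map, radians

Moving the focal point in mid-air is then just recomputing this phase map each frame, which is what lets a user “touch” a virtual object wherever it happens to be.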

Meta works on a flurry of AR/VR devices over the next 3 to 4 years

Meta’s AR glasses could be launched in 2027.

Mark Zuckerberg’s Meta Platforms is doubling down on its virtual reality (VR) products and plans to rope in augmented reality (AR) experiences as it looks to define its position in the technology industry in the years ahead. Thousands of employees in Meta’s Reality Labs division were recently presented with a roadmap for the company’s products, which was then shared with The Verge.


VR, AR, and neural interfaces

Although Zuckerberg has spoken mainly of the metaverse the company would build as the future of the internet, Meta now seems to have taken its foot off the pedal on building the metaverse itself, focusing instead on the tools and on improving them.

Coming out later this year is the Meta Quest 3, the company’s flagship product. It is expected to be twice as powerful as, yet half the thickness of, its predecessor, the Quest 2. Meta has sold more than 20 million Quest headsets so far, so Quest 3 sales will be a benchmark for whether customers remain interested in these products.

Priced at $400, the Quest 3 will also feature front-facing cameras that make it less immersive than its predecessors but add the ability to deliver mixed reality experiences. Meta hopes this will prompt users to keep the headset on for longer, and it plans to ship 41 new apps and games with the device.
