Like the better-known prostate-specific antigen (PSA), prostate-specific membrane antigen (PSMA) is a biomarker that can tell physicians much about a patient’s metastatic prostate cancer. PSMA is a protein on the cell surface of most prostate cancers; scanning for it with positron emission tomography (PET) can indicate where in the body prostate cancer has spread, and it can be targeted with a newly approved radioactive therapy. In 15%–20% of patients with castration-resistant prostate cancer, however, PSMA production stops at advanced stages of the disease.

In a new study in the journal Nature Cancer, Dana-Farber Cancer Institute scientists shed new light on the mechanism that raises and lowers PSMA expression in prostate cancer cells. The findings may help physicians select PSMA-targeting therapies for specific patients.

It has long been known that the androgen receptor (AR), a receptor protein activated by the hormone androgen, controls the production of PSMA in prostate cancer cells. In the Nature Cancer study, researchers led by Dana-Farber’s Himisha Beltran, MD, and Martin Bakht, Ph.D., found that PSMA expression is lower in liver metastases than in other parts of the body, regardless of AR expression.

IN THE NEAR FUTURE, we should anticipate certain technological developments that will forever change our world. For instance, today’s text-based ChatGPT will evolve to give rise to personal “conversational AI” assistants installed in smart glasses and contact lenses that will gradually phase out smartphones. Technological advances in fields such as AI, AR/VR, bionics, and cybernetics will eventually lead to “generative AI”-powered immersive neurotechnology that enables you to create virtual environments and holographic messages directly from your thoughts, with your imagination serving as the “prompt engineer.” What will happen when everyone constantly broadcasts their mind?



Can the pursuit of experience lead to true enlightenment? Are we edging towards Experiential Nirvana on a civilizational level despite certain turbulent events?

Mixed reality (MR) and Augmented Reality (AR) technologies merge the real world with computer-generated elements, allowing users to interact with their surroundings in more engaging ways. In recent years, these technologies have enhanced education and specialized training in numerous fields, helping trainees to test their skills or make better sense of abstract concepts and data.

Researchers at the University of Calgary have been working to develop interfaces and systems that could enhance MR visualizations. In a paper set to be presented at CHI 2023 LBW, they introduced HoloTouch, a system that can augment mixed reality graphics and charts using smartphones as physical proxies.

“To me, this paper was inspired for the most part by a work that I published during my final undergraduate year,” Neil Chulpongsatorn, one of the researchers who carried out the study, told Tech Xplore. “They both originated from my interest in mixed reality interactions for data representations.”

Holographic receptionists, robots and tea-delivering drones may be part of the workplace in just 30 years, according to new findings.

Employees may soon be spared from carrying out mundane tasks around the office as futuristic technologies blend into our daily lives.

Research conducted by office furniture supplier Furniture At Work claimed that fingerprint-accessible fridges, on-site babysitters and augmented reality (AR) glasses could also be used in 2050 offices.

The sense of touch may soon be added to the virtual gaming experience, thanks to an ultrathin wireless patch that sticks to the palm of the hand. The patch simulates tactile sensations by delivering electronic stimuli to different parts of the hand in a way that is individualized to each person’s skin.

Developed by researchers at City University of Hong Kong (CityU) with collaborators and described in the journal Nature Machine Intelligence (“Encoding of tactile information in hand via skin-integrated wireless haptic interface”), the patch has implications beyond virtual gaming, as it could also be used for robotics surgery and in prosthetic sensing and control.

‘Haptic’ gloves that simulate the sense of touch already exist, but they are bulky and wired, hindering the immersive experience in virtual and augmented reality settings. To improve the experience, researchers led by CityU biomedical engineer Yu Xinge developed an advanced, wireless, haptic interface system called ‘WeTac’.
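The idea of stimulation "individualized to each person's skin" can be illustrated with a short sketch. The function names, electrode labels, and threshold values below are hypothetical, not WeTac's actual interface: the point is simply that each electrode's drive current is scaled between a user's measured perception threshold and a safety ceiling.

```python
# Hypothetical sketch: scaling per-electrode stimulus currents to a user's
# measured sensation thresholds, as a skin-integrated haptic patch might do.
# Electrode names and current values are illustrative, not from the paper.

def calibrate(thresholds_mA, safety_ceiling_mA=2.0):
    """Return a function mapping desired intensities (0..1) per electrode
    to drive currents between the user's threshold and a safety ceiling."""
    def drive(intensities):
        currents = {}
        for electrode, level in intensities.items():
            t = thresholds_mA[electrode]
            # Linearly interpolate between perception threshold and ceiling,
            # so intensity 0 is just perceptible and 1 is the maximum allowed.
            currents[electrode] = t + level * (safety_ceiling_mA - t)
        return currents
    return drive

# Example: two palm sites with different measured perception thresholds.
drive = calibrate({"thumb_pad": 0.6, "palm_center": 0.9})
print(drive({"thumb_pad": 0.5, "palm_center": 1.0}))
```

Because thresholds differ across people and across locations on the hand, the same "intensity" request yields different currents per user and per electrode, which is the essence of the individualization the article describes.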

X-AR uses wireless signals and computer vision to enable users to perceive things that are invisible to the human eye (i.e., to deliver non-line-of-sight perception). It combines new antenna designs, wireless signal processing algorithms, and AI-based fusion of different sensors.

This design introduces three main innovations:

1) AR-conformal wide-band antenna that tightly matches the shape of the AR headset visor and provides the headset with Radio Frequency (RF) sensing capabilities. The antenna is flexible, lightweight, and fits on existing headsets without obstructing any of their cameras or the user’s field of view.
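The "AI-based fusion of different sensors" can be made concrete with a minimal sketch. This is not MIT's actual X-AR code; it shows one common fusion pattern, inverse-variance weighting, where a coarse RF-based position estimate and a camera-based estimate are combined according to how noisy each one is.

```python
# Illustrative sketch (not the actual X-AR implementation): fusing an
# RF-based 3-D position estimate with a camera-based one by
# inverse-variance weighting. Lower variance -> more weight.

def fuse(rf_pos, rf_var, cam_pos, cam_var):
    """Combine two noisy 3-D position estimates into one."""
    w_rf = 1.0 / rf_var
    w_cam = 1.0 / cam_var
    total = w_rf + w_cam
    return tuple((w_rf * r + w_cam * c) / total
                 for r, c in zip(rf_pos, cam_pos))

# Example values: RF localizes a hidden, tagged item coarsely (higher
# variance); vision refines the estimate once the item is visible.
fused = fuse(rf_pos=(1.0, 2.0, 0.5), rf_var=0.04,
             cam_pos=(1.2, 2.1, 0.5), cam_var=0.01)
print(fused)
```

The appeal of this pattern for non-line-of-sight perception is that the RF estimate still contributes when the camera sees nothing, and the camera dominates when its estimate is sharp.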

MIT researchers customized a Microsoft HoloLens to let them view objects using radio frequencies. If you have ever wanted the superpower of X-ray vision, you may have assumed it was reserved for natives of Krypton. Thanks to this new technology, however, seeing through objects is becoming a matter of engineering rather than superpowers, making Iron Man or Batman a more apt comparison than Supergirl.

Conor Russomanno, founder and CEO of OpenBCI; Eva Esteban, embedded software engineer at OpenBCI

Galea is an award-winning platform that merges next-generation biometrics with mixed reality. It is the first device to integrate a wide range of physiological signals, including EEG, EMG, EDA, PPG, and eye-tracking, into a single headset. In this session, Conor and Eva will provide a live demonstration of the device and its capabilities, showcasing its potential for a variety of applications, from gaming to training and rehabilitation. They will give an overview of the different hardware and software components of the system, highlighting how it can be used to analyze user experiences in real time. Attendees will get an opportunity to ask questions at the end.

Haptic holography promises to bring virtual reality to life, but a new study reveals a surprising physical obstacle that will need to be overcome.

A research team at UC Santa Barbara has discovered a new phenomenon that underlies emerging holographic haptic displays, and could lead to the creation of more compelling virtual reality experiences. The team’s findings are published in the journal Science Advances.

Holographic haptic displays use phased arrays of ultrasound emitters to focus ultrasound in the air, allowing users to touch, feel and manipulate three-dimensional virtual objects in mid-air using their bare hands, without the need for a physical device or interface. While these displays hold great promise for use in various application areas, including augmented reality, virtual reality and telepresence, the tactile sensations they currently provide are diffuse and faint, feeling like a “breeze” or “puff of air.”
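How a phased array focuses ultrasound in mid-air can be sketched in a few lines: each emitter is assigned a phase offset that cancels its travel time to the focal point, so all waves arrive in phase there and interfere constructively. The array geometry and the 40 kHz frequency below are illustrative assumptions (40 kHz is typical for such arrays), not details from the study.

```python
import math

# Minimal sketch of phased-array focusing: give each emitter a phase
# offset that compensates for its path length to the focal point, so all
# waves arrive in phase there. Geometry and frequency are illustrative.

SPEED_OF_SOUND = 343.0   # m/s, air at ~20 C
FREQ = 40_000.0          # Hz, a common frequency for airborne ultrasound

def focus_phases(emitters, focal_point):
    """Return the phase offset (radians, in [0, 2*pi)) for each emitter."""
    phases = []
    for emitter in emitters:
        d = math.dist(emitter, focal_point)  # path length to the focus
        # Advance the phase by the travel delay 2*pi*f*d/c (mod 2*pi).
        phases.append((-2.0 * math.pi * FREQ * d / SPEED_OF_SOUND)
                      % (2.0 * math.pi))
    return phases

# A small 1-D line of emitters spaced 1 cm apart, focusing 10 cm above
# the array's center.
array = [(x * 0.01, 0.0, 0.0) for x in range(-2, 3)]
phases = focus_phases(array, (0.0, 0.0, 0.10))
print(phases)
```

Emitters placed symmetrically about the focal axis get identical phases, as expected; steering the focal point in real time is what lets these displays render touchable shapes in mid-air.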

Meta’s AR glasses could be launched in 2027.

Mark Zuckerberg’s Meta Platforms is doubling down on its virtual reality (VR) products and plans to fold in augmented reality (AR) experiences, looking to define its position in the technology industry in the years ahead. Thousands of employees in Meta’s Reality Labs division were recently presented with a roadmap for the company’s products, which was then shared with The Verge.


VR, AR, and neural interfaces

Although Zuckerberg has spoken mainly of the metaverse the company would build as the future of the internet, Meta now seems to have eased off building the metaverse itself, focusing instead on improving the underlying tools.

Coming out later this year is the Meta Quest 3, the company’s flagship product. It is expected to be twice as powerful yet half the thickness of its predecessor, the Quest 2. Meta has sold more than 20 million Quest headsets so far, so Quest 3 sales will be a benchmark for whether customers remain interested in these products.