
Lithium, the lightest metal on the periodic table, plays a pivotal role in modern life. Its low weight and high energy density make it ideal for electric vehicles, cellphones, laptops and military technologies where every ounce counts. As demand for lithium skyrockets, concerns about supply and reliability are growing.
To help meet surging demand and guard against possible supply chain problems, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have developed an innovative membrane technology that efficiently extracts lithium from water. Several team members also hold joint appointments with the Pritzker School of Molecular Engineering (PME) at the University of Chicago.
The findings appear in the journal Advanced Materials.
Growth in 2.5D and 3D packaging solutions has accelerated off-the-board technology and the components that leverage it, not only in the fastest digital processors but also in proprietary ASICs and application processors. As high-bandwidth digital channels approach the practical limits of copper interconnects, silicon photonics and on-PCB/in-package optical interconnects may emerge as the next transformative wave of off-the-board technology.
This opinion is shared by insiders within the PCB and packaging side of the industry.
“Off the board technology is growing at an amazing rate, and isn’t being replaced by optical solutions, it’s enabling more optical solutions,” said Joe Dickson, senior VP of chip-to-chip reliability and innovation at WUS PCB International. “They are not competition, they are tools to go much farther than we can today.”
Many Chinese semiconductor fab projects failed due to a lack of technical expertise amid overambitious goals: some startups aimed at advanced nodes like 14nm and 7nm without experienced R&D teams or access to the necessary wafer fab equipment. These efforts were often heavily reliant on provincial government funding, with little oversight or industry knowledge, which led to collapse when finances dried up or scandals emerged. Some fab ventures were plagued by fraud or mismanagement, with executives vanishing or being arrested, sometimes with local officials involved.
To add to these problems, U.S. export restrictions in place since 2019 have blocked Chinese entities’ access to the critical chipmaking equipment required to produce chips at 10nm-class nodes and below, effectively halting progress on advanced fabs. Worsening U.S.-China tensions and global market shifts further undercut the viability of many of these projects.
So, let’s go over some of China’s most ambitious fab projects, many of which have fallen into oblivion or have become dreaded zombie fabs.
PRESS RELEASE — Quantum computers have operated under a significant limitation: they can run only one program at a time. These million-dollar machines demand exclusive use even for the smallest tasks, leaving much of their expensive and fast-running hardware idle and forcing researchers to endure long queues.
Columbia Engineering researchers have developed HyperQ, a novel system that enables multiple users to share a single quantum computer simultaneously through isolated quantum virtual machines (qVMs). This key development brings quantum computing closer to real-world usability—more practical, efficient, and broadly accessible.
“HyperQ brings cloud-style virtualization to quantum computing,” said Jason Nieh, professor of computer science at Columbia Engineering and co-director of the Software Systems Laboratory. “It lets a single machine run multiple programs at once—no interference, no waiting in line.”
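The sharing model described above can be illustrated with a minimal sketch: several isolated virtual machines submit circuits, and a hypervisor-style scheduler interleaves them onto one shared backend instead of granting any user exclusive access. All names here (`QVM`, `Hypervisor`, `run_next`) and the round-robin policy are illustrative assumptions, not HyperQ’s actual interface or scheduling algorithm.

```python
# Toy sketch of sharing one quantum computer among isolated virtual machines.
from collections import deque
from dataclasses import dataclass, field

@dataclass
class QVM:
    """A quantum virtual machine holding one user's queued circuits."""
    owner: str
    circuits: deque = field(default_factory=deque)

    def submit(self, circuit: str) -> None:
        self.circuits.append(circuit)

class Hypervisor:
    """Interleaves circuits from many qVMs onto a single shared backend."""
    def __init__(self) -> None:
        self.qvms: list[QVM] = []

    def attach(self, qvm: QVM) -> None:
        self.qvms.append(qvm)

    def run_next(self) -> list[str]:
        # One scheduling round: take at most one circuit per qVM,
        # so no single user monopolizes the machine.
        batch = []
        for qvm in self.qvms:
            if qvm.circuits:
                batch.append(f"{qvm.owner}:{qvm.circuits.popleft()}")
        return batch

hv = Hypervisor()
alice, bob = QVM("alice"), QVM("bob")
hv.attach(alice)
hv.attach(bob)
alice.submit("bell_pair")
alice.submit("grover")
bob.submit("qft")

print(hv.run_next())  # ['alice:bell_pair', 'bob:qft']
print(hv.run_next())  # ['alice:grover']
```

In this toy version, both users make progress in the first round rather than one waiting for the other’s entire queue to drain, which is the queuing problem the press release describes.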
A new microchip invented by Scripps Research scientists can reveal how a person’s antibodies interact with viruses—using just a drop of blood. The technology offers researchers faster, clearer insights that could help accelerate vaccine development and antibody discovery.
“This lets us take a quick snapshot of antibodies as they are evolving after a vaccine or pathogen exposure,” says Andrew Ward, professor in the Department of Integrative Structural and Computational Biology at Scripps Research and senior author of the new paper published in Nature Biomedical Engineering on June 3, 2025. “We’ve never been able to do that on this timescale or with such tiny amounts of blood before.”
When someone is infected with a virus, or receives a vaccine, their immune system creates new antibodies to recognize the foreign invader. Some antibodies work well against the pathogen, while others attach to it only weakly. Figuring out exactly which parts of the virus the best antibodies stick to is key information for scientists trying to optimize vaccines, since they want to design vaccines that elicit strong, reliable immune responses.
Research led by Thilo Womelsdorf, professor of psychology and biomedical engineering at the Vanderbilt Brain Institute, could revolutionize how brain-computer interfaces are used to treat disorders of memory and cognition.
The study, “Adaptive reinforcement learning is causally supported by anterior cingulate cortex and striatum,” was published June 10, 2025, in the journal Neuron.
According to the researchers, neurologists use electrical brain-computer interfaces (BCIs) to help patients with Parkinson’s disease and spinal cord injuries when drugs and other rehabilitative interventions are not effective. For these disorders, the researchers say, brain-computer interfaces have become electroceuticals that substitute for pharmaceuticals by directly modulating dysfunctional brain signals.
Shading brings 3D forms to life, beautifully carving out the shape of objects around us. Despite the importance of shading for perception, scientists have long been puzzled about how the brain actually uses it. Researchers from Justus-Liebig-University Giessen and Yale University recently came out with a surprising answer.
Previously, it was assumed that the brain interprets shading like a physics engine, somehow “reverse-engineering” the combination of shape and lighting that would recreate the shading we see. Not only is this extremely challenging even for advanced computers, but the visual brain is not built to solve that sort of problem. So the researchers decided to start instead from what is known about how the brain first processes signals from the eye.
“In some of the first steps of visual processing, the brain passes the image through a series of ‘edge-detectors,’ essentially tracing it like an etch-a-sketch,” Professor Roland W. Fleming of Giessen explains. “We wondered what shading patterns would look like to a brain that’s searching for lines.” This insight led to an unexpected, but clever short-cut to the shading inference problem.
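The edge-detector idea can be sketched in a few lines: sliding a tiny oriented filter across an image and recording where it responds. The filter, the synthetic gradient image, and the function names below are illustrative assumptions, not the study’s actual model.

```python
# Illustrative sketch: early visual processing as oriented edge detection.
import numpy as np

def edge_response(image: np.ndarray) -> np.ndarray:
    """Convolve each row with a simple [1, 0, -1] derivative filter,
    mimicking a horizontally tuned edge detector."""
    kernel = np.array([1.0, 0.0, -1.0])
    return np.stack([np.convolve(row, kernel, mode="same") for row in image])

# A smooth left-to-right luminance ramp, like shading on a curved surface.
shaded = np.tile(np.linspace(0.0, 1.0, 8), (4, 1))

response = edge_response(shaded)
# The detector responds all along the smooth gradient: to a brain searching
# for lines, shading itself reads as a pattern of faint oriented edges.
```

The point of the sketch is that an edge detector does not need a physical model of light and shape; it simply fires wherever luminance changes, so smooth shading produces its own characteristic pattern of responses.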
From a brain chip that enabled a paralyzed patient to move his hand to a Pentagon-sponsored technology designed to restore memories, several exciting technologies have been announced recently that could advance the field of neurology. Here are 10 examples.