College students with anxiety, depression and eating disorders may be more likely to start therapy, and to respond more positively to it, when it is offered via a digital app rather than through referrals to in-person campus clinics, according to a study led by Penn State researchers and published in the journal Nature Human Behaviour.
Globally, an estimated 40% to 60% of college students experience a mental health disorder at some point, and the need for campus counseling services has increased faster than institutions’ capacity to provide these services, according to the researchers.
The research team wanted to see if a proactive intervention using a digital therapy app could effectively treat anxiety disorders, depression and eating disorders, as well as address the increased need for psychological services.
A team of researchers from the Universities of Tübingen, Bayreuth, and Kassel, and the Polish Academy of Sciences has developed a method for precisely controlling the movement of magnetic microparticles based on their size. These suspended particles, known as colloidal particles, range in size from a few tens of nanometers to several micrometers. Controlling them is important for applications such as drug delivery, medical laboratory tests, and the synthesis of new materials. The team’s study has now been published in Physical Review Letters.
The new method involves positioning microparticles above a magnetic layer that is patterned like a chessboard. In previous studies, magnetic transport of the colloidal particles was limited to a specific height. At that height the magnetic forces balance out, and all particles move in the same way regardless of their size, so it was not possible to control the particles selectively based on size.
For years, quantum computers have lived under a huge bubble of hype, promising to revolutionize numerous fields, from medicine and battery design to materials science and cybersecurity. But realizing their potential on any serious practical level will only be possible if large numbers of qubits (the basic units of information) can interact with each other with high precision and flexibility.
One of the main things holding that back is that traditional qubits are fixed in place, meaning they can only talk to their immediate neighbors. But in a new paper published in Nature, scientists describe how they overcame this limitation by using mobile qubits that can be moved around a chip. Lars R. Schreiber at the JARA-FIT Institute for Quantum Information in Germany has also published a News & Views piece in the same journal.
Theories of quantum mechanics predict that some particles can exist in superpositions, which essentially means that they can be in more than one state at once. When a particle’s state is measured, however, this superposition appears to “collapse” into a single outcome, a phenomenon often referred to as the “measurement problem.”
In recent years, theoretical physicists have tried to explain why and how this collapse happens, leading to the introduction of models such as Continuous Spontaneous Localization (CSL) and the Diósi–Penrose model.
Both models predict that spontaneous quantum collapse should be accompanied by the emission of faint X-ray radiation. Detecting this radiation experimentally would therefore provide evidence supporting the models.
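To make the idea of superposition and measurement concrete, here is a minimal toy sketch in standard quantum mechanics: a qubit in an equal superposition, measured repeatedly according to the Born rule. This illustrates only the textbook picture of collapse, not the CSL or Diósi–Penrose dynamics (and not their predicted X-ray emission), which modify this picture.

```python
import numpy as np

# Toy illustration of superposition and measurement in standard
# quantum mechanics (Born rule). This does NOT model the CSL or
# Diosi-Penrose collapse dynamics discussed in the article.
rng = np.random.default_rng(0)

# A qubit in an equal superposition of |0> and |1>:
#   |psi> = (|0> + |1>) / sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: each outcome's probability is |amplitude|^2.
probs = np.abs(psi) ** 2  # [0.5, 0.5]

# Each measurement "collapses" the superposition to one outcome;
# repeating the experiment reveals the underlying probabilities.
outcomes = rng.choice([0, 1], size=10_000, p=probs)
freq_one = outcomes.mean()
print(f"Estimated P(1) over 10,000 measurements: {freq_one:.3f}")
```

Over many runs the measured frequency converges to 0.5, even though each individual measurement yields a single definite outcome.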
A new computational method could dramatically accelerate efforts to map the body’s cells in space, according to a study published in Nature Genetics. Spatial multi-omics technologies—often described as ultra-high-resolution maps of tissues—allow scientists to see not only which genes or proteins are active in a cell, but exactly where that activity occurs. That spatial context is critical for understanding complex organs such as the brain, immune tissues and developing embryos.
Unfortunately, capturing multiple molecular layers at once remains expensive and technically challenging, said David Gate, Ph.D., assistant professor in the Ken and Ruth Davee Department of Neurology’s Division of Behavioral Neurology, who was a co-author of the study.
“In practice, investigators end up with ‘mosaic’ datasets: different slices or batches that each capture only some of the layers, often from different technologies or labs, with batch effects and missing pieces,” said Gate, who also leads the Abrams Research Center on Neurogenomics.
Contemporary artificial intelligence (AI) systems, such as the models underpinning ChatGPT, image generators and AI-powered creative tools, draw inspiration from the human brain’s functions and organization. While many of these systems perform remarkably well on specific tasks, they still operate independently of the human brain.
Researchers at Princeton University set out to create a flexible electronic system that could be directly embedded with groups of living brain cells to create a hybrid biocomputing platform. The new hybrid device they developed, dubbed 3D-MIND, was introduced in a paper published in Nature Electronics.
“This work started with a growing challenge in modern AI,” Tian-Ming Fu, senior author of the paper, told Tech Xplore. “Today’s systems can do incredible things, but they consume enormous amounts of energy, so much that their power demand is starting to shape real-world infrastructure and raise environmental concerns.
As traditional computer chips reach their physical limits and artificial intelligence demands more energy than ever, University of Missouri researchers are rethinking how computers work by taking cues from the human brain. The timing is critical. Energy use from AI data centers is projected to double by the end of the decade, raising urgent questions about sustainability.
The solution may lie in neuromorphic computing, an approach that reimagines computer hardware to process information more like biological neural networks rather than conventional chips.
“One of the brain’s greatest advantages is its efficiency,” Suchi Guha, a professor of physics in Mizzou’s College of Arts and Science, said. “It performs incredibly complex tasks using about 20 watts of power—roughly the same as an old light bulb. By comparison, today’s computer architecture is extremely energy-intensive.”