Jul 27, 2024

‘Kink state’ control may provide pathway to quantum electronics

Posted by in categories: electronics, quantum physics

The key to developing quantum electronics may have a few kinks. According to a team led by researchers at Penn State, that’s not a bad thing when it comes to the precise control needed to fabricate and operate such devices, including advanced sensors and lasers.

Jul 27, 2024

Black Holes Can’t Be Created by Light

Posted by in categories: climatology, cosmology, quantum physics

The formation of a black hole from light alone is permitted by general relativity, but a new study says quantum physics rules it out.

Black holes are known to form from large concentrations of mass, such as burned-out stars. But according to general relativity, they can also form from ultra-intense light. Theorists have speculated about this idea for decades. However, calculations by a team of researchers now suggest that light-induced black holes are not possible after all because quantum-mechanical effects cause too much leakage of energy for the collapse to proceed [1].

The extreme density of mass produced by a collapsed star can curve spacetime so severely that no light entering the region can escape. The formation of a black hole from light is possible according to general relativity because mass and energy are equivalent, so the energy in an electromagnetic field can also curve spacetime [2]. Putative electromagnetic black holes have become popularly known as kugelblitze, German for “ball lightning,” following the terminology used by Princeton University physicist John Wheeler in early studies of electromagnetically generated gravitational fields in the 1950s [3].
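The mass-energy equivalence argument can be made concrete with a back-of-the-envelope calculation. Since an energy E gravitates like a mass M = E/c², light confined within its own Schwarzschild radius r = 2GE/c⁴ would, classically, collapse into a black hole. The sketch below is purely illustrative (the micron-scale target radius is an arbitrary choice, not a figure from the study), but it shows why a kugelblitz is so extreme: the required energy is on the order of thousands of years of the Sun's total output.

```python
# Illustrative estimate only: energy required for a light-based black hole.
# Mass-energy equivalence means an energy E gravitates like M = E / c^2,
# so its Schwarzschild radius is r_s = 2 * G * E / c^4.

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius_from_energy(energy_joules: float) -> float:
    """Schwarzschild radius (meters) of a pure-energy concentration."""
    return 2 * G * energy_joules / c**4

# Energy needed to form a kugelblitz one micrometer across (arbitrary target):
r_target = 1e-6  # meters
energy = r_target * c**4 / (2 * G)

print(f"Energy for a 1-micron kugelblitz: {energy:.2e} J")
print(f"Radius check: {schwarzschild_radius_from_energy(energy):.2e} m")
```

The answer comes out near 6 × 10³⁷ joules, which is far beyond any conceivable laser system; the new study's point is that even in principle, quantum effects (Schwinger pair production draining the field's energy) prevent the collapse.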

Jul 27, 2024

Iterative Process Builds Near-Perfect Atom Array

Posted by in categories: computing, particle physics, quantum physics

In most neutral-atom quantum computers, atoms are held in arrays of optical tweezers. Researchers typically populate the arrays stochastically, meaning that whether a given site receives an atom is down to chance. Atoms can later be rearranged individually, but the total number of atoms depends on the success of the initial loading.

The Atom Computing team developed an iterative process to fill an array to capacity. Instead of filling the array directly, the researchers first stochastically populated a second “reservoir” array. They then transferred atoms one by one from this reservoir to the target array using an optical tweezer. Between each loading step, the researchers imaged both arrays to determine which sites in each array were occupied. This step required temporarily switching off the tweezers and holding the atoms in an optical lattice formed from interfering laser beams.

The researchers showed that this sequence could be repeated as many times as necessary without losing atoms from the target array. They also showed that they could limit atom loss during the imaging step by enhancing the lattice strength using optical cavities. This enhancement allowed the atoms to be more strongly confined without increasing the optical lattice’s laser-power requirements.
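The loading loop described above can be sketched as a toy simulation. This is not Atom Computing's control code, and all parameters (array sizes, per-site loading probability) are invented; it only illustrates why iterating converges: each round, a fresh stochastically loaded reservoir tops up whatever target sites remain empty, and with lossless transfer the target can only gain atoms.

```python
import random

def fill_array(target_size: int, reservoir_size: int,
               load_prob: float = 0.5, seed: int = 0) -> int:
    """Toy model of iterative reservoir loading (illustrative only).

    Each round: stochastically load a reservoir array, "image" both arrays
    to find occupied sites, then move reservoir atoms into empty target
    sites one tweezer transfer at a time. Returns the number of rounds
    needed to fill the target, assuming lossless imaging and transfer.
    """
    rng = random.Random(seed)
    target = [False] * target_size
    rounds = 0
    while not all(target):
        rounds += 1
        # Stochastic loading: each reservoir site holds an atom with prob. load_prob
        reservoir = [rng.random() < load_prob for _ in range(reservoir_size)]
        # Imaging reveals occupancy; transfer one atom per empty target site
        for i, occupied in enumerate(target):
            if not occupied and any(reservoir):
                reservoir[reservoir.index(True)] = False  # pick up a reservoir atom
                target[i] = True                          # drop it into the target
    return rounds

print(fill_array(target_size=100, reservoir_size=100, load_prob=0.5))
```

With a 50% per-site loading probability, a 100-site target typically fills in a handful of rounds, since each iteration roughly halves the number of remaining vacancies.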

Jul 27, 2024

Occipital-temporal cortical tuning to semantic and affective features of natural images predicts associated behavioral responses

Posted by in category: futurism

The mechanisms of recognition and response to emotional stimuli are not fully understood. Here, the authors reveal tuning to semantic and emotional image features within the occipital-temporal cortex that efficiently encodes information suited to guiding behavior.

Jul 27, 2024

Study: AI “inbreeding” may cause model collapse for tools like ChatGPT, Microsoft Copilot

Posted by in category: robotics/AI

It’s like Game of Thrones, but for artificial intelligence large language models.

Jul 27, 2024

New technology allows robots to survive through self-amputation

Posted by in category: robotics/AI

The technology not only aids robot survival but also enables dynamic shape-changing.

Jul 27, 2024

Models, metaphors and minds

Posted by in categories: biological, computing, information science, life extension, neuroscience

The idea of the brain as a computer is everywhere, so much so that we have forgotten it is a model and not the reality. It's a metaphor that has led some to believe that in the future they'll be uploaded to the digital ether and thereby achieve immortality. It's also a metaphor that garners billions of dollars in research funding every year. Yet researchers argue that when we dig down into our grey matter, our biology is anything but algorithmic. And increasingly, critics contend that the model of the brain as computer is sending scientists (and their resources) nowhere fast. Is our attraction to the idea of the brain as computer an accident of current human technology? Can we find a better metaphor that might lead to a new paradigm?

Jul 26, 2024

Why Can’t we Admit Age is a (Biologically) Meaningful Number?

Posted by in categories: biological, biotech/medical, life extension, neuroscience

If there’s one phrase the June 2024 U.S. presidential debate may entirely eliminate from the English vocabulary, it’s that age is a meaningless number. Often attributed to boxer Muhammad Ali, who grudgingly retired at age 39, this centuries-old idea has had far-reaching consequences in global politics, as life expectancy has more than doubled since the start of the 20th century and presidents’ ages have shifted upwards. We say “age is what we make of it” to ourselves and to policymakers, and think it’s a harmless way to dignify the aged. But how true is it? And if it isn’t true, why would we lie?

For centuries, we have confused our narrative of what aging should be with what its ruthless biology is. Yet pretending that biological age does not matter is at best myopic, and at worst, it’s a dangerous story for our governments, families, and economies. In just 11 years — between 2018 and 2029 — U.S. spending on Social Security and Medicare will more than double, from $1.3 trillion to $2.7 trillion per year. As we age, our odds of getting sick and dying from basically anything go up exponentially. If smoking increases our chances of getting cancer by a factor of 15, aging does so 100-fold. At age 65, less than 5% of people are diagnosed with Alzheimer’s. Beyond age 85, nearly half the population has some form of dementia. Biological aging is the biggest risk factor for most chronic diseases; it’s a neglected factor in global pandemics; and it even plays a role in rare diseases.
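The exponential claim has a well-known quantitative form: under the Gompertz law of mortality, adult death rates roughly double every eight years. The sketch below uses that rule of thumb (the doubling time and reference age are textbook approximations, not figures from this article) to show how quickly relative risk compounds.

```python
# Gompertz-style mortality growth (illustrative parameters, not from the article):
# adult human mortality rates roughly double every ~8 years.
DOUBLING_TIME = 8.0  # years

def relative_mortality(age: float, reference_age: float = 25.0) -> float:
    """Mortality rate at `age` relative to `reference_age` under exponential growth."""
    return 2 ** ((age - reference_age) / DOUBLING_TIME)

for age in (25, 45, 65, 85):
    print(f"age {age}: ~{relative_mortality(age):.0f}x the risk at age 25")
```

By age 85 this puts relative risk well over 100 times the young-adult baseline, which is the same order of magnitude as the 100-fold figure quoted above.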

This explains why in hospitals, if there’s one marker next to a patient’s name, it’s their age. How many birthday candles we have blown out is an archaic surrogate marker of biological aging. Yet it’s the best we have. Chronological age is so telling of overall health that physicians everywhere rely on it for life-or-death decisions, from evaluating the risks of cancer screening to rationing hospital beds.

Jul 26, 2024

Fusion Sparks an Energy Revolution

Posted by in category: energy

After hitting a power-output milestone, fusion technology is ready to graduate from small-scale lab experiment to full-sized power plant.

Jul 26, 2024

Unlock Gene Networks Using Limited Data with AI Model Geneformer

Posted by in categories: biotech/medical, genetics, robotics/AI

Geneformer is a recently introduced and powerful AI model that learns gene network dynamics and interactions using transfer learning from vast single-cell transcriptome data. This tool enables researchers to make accurate predictions about gene behavior and disease mechanisms even with limited data, accelerating drug target discovery and advancing understanding of complex genetic networks in various biological contexts.

Developed by researchers at the Broad Institute of MIT and Harvard and their collaborators, the AI model Geneformer uses the highest-expressed genes in sc-RNA expression data to generate a dense representation of each cell, which can be used as features for various downstream predictive tasks. What makes Geneformer unique, however, are the capabilities its architecture enables, even when trained on very little data.

Geneformer has a BERT-like transformer architecture and was pre-trained on data from about 30M single-cell transcriptomes across various human tissues. Its attention mechanism enables it to focus on the most relevant parts of the input data. With this context-aware approach, the model can make predictions by considering relationships and dependencies between genes.
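The "dense representation of each cell" starts from a rank-based encoding: each cell becomes an ordered list of genes, sorted by expression after normalizing against corpus-wide levels, and that gene sequence is what the transformer consumes like a sentence. The sketch below is a simplified illustration of that idea, not Geneformer's actual preprocessing pipeline; the gene names, counts, and median values are invented for the example.

```python
import numpy as np

def rank_encode_cell(expression: np.ndarray, gene_names: list,
                     gene_medians: np.ndarray, max_genes: int = 2048) -> list:
    """Simplified sketch of a rank-value encoding for one cell (illustrative).

    Each gene's expression is normalized by its corpus-wide level, then genes
    are sorted so the most distinctively expressed come first. The resulting
    rank-ordered gene list plays the role of a 'sentence' for the transformer.
    """
    normalized = expression / gene_medians        # de-emphasize ubiquitous housekeeping genes
    order = np.argsort(-normalized)               # highest normalized expression first
    ranked = [gene_names[i] for i in order if expression[i] > 0]
    return ranked[:max_genes]                     # truncate to the model's context length

# Toy cell: raw counts for four genes, plus invented corpus-wide medians
genes = ["ACTB", "CD3E", "INS", "GAPDH"]
counts = np.array([500.0, 30.0, 0.0, 400.0])
medians = np.array([450.0, 5.0, 2.0, 420.0])
print(rank_encode_cell(counts, genes, medians))
```

Note how CD3E, though expressed at a lower raw count than the housekeeping genes ACTB and GAPDH, ranks first after normalization; it is this kind of relative ordering, rather than raw counts, that lets the pre-trained model transfer to new tasks with little data.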
