
Detecting early-stage tumors with a blood sample

Current methods for cancer diagnosis are based on identifying biomarkers — molecules that reveal a particular state or process in the body — produced by the tumor or by associated proteins. Not surprisingly, these markers are more abundant once the tumor has already developed significantly. And the more advanced the tumor, the more difficult it is to find effective treatment options.

Now, a research team has developed a test that can detect early-stage solid tumors with just a blood sample. In addition, the test also provides information relevant to the choice of treatment.

To achieve this early detection, the team focused the test not on the markers produced by the tumor, but on the body’s defensive reaction to the cancer. Since the 19th century it has been known that the emergence of cancer cells causes changes in the immune system, and it was also known that these changes are more intense in cancer’s earliest stages. But they had never been used for diagnosis. The new study focuses on them, specifically on the changes in blood proteins derived from cancer’s disruption of the immune system.

But this approach posed a problem for the team: human blood contains more than 5,000 proteins, which makes it extremely difficult to analyze. So they used bioinformatics analysis and narrowed the scope of the study to five amino acids: lysine, tryptophan, tyrosine, cysteine, and free cysteine (cysteine not bound in disulphide bonds).

They then subjected the sample to reactions that emit fluorescence when light is applied to them — fluorogenic reactions — which revealed the exact concentration of each of these amino acids in the plasma. Using machine learning, they identified patterns in these concentrations that could be translated into diagnostic signals.

As they explain in the published article, they applied this technique to samples from 170 patients and were able to identify 78% of cancers with a 0% false positive rate.
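The pipeline described above — measure a handful of amino-acid concentrations, then score the pattern with a trained model — can be sketched as follows. This is a hypothetical illustration: the weights, bias, and threshold below are invented, standing in for parameters the study learned from patient data, and the real model is not described at this level of detail in the article.

```python
# Hypothetical sketch: scoring a plasma sample from the concentrations
# of the five amino acids. WEIGHTS, BIAS, and the threshold are invented
# placeholders for a model learned from patient data.

import math

# Features: concentrations (arbitrary units) of the five amino acids.
FEATURES = ["lysine", "tryptophan", "tyrosine", "cysteine", "free_cysteine"]

# Invented parameters standing in for a trained model.
WEIGHTS = {"lysine": -0.8, "tryptophan": -1.2, "tyrosine": 0.5,
           "cysteine": 0.9, "free_cysteine": 1.1}
BIAS = -0.2

def cancer_score(sample: dict) -> float:
    """Logistic score in (0, 1); higher suggests a tumor-associated pattern."""
    z = BIAS + sum(WEIGHTS[f] * sample[f] for f in FEATURES)
    return 1.0 / (1.0 + math.exp(-z))

def classify(sample: dict, threshold: float = 0.9) -> bool:
    # A high decision threshold trades some sensitivity for a near-zero
    # false-positive rate, mirroring the 78% detection / 0% false-positive
    # trade-off reported in the article.
    return cancer_score(sample) >= threshold
```

The high threshold reflects the design goal implied by the reported results: a screening test would rather miss some cancers than flag healthy patients.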

Chinese researchers unveil world’s largest-scale brain-like computer Darwin Monkey

Chinese researchers on Saturday unveiled Darwin Monkey, a new-generation, super-large-scale brain-like computer. It is the world's first neuromorphic computer built on dedicated neuromorphic chips with over 2 billion neurons, and it can mimic the workings of a macaque monkey's brain.

Developed by the State Key Laboratory of Brain-Machine Intelligence at Zhejiang University in East China's Zhejiang Province, Darwin Monkey, also known as Wukong, supports over 2 billion spiking neurons and more than 100 billion synapses, with a neuron count approaching that of a macaque brain. It consumes approximately 2,000 watts of power under typical operating conditions, the Science and Technology Daily reported.

The human brain is like an extremely efficient “computer.” Brain-inspired computing applies the working principles of biological neural networks to computer system design, aiming to build computing systems that, like the brain, feature low power consumption, high parallelism, high efficiency, and intelligence.
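The basic unit such neuromorphic systems implement in hardware is the spiking neuron. A minimal software sketch of one common model, the leaky integrate-and-fire (LIF) neuron, is below; the parameters are illustrative and are not the neuron model actually used by the Darwin chips.

```python
# Toy leaky integrate-and-fire (LIF) neuron: the kind of spiking unit
# neuromorphic chips implement in hardware. Parameters are illustrative,
# not those of the Darwin Monkey hardware.

def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, leak=0.9, gain=0.5):
    """Return the time steps at which the neuron spikes.

    Each step, the membrane potential decays toward rest (leak),
    integrates the input (gain * current), and emits a spike followed
    by a reset whenever it crosses the threshold.
    """
    v = v_rest
    spikes = []
    for t, current in enumerate(input_current):
        v = v_rest + leak * (v - v_rest) + gain * current
        if v >= v_thresh:
            spikes.append(t)
            v = v_rest  # reset after the spike
    return spikes
```

A weak constant input makes the neuron fire slowly and periodically, while a strong input makes it fire on every step — the event-driven, sparse activity that gives neuromorphic hardware its low power draw.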

Gaussian processes provide a new path toward quantum machine learning

Neural networks revolutionized machine learning on classical computers, making possible self-driving cars, language translation, and modern artificial intelligence software. It is no wonder, then, that researchers wanted to transfer this same power to quantum computers — but all attempts to do so brought unforeseen problems.

Recently, however, a team at Los Alamos National Laboratory developed a new way to bring these same capabilities to quantum computers by leveraging something called the Gaussian process.

“Our goal for this project was to see if we could prove that genuine quantum Gaussian processes exist,” said Marco Cerezo, the Los Alamos team’s lead scientist. “Such a result would spur innovations and new forms of performing quantum machine learning.”
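For readers unfamiliar with the classical counterpart the quantum version generalizes: a Gaussian process is a distribution over functions, and its posterior mean gives predictions that pass through the training data. The sketch below implements that classical posterior mean from scratch; the RBF kernel and noise level are illustrative choices, and this says nothing about the quantum construction in the paper.

```python
# Classical Gaussian-process regression in miniature: the posterior mean
# under an RBF (squared-exponential) kernel. Pure standard library;
# kernel and noise choices are illustrative.

import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential (RBF) kernel: similarity decays with distance."""
    return math.exp(-((x1 - x2) ** 2) / (2 * length ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior_mean(xs, ys, x_star, noise=1e-6):
    """Posterior mean at x_star given noisy observations (xs, ys)."""
    K = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(K, ys)                    # alpha = K^{-1} y
    k_star = [rbf(x, x_star) for x in xs]   # covariances with the query point
    return sum(k * a for k, a in zip(k_star, alpha))
```

At the training points the posterior mean reproduces the observations almost exactly (up to the small noise term), and between them it interpolates smoothly — the property that makes Gaussian processes attractive as a principled, uncertainty-aware alternative to neural networks.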

New MIT Device Could Be Key to Faster, More Energy-Efficient Computing and Communications

Solves major problems associated with integrating electronics, photonics in microchip systems.

The MIT device in the green callout could be key to faster, more energy-efficient data communication. It solves a major problem associated with packaging an electrical chip (black, center) with photonic chips (the eight surrounding squares). This image also shows an automated tool placing the final photonic chip into position. Image courtesy Drew Weninger, MIT.

The future of digital computing and communications will involve both electronics—manipulating data with electricity—and photonics, or doing the same with light. Together the two could allow exponentially more data traffic across the globe in a process that is also more energy efficient.

“The bottom line is that integrating photonics with electronics in the same package is the transistor for the 21st century. If we can’t figure out how to do that, then we’re not going to be able to scale forward,” says Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering at MIT and director of the MIT Microphotonics Center.

Enter FUTUR-IC, a new research team based at MIT and funded by the National Science Foundation’s Convergence Accelerator through a cooperative agreement. “Our goal is to build a microchip industry value chain that is resource-efficient,” says Anu Agarwal, head of FUTUR-IC and a principal research scientist at the Materials Research Laboratory (MRL).

Psychology research reveals how the brain constructs emotional experiences

Arousal—how alert or excited one feels—is a basic part of emotions, along with whether those emotions are positive or negative. Scientists still don’t fully understand how the brain creates these feelings of arousal, or if the brain uses the same or different systems for emotional arousal compared to states such as being awake or having a bodily reaction.

A recent study led by Professor Benjamin Becker from the Department of Psychology at The University of Hong Kong (HKU) and published in Nature Communications uncovers a brain signature that reveals how arousal is consciously experienced — and whether this experience is distinct from automatic bodily reactions.

Using a powerful combination of AI-driven modeling, advanced brain imaging, and close-to-real-life experimental paradigms, the team was able to uncover a brain signature that precisely measures emotional intensity (arousal) across diverse situations ranging from seeing a loved one to watching a horror movie. Notably, the team was able to disentangle the conscious emotional experience from automatic physiological responses such as sweating or heart racing.

Demis Hassabis on our AI future: ‘It’ll be 10 times bigger than the Industrial Revolution — and maybe 10 times faster’

The head of Google’s DeepMind says artificial intelligence could usher in an era of ‘incredible productivity’ and ‘radical abundance’. But who will it benefit? And why does he wish the tech giants had moved more slowly?

Apple CEO: AI Is ‘As Big or Bigger’ Than the Internet, Smartphones

Apple CEO Tim Cook told employees at an all-hands meeting that the AI revolution is “as big or bigger” than the internet, smartphones, cloud computing, and apps. According to Bloomberg’s Power On newsletter, Cook said, “Apple must do this,” adding that this is “ours to grab.” He expressed hopes that, though Apple has been relatively late in rolling out AI tools — Apple Intelligence was only unveiled in 2024 — it could still dominate its rivals.

“We’ve rarely been first,” the CEO told staff. “There was a PC before the Mac; there was a smartphone before the iPhone; there were many tablets before the iPad; there was an MP3 player before the iPod.”

But Cook argued that Apple invented the “modern” versions of those products, adding: “This is how I feel about AI.” He also discussed practical steps Apple is taking to make these plans a reality. Cook said Apple is investing in AI in a “big way,” and that 40% of the 12,000 employees hired last year are set to work on research and development.
