
Finding information in the randomness of living matter

When describing collective properties of macroscopic physical systems, microscopic fluctuations are typically averaged out, leaving a description of the systems' typical behavior. While this simplification has its advantages, it fails to capture the important role of fluctuations, which can influence the dynamics in dramatic ways, as extreme catastrophic events such as volcanic eruptions and financial market collapses reveal.

On the other hand, comprehensively tracking the dynamics of individual microscopic degrees of freedom becomes too cumbersome even for systems with a moderate number of particles. To bridge these opposite ends of the scale, stochastic field theories are commonly used to characterize the dynamics of complex systems and the effects of microscopic fluctuations.
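The interplay between averaged behavior and surviving fluctuations can be illustrated with a minimal sketch: a free overdamped Langevin particle, simulated here in pure Python. The parameter values and the rule-based setup are illustrative assumptions, not taken from the study; the point is that averaging many trajectories removes the noise from the mean, while its imprint persists in the growing variance.

```python
import math
import random

def overdamped_langevin(n_steps, dt=1e-3, gamma=1.0, kT=1.0, seed=0):
    """Free overdamped Langevin dynamics: dx = sqrt(2*kT/gamma) dW.

    The noise amplitude follows the fluctuation-dissipation relation
    D = kT / gamma. No potential; the particle just diffuses.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * kT / gamma * dt)  # std dev of one noise increment
    x, traj = 0.0, [0.0]
    for _ in range(n_steps):
        x += sigma * rng.gauss(0.0, 1.0)
        traj.append(x)
    return traj

# Average over many independent trajectories: the mean displacement stays
# near zero (fluctuations cancel), but the variance grows as 2*D*t,
# here 2 * 1.0 * (1000 * 1e-3) = 2.0.
trajs = [overdamped_langevin(1000, seed=s) for s in range(500)]
finals = [t[-1] for t in trajs]
mean = sum(finals) / len(finals)
var = sum((f - mean) ** 2 for f in finals) / len(finals)
```

A field-theoretic description plays the same game at the level of densities rather than particle positions, keeping the noise term instead of discarding it.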

Due to their overwhelming complexity, predicting outcomes by analyzing these fluctuations in living or active-matter systems is not possible with the traditional methods of physics. Because these systems persistently consume energy, they exhibit dynamical traits that violate the laws of equilibrium thermodynamics, traits closely tied to the arrow of time.

Diamond quantum sensors improve spatial resolution of MRI

The 50-qubit simulation breaks the previous record of 48 qubits, set by Jülich scientists in 2019 on Japan's K computer. The new result highlights the extraordinary capabilities of JUPITER and provides a powerful testbed for exploring and validating quantum algorithms.

Simulating quantum computers is essential for advancing future quantum technologies. These simulations let researchers check experimental findings and experiment with new algorithmic approaches long before quantum hardware becomes advanced enough to run them directly. Key examples include the Variational Quantum Eigensolver (VQE), which can analyze molecules and materials, and the Quantum Approximate Optimization Algorithm (QAOA), used to improve decision-making in fields such as logistics, finance, and artificial intelligence.

Recreating a quantum computer on conventional systems is extremely demanding. As the number of qubits grows, the number of possible quantum states rises at an exponential rate. Each added qubit doubles the amount of computing power and memory required.

Although a typical laptop can still simulate around 30 qubits, reaching 50 qubits requires about 2 petabytes of memory, which is roughly two million gigabytes. ‘Only the world’s largest supercomputers currently offer that much,’ says Prof. Kristel Michielsen, Director at the Jülich Supercomputing Centre. ‘This use case illustrates how closely progress in high-performance computing and quantum research are intertwined today.’
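The quoted figures follow directly from the size of the statevector, as this sketch shows. The 16-bytes-per-amplitude default assumes double-precision complex numbers; reading the ~2 PB figure as roughly 2 bytes per amplitude is our interpretation, consistent with the compression techniques the team describes.

```python
# A full statevector of n qubits holds 2**n complex amplitudes, each taking
# `bytes_per_amplitude` bytes (16 for double-precision complex numbers).
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

# ~30 qubits fits in a laptop's RAM at full double precision:
laptop = statevector_bytes(30)        # 2**34 bytes = 16 GiB

# 50 qubits at full double precision would need 16 PiB; the ~2 PB quoted
# for JUPITER corresponds to roughly 2 bytes per amplitude, i.e. heavy
# compression of the statevector.
full = statevector_bytes(50)                           # 16 PiB
compressed = statevector_bytes(50, bytes_per_amplitude=2)  # ~2.25 PB
```

Each added qubit doubles `2**n`, which is exactly the "doubles the computing power and memory" scaling described above.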

The simulation replicates the intricate quantum physics of a real processor in full detail. Every operation – such as applying a quantum gate – affects more than 2 quadrillion complex numerical values, a ‘2’ with 15 zeros. These values must be synchronized across thousands of computing nodes in order to precisely replicate the functioning of a real quantum processor.


The JUPITER supercomputer set a new milestone by simulating 50 qubits, a breakthrough made possible by new memory and compression innovations. The team from the Jülich Supercomputing Centre achieved it in collaboration with NVIDIA specialists, using JUPITER, Europe's first exascale supercomputer, which began operation at Forschungszentrum Jülich in September.

Real-estate finance services giant SitusAMC breach exposes client data

SitusAMC, a company that provides back-end services for top banks and lenders, disclosed on Saturday a data breach it had discovered earlier this month that impacted customer data.

As a real-estate (commercial and residential) financing firm, SitusAMC handles back-office operations in areas like mortgage origination, servicing, and compliance for banks and investors.

The company generates around $1 billion in annual revenue from 1,500 clients, some of whom are banking giants like Citi, Morgan Stanley, and JPMorgan Chase.

From Generative To Agentic: The New Era Of AI Autonomy In 2026



Agentic AI is a form of artificial intelligence that does more than generate content: it can act, reason, collaborate, and execute on its own. This transforms AI's role from a limited tool into that of a collaborative coworker.

This shift affects various sectors, including cybersecurity, national defense, healthcare, key infrastructure, finance, supply chains, and corporate automation. Additionally, it accelerates the integration of robotics, neuromorphic systems, sensor-driven edge computing, and artificial intelligence.

Agentic AI is characterized by systems that can plan and pursue goals. It combines APIs and tools, engages with dynamic environments, makes decisions, applies reasoning, and continues to learn and adapt.
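The plan-act-observe cycle behind these systems can be sketched in a few lines. Everything here is a hypothetical stand-in: the `calculator` tool and the rule-based goal check take the place of what would normally be an LLM-driven planner calling real APIs.

```python
# Minimal (hypothetical) agentic loop: pick a tool, act, observe the
# result, and repeat until the goal condition is met.

def calculator(expr):
    # Toy tool: evaluate an arithmetic expression with builtins disabled.
    return eval(expr, {"__builtins__": {}})

TOOLS = {"calculator": calculator}

def agent(goal_value, start_expr, max_steps=10):
    """Keep doubling the observed value via the calculator tool
    until it reaches the goal (a stand-in for real planning)."""
    observation = TOOLS["calculator"](start_expr)
    history = [observation]                     # record outcomes (learning)
    for _ in range(max_steps):
        if observation >= goal_value:           # reason: is the goal met?
            return observation, history
        action = f"{observation} * 2"           # plan the next tool call
        observation = TOOLS["calculator"](action)  # act via the tool API
        history.append(observation)             # observe and remember
    return observation, history

result, history = agent(goal_value=100, start_expr="3 + 4")
```

The defining shift is structural: the loop decides *when* and *how* to invoke tools, rather than producing a single generated answer.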

Python-Based WhatsApp Worm Spreads Eternidade Stealer Across Brazilian Devices

Cybersecurity researchers have disclosed details of a new campaign that leverages a combination of social engineering and WhatsApp hijacking to distribute a Delphi-based banking trojan named Eternidade Stealer as part of attacks targeting users in Brazil.

“It uses Internet Message Access Protocol (IMAP) to dynamically retrieve command-and-control (C2) addresses, allowing the threat actor to update its C2 server,” Trustwave SpiderLabs researchers Nathaniel Morales, John Basmayor, and Nikita Kazymirskyi said in a technical breakdown of the campaign shared with The Hacker News.

It is distributed through a WhatsApp worm campaign, with the threat actor now deploying a Python script, a shift from earlier PowerShell-based scripts, to hijack WhatsApp sessions and spread malicious attachments.

Quantum computers could be powerful enough to decrypt Bitcoin sometime after 2030, CEO of Nvidia’s quantum partner says

“You should have a few good years ahead of you but I wouldn’t hold my Bitcoin,” Peronnin said, laughing. “They need to fork [move to a stronger blockchain] by 2030, basically. Quantum computers will be ready to be a threat a bit later than that,” he said.

Quantum computing doesn’t just threaten Bitcoin, of course, but all banking encryption. In all these cases, companies are likely developing quantum-resistant tools to upgrade their existing security systems.

Defensive security algorithms are improving, Peronnin said, so it’s not certain when the blockchain will become vulnerable to a quantum attack. But “the threshold for such an event is coming closer to us year by year,” he said.

First full simulation of 50 qubit universal quantum computer achieved

A research team at the Jülich Supercomputing Center, together with experts from NVIDIA, has set a new record in quantum simulation: for the first time, a universal quantum computer with 50 qubits has been fully simulated—a feat achieved on Europe’s first exascale supercomputer, JUPITER, inaugurated at Forschungszentrum Jülich in September.

The result surpasses the previous world record of 48 qubits, established by Jülich researchers in 2019 on Japan’s K computer. It showcases the immense computational power of JUPITER and opens new horizons for developing and testing quantum algorithms. The research is published on the arXiv preprint server.

Quantum computer simulations are vital for developing future quantum systems. They allow researchers to verify experimental results and test new algorithms long before powerful quantum machines become reality. Among these are the Variational Quantum Eigensolver (VQE), which can model molecules and materials, and the Quantum Approximate Optimization Algorithm (QAOA), used for optimization problems in logistics, finance, and artificial intelligence.
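What "fully simulating" a universal quantum computer means can be shown with a toy statevector simulator, sketched here in pure Python for two qubits. This is an illustration of the general technique, not the Jülich team's code: a state of n qubits is a vector of 2**n amplitudes, and every gate touches all of them.

```python
import math

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to qubit `target`, pairing amplitudes whose
    indices differ only in that bit."""
    new = state[:]
    for i in range(len(state)):
        if not (i >> target) & 1:
            j = i | (1 << target)
            a, b = state[i], state[j]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[j] = gate[1][0] * a + gate[1][1] * b
    return new

def apply_cnot(state, control, target):
    """Flip qubit `target` on every basis state where `control` is 1."""
    new = state[:]
    for i in range(len(state)):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate

# Prepare a Bell state: H on qubit 0, then CNOT(0 -> 1).
state = [1, 0, 0, 0]                          # |00>
state = apply_single_qubit_gate(state, H, 0, 2)
state = apply_cnot(state, control=0, target=1)
# Amplitudes 1/sqrt(2) on |00> and |11>, zero elsewhere.
```

At 50 qubits the same loops would run over 2**50 amplitudes per gate, which is why the full-scale simulation needs an exascale machine.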

MAKER: Large Language Models (LLMs) have achieved remarkable breakthroughs in reasoning, insight generation, and tool use

They can plan multi-step actions, generate creative solutions, and assist in complex decision-making. Yet these strengths fade when tasks stretch over long, dependent sequences. Even small per-step error rates compound quickly, turning an impressive short-term performance into complete long-term failure.
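The compounding effect is simple arithmetic: if each step succeeds independently with probability p, a chain of n dependent steps succeeds with probability p**n. The per-step rates below are illustrative, not figures from the paper.

```python
# Compounding of per-step error over long dependent sequences.
def task_success(p_step, n_steps):
    """Probability that all n independent steps succeed."""
    return p_step ** n_steps

# Even 99.9% per-step accuracy collapses at scale:
short = task_success(0.999, 100)       # ~0.905, still looks impressive
medium = task_success(0.999, 10_000)   # ~4.5e-5, near-certain failure
# For a million-step process the probability underflows toward zero:
# without error correction, failure is effectively guaranteed.
long = task_success(0.999, 1_000_000)
```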

That fragility poses a fundamental obstacle for real-world systems. Most large-scale human and organizational processes – from manufacturing and logistics to finance, healthcare, and governance – depend on millions of actions executed precisely and in order. A single mistake can cascade through an entire pipeline. For AI to become a reliable participant in such processes, it must do more than reason well. It must maintain flawless execution over time, sustaining accuracy across millions of interdependent steps.

Apple’s recent study, The Illusion of Thinking, captured this challenge vividly. Researchers tested advanced reasoning models such as Claude 3.7 Thinking and DeepSeek-R1 on structured puzzles like Towers of Hanoi, where each additional disk doubles the number of required moves. The results revealed a sharp reliability cliff: models performed perfectly on simple problems but failed completely once the task crossed about eight disks, even when token budgets were sufficient. In short, more “thinking” led to less consistent reasoning.
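The exponential growth behind that cliff is easy to see in the puzzle itself: solving Towers of Hanoi with n disks takes 2**n - 1 moves, so each added disk roughly doubles the length of the move sequence a model must execute flawlessly. A standard recursive solver:

```python
# Classic recursive Towers of Hanoi: moving n disks from src to dst
# requires exactly 2**n - 1 moves.
def hanoi(n, src="A", aux="B", dst="C", moves=None):
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, src, dst, aux, moves)   # park the top n-1 disks on aux
    moves.append((src, dst))             # move the largest disk
    hanoi(n - 1, aux, src, dst, moves)   # stack the n-1 disks back on top
    return moves

# At the ~8-disk failure point the models faced 255 dependent moves.
assert len(hanoi(8)) == 2**8 - 1
```

One wrong move anywhere in that sequence invalidates the solution, which is exactly the long-horizon execution problem described above.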

Drift logs destroy intertidal ecosystems, study shows

Logs are a familiar sight on beaches along the coasts of Vancouver Island and Haida Gwaii and are often viewed positively: they can stabilize banks, be used for firewood, or serve as benches for beach-goers. However, new research from the University of Victoria (UVic) shows that these logs are not as innocuous as they seem.

According to a study published by UVic biologist Tom Reimchen and two of his students, free-floating logs that wash ashore, referred to as drift logs, are causing widespread destruction of rocky intertidal communities along the coast of Western Canada. The paper is published in the journal Marine Ecology.

“In this study, we looked at both the ecological impact of drift logs, and at log abundance and movement over time,” says Reimchen. “Both aspects of the study had worrisome results.”

Degradation and Failure Phenomena at the Dentin Bonding Interface

Damage at the bonding interface is a significant factor in the premature failure of dental bonded restorations. The imperfectly bonded dentin-adhesive interface is susceptible to hydrolytic degradation and to bacterial and enzymatic attack, severely jeopardizing restorations’ longevity. The development of caries around existing restorations, known as “recurrent or secondary caries,” is a significant health problem. Replacing restorations is the most prevalent treatment in dental clinics, leading to the so-called “tooth death spiral”: each time a restoration is replaced, more tooth tissue is removed, increasing the size of the restoration until the tooth is eventually lost. This process incurs high financial costs and harms patients’ quality of life.
