
An interesting paper where Schuette et al. develop a generative diffusion-based AI model for predicting the 3D structure of chromatin. Their model takes DNA sequence and chromatin accessibility data as input and outputs a statistical distribution of predicted 3D chromatin structures. Remarkably, their model generalizes across cell types, making it broadly useful! #computationalbiology #ai #generativeai


Computational approaches for predicting chromatin conformations de novo using only sequencing data remain scarce. Compared to existing polymer simulation–based prediction approaches, ChromoGen maintains unique advantages. The generative nature of ChromoGen enables efficient production of statistically independent samples, thus avoiding the inefficient navigation of state space that polymer simulations require to produce a diverse set of conformations. Moreover, ChromoGen’s transformer-based front end provides additional advantages, extracting features from sequencing data and placing the information in low-dimensional embeddings that the diffusion model handles efficiently. This powerful design markedly reduces the computational cost of each diffusion step, providing a practical means to achieve cell type–specific de novo predictions with the full benefit of DNA sequence and chromatin accessibility data. In contrast, incorporating DNA sequence information into polymer models has long been a challenging task that is often indirectly addressed by incorporating various histone marks.
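
To make the architecture described above concrete, here is a minimal sketch of the general pattern: a transformer front end compresses sequencing-derived features into low-dimensional embeddings that condition a denoising diffusion model. All module names, dimensions, and the toy noising schedule are assumptions chosen for illustration, not the authors' actual ChromoGen implementation.

# Illustrative sketch only: names, sizes, and the noising schedule are assumptions,
# not code from the paper. It shows the pattern the paragraph describes: a transformer
# front end embeds sequencing-data features, and a diffusion-style denoiser is
# conditioned on those embeddings to generate 3D conformations.
import torch
import torch.nn as nn

class SequenceFrontEnd(nn.Module):
    """Transformer encoder mapping per-bin sequencing features to embeddings."""
    def __init__(self, n_features=5, d_model=64):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):                    # x: (batch, n_bins, n_features)
        return self.encoder(self.proj(x))    # (batch, n_bins, d_model)

class ConditionalDenoiser(nn.Module):
    """Predicts the noise added to a 3D conformation, given the embedding and time step."""
    def __init__(self, d_model=64, n_bins=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bins * 3 + n_bins * d_model + 1, 256),
            nn.SiLU(),
            nn.Linear(256, n_bins * 3),
        )

    def forward(self, noisy_coords, embedding, t):
        b = noisy_coords.shape[0]
        inp = torch.cat([noisy_coords.reshape(b, -1), embedding.reshape(b, -1), t], dim=-1)
        return self.net(inp).reshape(b, -1, 3)

# One training-style forward pass on random tensors (shapes only; no real genomics data).
front_end, denoiser = SequenceFrontEnd(), ConditionalDenoiser()
seq_features = torch.randn(2, 64, 5)   # e.g. DNA-sequence + accessibility features per bin
coords = torch.randn(2, 64, 3)         # a 64-bin conformation (cf. 1.28 Mb at 20-kb resolution)
t = torch.rand(2, 1)                   # diffusion time step
noise = torch.randn_like(coords)
noisy = torch.sqrt(1 - t).unsqueeze(-1) * coords + torch.sqrt(t).unsqueeze(-1) * noise  # toy schedule
pred_noise = denoiser(noisy, front_end(seq_features), t)
loss = ((pred_noise - noise) ** 2).mean()   # standard denoising objective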

In its current form, ChromoGen can be immediately applied to any cell type with available DNase-seq data, enabling a vast number of studies into the heterogeneity of genome organization both within and between cell types to proceed. However, several improvements could enhance its utility. Notably, the current model exclusively predicts chromatin conformations in 1.28-Mb regions at 20-kb resolution, the latter restriction primarily stemming from our decision to maximize resolution within the constraints imposed by the available Dip-C data. However, higher-resolution single-cell datasets are becoming available, such as those at 5-kb resolution (50), and we anticipate that ChromoGen will require no modifications to perform well after training on these improved datasets. Similarly, we anticipate that ChromoGen can be directly applied to longer genomic regions if using a lower resolution, e.g.
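
For a sense of scale, the arithmetic behind these choices is simple: the number of bins (monomers) in a predicted conformation is the region length divided by the resolution. The 40-kb figure below is purely an illustrative assumption, not a setting from the paper.

# Back-of-the-envelope arithmetic implied by the paragraph above (not code from the paper).
region_bp = 1_280_000                 # 1.28-Mb window used by the current model
bins_at_20kb = region_bp // 20_000    # current resolution -> 64 bins per conformation
bins_at_5kb = region_bp // 5_000      # newer single-cell data -> 256 bins for the same window
# Conversely, keeping 64 bins but using a coarser (assumed) 40-kb resolution would
# cover a 2.56-Mb region: the "longer genomic regions at lower resolution" idea.
region_at_40kb = 64 * 40_000          # 2_560_000 bp = 2.56 Mb
print(bins_at_20kb, bins_at_5kb, region_at_40kb)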

DARPA’s Intensity-Squeezed Photonic Integration for Revolutionary Detectors (INSPIRED) seeks to break the quantum noise limit

Optical detectors are essential for converting light into measurable signals, enabling a wide range of critical technologies, such as fiber-optic communication, biological imaging, and motion sensors for navigation. However, their sensitivity is fundamentally limited by quantum noise, which prevents the detection of extremely faint signals in the most precision-demanding fields.

As the world marks the 100-year anniversary of the initial development of quantum mechanics with the International Year of Quantum Science and Technology, DARPA’s Intensity-Squeezed Photonic Integration for Revolutionary Detectors (INSPIRED) program is working to break through the quantum noise limit. By harnessing “squeezed light,” INSPIRED seeks to develop compact, cost-effective optical detectors that can operate at unprecedented sensitivities – allowing signals previously buried in quantum noise to be clearly detected.
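
As a rough illustration of the principle, and not a description of any INSPIRED design: a shot-noise-limited (coherent) beam has photon-number fluctuations that scale as the square root of the photon count, so its signal-to-noise ratio is sqrt(N). Intensity-squeezed light has sub-Poissonian statistics, i.e. a Fano factor F below 1, which lowers that noise floor.

# General shot-noise illustration only; the photon count and squeezing level are assumptions.
import math

def snr(mean_photons, fano_factor=1.0):
    """SNR = mean / std, with photon-number variance = F * mean (F = 1 is the shot-noise limit)."""
    return mean_photons / math.sqrt(fano_factor * mean_photons)

N = 10_000                        # detected photons in one measurement window
print(snr(N))                     # shot-noise limit: sqrt(10_000) = 100
print(snr(N, fano_factor=0.25))   # ~6 dB of intensity squeezing: SNR doubles to 200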

For decades, end users and systems designers have valued radar technology for its reliability. Radar is a dependable sensing technique with broad application potential, especially in adverse weather conditions in which sensors based on other modalities are apt to fail.

As a result of this robustness and widespread applicability, radar today is established as a standard sensing system in several high-growth technology sectors. The automotive industry, for example, has been a key driver of radar sensor miniaturization and overall performance improvements. The commercialization of radar for passenger vehicles predates the turn of the century, and radar sensors are also now commonly deployed in advanced driver-assistance systems, including for adaptive cruise control, autonomous emergency braking, and blind-spot assist.

A European company will give its cargo-return tech its first in-space test this spring, if all goes according to plan.

Germany’s Atmos Space Cargo announced today (Feb. 5) that its first Phoenix reentry capsule will fly on SpaceX’s Bandwagon 3 rideshare mission. A Falcon 9 rocket will launch Bandwagon 3 no earlier than April, according to Atmos representatives.

This is superlongevity! ♾️

“One shark, measuring five meters, was found to be at least 272 years old, with an upper age estimate of more than 500 years (392 +/- 120 years). Another specimen was at least 260 years old, potentially exceeding 400 years. ‘We definitely expected the sharks to be old, but we didn’t expect that it would be the longest-living vertebrate animal,’ Nielsen said.”


The Greenland shark holds the title of longest-lived vertebrate on Earth, with some individuals potentially reaching 500 years of age. This elusive deep-sea predator, found in the frigid waters of the North Atlantic and Arctic Oceans, has fascinated scientists due to its remarkable lifespan. Its slow growth rate and mysterious biology have made it a subject of ongoing research, shedding light on how some species defy the limits of aging.

A major breakthrough in understanding the longevity of Greenland sharks came from a research team led by Julius Nielsen, a marine biologist at the University of Copenhagen. Nielsen and his colleagues conducted a study that revealed a Greenland shark estimated to be at least 272 years old, with some models suggesting an upper age limit of nearly 500 years.
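
For reference, the headline numbers are just the central estimate plus or minus its stated uncertainty:

# Reading the study's estimate of 392 +/- 120 years (figures from the quote above).
central, uncertainty = 392, 120
print(central - uncertainty)   # 272 -> the "at least 272 years old" lower bound
print(central + uncertainty)   # 512 -> the "upper age estimate of more than 500 years"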

AI's transformational impact is well under way. But as AI technologies develop, so too does their power consumption. Further advancements will require AI chips that can perform AI calculations with high energy efficiency. This is where spintronic devices enter the equation. Their integrated memory and computing capabilities mimic the human brain, and they can serve as building blocks for lower-power AI chips.

Now, researchers at Tohoku University, National Institute for Materials Science, and Japan Atomic Energy Agency have developed a new spintronic device that allows for the electrical mutual control of non-collinear antiferromagnets and ferromagnets. This means the device can switch magnetic states efficiently, storing and processing information with less energy—just like a brain-like AI chip.

The breakthrough could revolutionize AI hardware by delivering high efficiency at low energy cost. The team published their results in Nature Communications on February 5, 2025.