
Chan Zuckerberg Initiative’s rBio uses virtual cells to train AI, bypassing lab work

The Chan Zuckerberg Initiative announced Thursday the launch of rBio, the first artificial intelligence model trained to reason about cellular biology using virtual simulations rather than expensive laboratory experiments, a breakthrough that could dramatically accelerate biomedical research and drug discovery.

The reasoning model, detailed in a research paper published on bioRxiv, demonstrates a novel approach called “soft verification” that uses predictions from virtual cell models as training signals instead of relying solely on experimental data. This paradigm shift could help researchers test biological hypotheses computationally before committing time and resources to costly laboratory work.
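As a rough analogy (not rBio's actual training pipeline), soft verification can be pictured as rewarding a reasoning model's answer with the probability a virtual cell model assigns to it, rather than with a binary lab-verified label. The gene names and probabilities below are invented for illustration:

```python
# Toy sketch of "soft verification": score an answer against a virtual
# cell model's predicted probability instead of a wet-lab measurement.
# All gene names and probabilities here are hypothetical.

def virtual_cell_model(gene_knockout: str) -> float:
    """Stand-in for a trained virtual cell model: returns the predicted
    probability that knocking out the gene produces the queried effect."""
    predictions = {"GENE_A": 0.9, "GENE_B": 0.2}
    return predictions.get(gene_knockout, 0.5)  # 0.5 = maximally uncertain

def soft_reward(answer_yes: bool, gene: str) -> float:
    """Training signal for a reasoning model's yes/no answer.
    Hard verification would require an experimental result; here the
    reward is the simulator's probability mass on the chosen answer."""
    p_yes = virtual_cell_model(gene)
    return p_yes if answer_yes else 1.0 - p_yes

# A model answering "yes" about GENE_A earns a high reward; answering
# "yes" about GENE_B earns a low one, steering training without lab data.
print(soft_reward(True, "GENE_A"))
print(soft_reward(True, "GENE_B"))
```

The appeal of this scheme is that the reward is graded rather than binary, so hypotheses the simulator is uncertain about contribute weaker, appropriately hedged training signals.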

Grok answers my questions about what Elon meant when he said Tesla FSD v14 will seem sentient

Questions to inspire discussion.

Advanced Navigation and Obstacle Recognition.

🛣️ Q: How will FSD v14 handle unique driveway features? A: The improved neural net and higher resolution video processing will help FSD v14 better recognize and navigate features like speed bumps and humps, adjusting speed and steering smoothly based on their shape and height.

🚧 Q: What improvements are expected in distinguishing real obstacles? A: Enhanced object detection, driven by improved algorithms and higher-resolution video inputs, will make FSD v14 better at distinguishing real obstacles from false positives like tire marks, avoiding abrupt braking and overreaction.

Edge Case Handling and Smooth Operation.

🧩 Q: How will FSD v14 handle complex edge cases? A: The massive jump in parameter count and better video compression will help the AI better understand edge cases, allowing it to reason that non-threatening objects like a stationary hatch in the road aren’t obstacles, maintaining smooth cruising.

How the brain constructs emotional experiences

Using a powerful combination of AI-driven modelling, advanced brain imaging, and close-to-real-life experimental paradigms, the team was able to uncover a brain signature that precisely measures emotional intensity (arousal) across diverse situations, from seeing a loved one to watching a horror movie. Notably, the team was able to disentangle the conscious emotional experience from automatic physiological responses such as sweating and a racing heart.

Beyond the theoretical implications, this discovery opens new avenues for:

Low-power ‘microwave brain’ on a chip computes on both ultrafast data and wireless signals

Cornell University researchers have developed a low-power microchip they call a “microwave brain,” the first processor to compute on both ultrafast data signals and wireless communication signals by harnessing the physics of microwaves.

Detailed in the journal Nature Electronics, the processor is the first true microwave neural network and is fully integrated on a silicon microchip. It performs real-time frequency domain computation for tasks like radio signal decoding, radar target tracking and digital data processing, all while consuming less than 200 milliwatts of power.

“Because it’s able to distort in a programmable way across a wide band of frequencies instantaneously, it can be repurposed for several computing tasks,” said lead author Bal Govind, a doctoral student who conducted the research with Maxwell Anderson, also a doctoral student. “It bypasses a large number of signal processing steps that digital computers normally have to do.”

How to Create a Data Analyst Agent

This video shows you how to create a data analyst AI agent using Llama Nemotron Super 1.5 49B, a NAS-optimized, open-weight reasoning model. It tops the Artificial Analysis Intelligence Index in the ~70B range—while fitting on a single NVIDIA H100 or NVIDIA H200 GPU for higher throughput and lower cost.

00:00 — Introduction.
00:39 — What it is.
01:24 — Accuracy.
01:53 — Performance.
02:20 — Open & transparent.
02:59 — Data analysis agent demo.
04:46 — Recap.

Open weights + open dataset on Hugging Face: https://huggingface.co/nvidia/Llama-3…

Try it from your browser and create an API key: https://build.nvidia.com/nvidia/llama…

Demo repository: https://github.com/NVIDIA/GenerativeA…

NVIDIA Developer Blog: https://developer.nvidia.com/blog/bui…

Learn about NVIDIA Nemotron models: https://www.nvidia.com/en-us/ai-data-…
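The agent pattern the video walks through (load a table, let the model write analysis code, execute it, return the result) can be sketched roughly as below. The `ask_llm` stub stands in for a real chat-completion call to the hosted Nemotron endpoint, and the dataset and question are invented for illustration; this is not NVIDIA's demo code:

```python
# Minimal sketch of a data-analyst-agent loop: an LLM turns a natural-
# language question into pandas code, which is then executed against the
# loaded DataFrame. The model call is a hypothetical placeholder.
import pandas as pd

df = pd.DataFrame({
    "region": ["EU", "EU", "US", "US"],
    "sales":  [120, 80, 200, 150],
})

def ask_llm(question: str, schema: str) -> str:
    """Placeholder for a chat-completion request to the reasoning model;
    here it returns canned pandas code for the sample question."""
    return "result = df.groupby('region')['sales'].sum().to_dict()"

def run_agent(question: str) -> dict:
    # Give the model the table schema so the generated code matches it.
    code = ask_llm(question, schema=str(df.dtypes))
    scope = {"df": df}
    exec(code, scope)  # a production agent would sandbox this step
    return scope["result"]

print(run_agent("Total sales per region?"))  # {'EU': 200, 'US': 350}
```

In a real deployment the generated code should run in an isolated sandbox, since executing model output directly with `exec` is unsafe.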

AI system cuts GPS errors by a factor of almost 40 in urban settings

End of the road for GPS? Image-based AI tool cuts navigation errors by nearly 40 times.

Global reliance on GPS has grown to the point where outages or signal interference can have serious consequences. From logistics and delivery fleets to aviation and defense, industries depend on precise navigation.

Yet GPS signals are vulnerable: tall buildings, tunnels, jamming devices, and even atmospheric conditions can cause disruptions. By providing a GPS-free, image-based alternative, the researchers' PEnG system could make navigation more resilient and reliable.

The Surrey researchers argue that their work is as much about everyday convenience as it is about safety.

Optimizing how cells self-organize

One of the most fundamental processes in all of biology is the spontaneous organization of cells into clusters that divide and eventually turn into shapes – be they organs, wings or limbs.

Scientists have long explored this enormously complex process to make artificial organs or understand cancer growth – but precisely engineering single cells to achieve a desired collective outcome is often a trial-and-error process.

Harvard applied physicists consider the control of cellular organization and morphogenesis to be an optimization problem that can be solved with powerful new machine learning tools. In new research published in Nature Computational Science, researchers in the John A. Paulson School of Engineering and Applied Sciences (SEAS) have created a computational framework that can extract the rules that cells need to follow as they grow, in order for a collective function to emerge from the whole.
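As a loose analogy for this optimization framing (not the paper's actual framework), one can imagine tuning a single per-cell growth "rule" so that a desired collective outcome, here a target colony size, emerges from the whole. The growth model and numbers below are invented:

```python
# Toy version of morphogenesis-as-optimization: search for a per-cell
# division rate (the "rule") that makes the colony reach a target size.
# Entirely hypothetical; the SEAS framework learns far richer rules.

def grow(p_divide: float, steps: int = 10) -> float:
    """Expected colony size after `steps` rounds, given a per-step
    expected division rate p_divide for every cell."""
    cells = 1.0
    for _ in range(steps):
        cells *= (1.0 + p_divide)
    return cells

def fit_rule(target: float, lo: float = 0.0, hi: float = 1.0) -> float:
    """Bisection on the division rate so the grown colony matches the
    target size: the inverse problem of 'which rule yields this shape?'"""
    for _ in range(60):
        mid = (lo + hi) / 2
        if grow(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

p = fit_rule(target=100.0)  # rule that yields ~100 cells in 10 steps
```

The real problem is vastly harder (spatial interactions, differentiation, noise), which is why the authors turn to machine learning rather than one-dimensional search, but the inverse-design structure is the same.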
