Low-power ‘microwave brain’ on a chip computes on both ultrafast data and wireless signals

Cornell University researchers have developed a low-power microchip they call a “microwave brain,” the first processor to compute on both ultrafast data signals and wireless communication signals by harnessing the physics of microwaves.

Detailed in the journal Nature Electronics, the processor is the first true microwave neural network and is fully integrated on a silicon microchip. It performs real-time frequency domain computation for tasks like radio signal decoding, radar target tracking and digital data processing, all while consuming less than 200 milliwatts of power.

“Because it’s able to distort in a programmable way across a wide band of frequencies instantaneously, it can be repurposed for several computing tasks,” said lead author Bal Govind, a doctoral student who conducted the research with Maxwell Anderson, also a doctoral student. “It bypasses a large number of signal processing steps that digital computers normally have to do.”
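The chip performs its computation directly in the frequency domain. As a loose software analogy only (not the Cornell hardware or its programming model), "programmable distortion across a band of frequencies" can be pictured as applying tunable per-bin weights to a signal's spectrum:

```python
import numpy as np

# Loose software analogy to frequency-domain computation: mimic a
# "programmable distortion across a band of frequencies" by weighting
# FFT bins of a sampled signal. Illustrative only; the actual chip does
# this with analog microwave physics, not an FFT.

def frequency_domain_filter(signal, weights):
    """Apply per-bin weights to the real FFT of `signal`, return to time domain."""
    spectrum = np.fft.rfft(signal)
    return np.fft.irfft(spectrum * weights, n=len(signal))

# Example: a 64-sample signal mixing a 5-cycle and a 20-cycle tone.
n = 64
t = np.arange(n)
signal = np.sin(2 * np.pi * 5 * t / n) + np.sin(2 * np.pi * 20 * t / n)

# "Program" the filter to pass only bins below 10 cycles.
weights = np.zeros(n // 2 + 1)
weights[:10] = 1.0

filtered = frequency_domain_filter(signal, weights)
# The 20-cycle component is suppressed; the 5-cycle tone survives.
```

Reprogramming the weights repurposes the same operation for a different task, which is the flavor of flexibility the quote describes.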

What happens in the brain when it learns something new

Memories of significant learning experiences—like the first time a driver gets a speeding ticket—are sharp, compared to the recollection of everyday events—like what someone ate for dinner two weeks ago. That’s because the human brain is primed to learn from helpful associations.

Carnegie Mellon University researchers have identified specific neural connections that are especially sensitive to this process of learning about causality. The discovery, while seemingly intuitive, could have widespread implications for understanding how humans learn and inform new ways to address learning challenges.

“If you look out the window and see dark clouds, you know that it’s going to rain and that you’ll need an umbrella,” said Eunsol Park, a Ph.D. student in the Department of Biological Sciences and the Center for the Neural Basis of Cognition, a joint program between Carnegie Mellon and the University of Pittsburgh.

Advanced computer modeling predicts molecular-qubit performance

A qubit is the delicate, information-processing heart of a quantum device. In the coming decades, advances in quantum information are expected to give us computers with new, powerful capabilities and detectors that can pick up atomic-scale signals in medicine, navigation and more. The realization of such technologies depends on having reliable, long-lasting qubits.

Now, researchers have taken an important step in understanding the rules necessary for the design of useful, efficient qubits.

Using advanced computer modeling, the researchers came up with a way to accurately predict and fine-tune key magnetic properties of a type of device called a molecular qubit. They also figured out which factors in the material that the qubit sits in affect this tuning the most and calculated how long the qubits can live.
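For context on "how long the qubits can live": a qubit's coherence time is commonly extracted by fitting an exponential decay S(t) = exp(-t/T2) to a measured signal. The sketch below is a generic illustration of that extraction on synthetic, noiseless data, not the researchers' modeling method; the numbers are arbitrary.

```python
import numpy as np

# Generic illustration (not the article's modeling method): extract a
# coherence time T2 by fitting an exponential decay S(t) = exp(-t / T2)
# to signal amplitude. Since log S = -t / T2, a degree-1 polynomial fit
# of log(signal) against t gives slope = -1 / T2.

t = np.linspace(0, 50e-6, 40)   # delay times, 0-50 microseconds (arbitrary)
true_t2 = 12e-6                 # assumed "true" coherence time for the demo
signal = np.exp(-t / true_t2)   # noiseless synthetic decay curve

slope, _ = np.polyfit(t, np.log(signal), 1)
t2_estimate = -1.0 / slope
```

Real measurements add noise and often non-exponential components, so in practice a nonlinear least-squares fit with an offset term is used instead of this log-linear shortcut.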

How to Create a Data Analyst Agent

This video shows you how to create a data analyst AI agent using Llama Nemotron Super 1.5 49B, a neural-architecture-search (NAS)-optimized, open-weight reasoning model. It tops the Artificial Analysis Intelligence Index in the ~70B range—while fitting on a single NVIDIA H100 or NVIDIA H200 GPU for higher throughput and lower cost.

00:00 — Introduction.
00:39 — What it is.
01:24 — Accuracy.
01:53 — Performance.
02:20 — Open & transparent.
02:59 — Data analysis agent demo.
04:46 — Recap.

Open weights + open dataset on Hugging Face: https://huggingface.co/nvidia/Llama-3…
Try from your browser and create an API key: https://build.nvidia.com/nvidia/llama…
Demo repository: https://github.com/NVIDIA/GenerativeA…
NVIDIA Developer Blog: https://developer.nvidia.com/blog/bui…
Learn about NVIDIA Nemotron models: https://www.nvidia.com/en-us/ai-data–…

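The build.nvidia.com catalog exposes models through an OpenAI-compatible chat-completions endpoint. Below is a minimal, standard-library-only sketch of how a data-analyst agent might prompt the model for pandas code; the model ID is a placeholder (the catalog URL above is truncated, so copy the exact ID from that page), and `NVIDIA_API_KEY` is the key created in the browser flow linked above. The request is built but not sent.

```python
import json
import os
import urllib.request

# Minimal sketch of calling a Nemotron model via NVIDIA's OpenAI-compatible
# chat endpoint. MODEL_ID is a placeholder -- the build.nvidia.com link in
# this post is truncated, so substitute the exact ID shown on that page.
API_URL = "https://integrate.api.nvidia.com/v1/chat/completions"
MODEL_ID = "nvidia/llama-nemotron-super"  # placeholder; verify on build.nvidia.com

def build_analysis_prompt(question, columns):
    """Prompt pattern for a data-analyst agent: ask for pandas code over `df`."""
    return (
        "You are a data analyst. Write Python (pandas) code that answers:\n"
        f"{question}\n"
        f"The dataframe `df` has columns: {', '.join(columns)}.\n"
        "Return only code."
    )

def build_request(prompt):
    """Build the OpenAI-compatible request (constructed here, not sent)."""
    body = json.dumps({
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }).encode()
    headers = {
        "Authorization": f"Bearer {os.environ.get('NVIDIA_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(API_URL, data=body, headers=headers)

req = build_request(build_analysis_prompt(
    "Which month had the highest revenue?", ["month", "revenue"]))
# urllib.request.urlopen(req) would send it; the JSON response follows the
# OpenAI chat-completions schema (choices[0].message.content).
```

The agent in the demo repository goes further, executing the returned pandas/matplotlib code against a dataframe; this sketch covers only the prompt-and-request step.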

Folding spacecraft design could be enhanced with origami patterns

Scientists are exploring a new class of origami structures that could help design and build deployable shapes for use in space. These structures are expected to be even more compact and reliable than existing designs.

Called bloom patterns, the new class of origami structures developed at Brigham Young University fold up flat and unfold like flower petals. Researchers expect such designs could also be used in applications like telescopes and solar arrays.

These structures are well suited to spacecraft: origami-based designs can fold up compactly for launch and then unfold to their full size once deployed in space.
