
The electrically readable complex dynamics of robust and scalable magnetic tunnel junctions (MTJs) offer promising opportunities for advancing neuromorphic computing. In this work, we present an MTJ design with a free layer and two polarizers capable of computing the sigmoidal activation function and its gradient at the device level. This design enables both feedforward and backpropagation computations within a single device, extending neuromorphic computing frameworks previously explored in the literature by introducing the ability to perform backpropagation directly in hardware. Our algorithm implementation reveals three key findings: (i) the small discrepancies between the MTJ-generated curves and the exact software-generated curves have a negligible impact on the performance of the backpropagation algorithm, (ii) the device implementation is highly robust to inter-device variation and noise, and (iii) the proposed method effectively supports transfer learning and knowledge distillation. To demonstrate this, we evaluated the performance of an edge computing network using weights from a software-trained model implemented with our MTJ design. The results show a minimal loss of accuracy of only 0.4% for the Fashion MNIST dataset and 1.7% for the CIFAR-100 dataset compared to the original software implementation. These results highlight the potential of our MTJ design for compact, hardware-based neural networks in edge computing applications, particularly for transfer learning.
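The two device-level functions are easy to state in software. The sketch below is not a model of the MTJ physics; it is a minimal NumPy illustration of the two quantities the device is reported to produce, the sigmoid for the feedforward pass and its derivative for the backward pass, wired into a single layer's weight update (layer sizes and data are hypothetical).

```python
import numpy as np

def sigmoid(x):
    """Feedforward activation: sigma(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    """Gradient used during backpropagation: sigma'(x) = sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# One layer's forward pass and the corresponding backward (delta) computation.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))   # weights (hypothetical layer sizes)
x = rng.normal(size=3)                   # input activations
z = W @ x                                # pre-activation
a = sigmoid(z)                           # forward output (first device curve)
upstream = rng.normal(size=4)            # gradient arriving from the next layer
delta = upstream * sigmoid_grad(z)       # backward signal (second device curve)
grad_W = np.outer(delta, x)              # weight gradient for the update
```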

Quantum systems hold the promise of tackling some complex problems faster and more efficiently than classical computers. Despite their potential, so far only a limited number of studies have conclusively demonstrated that quantum computers can outperform classical computers on specific tasks. Most of these studies focused on tasks that involve advanced computations, simulations or optimization, which can be difficult for non-experts to grasp.

Researchers at the University of Oxford and the University of Sevilla recently demonstrated a quantum advantage over a classical scenario in a cooperation task called the odd-cycle game. Their paper, published in Physical Review Letters, shows that a team sharing quantum entanglement can win this game more often than a team without it.

“There is a lot of talk about quantum advantage and how quantum computers will revolutionize entire industries, but if you look closely, in many cases, there is no mathematical proof that classical methods definitely cannot find solutions as efficiently as quantum algorithms,” Peter Drmota, first author of the paper, told Phys.org.

In today’s AI news, believe it or not, AI is alive and well, and it’s clearly going to change a lot of things forever. My personal epiphany happened just the other day, while I was “vibe coding” a personal software project. Those of us who have never written a line of code in our lives but create software programs and applications using AI tools like Bolt or Lovable are called vibe coders.

Then, Anthropic’s CEO Dario Amodei is worried that spies, likely from China, are getting their hands on costly “algorithmic secrets” from the U.S.’s top AI companies, and he wants the U.S. government to step in. Speaking at a Council on Foreign Relations event on Monday, Amodei said that China is known for its “large-scale industrial espionage” and that AI companies like Anthropic are almost certainly being targeted.

Meanwhile, despite all the hype, very few people have had a chance to use Manus. Currently, under 1% of the users on the wait list have received an invite code. It’s unclear how many people are on this list, but for a sense of how much interest there is, Manus’s Discord channel has more than 186,000 members. MIT Technology Review was able to obtain access to Manus, and they gave it a test-drive.

In videos, join Palantir CEO Alexander Karp with New York Times DealBook creator Andrew Ross Sorkin on the promises and peril of Silicon Valley, tech’s changing relationship with Washington, and what it means for our future — and his new book, The Technological Republic. Named “Best CEO of 2024” by The Economist, Alexander Karp is a vital player in Silicon Valley as the CEO of Palantir.

Then, Piers Linney, Co-founder of Implement AI, discusses how artificial intelligence and automation can be maximized across businesses on CNBC International Live. Linney says AI poses a threat to the highest income knowledge workers around the world.

Meanwhile, Nate B. Jones is back with some commentary on how OpenAI has launched a new API aimed at helping developers build AI agents, but its strategic impact remains unclear. While enterprises with strong LLM expertise are already using tools like LangChain effectively, smaller teams struggle with agent complexity. Nate says, despite being a high-quality API, it lacks a distinct differentiator beyond OpenAI’s own ecosystem.

We close out with Celestial AI CEO Dave Lazovsky, who outlines how their “Photonic Fabric” technology helps to scale AI as the company raises $250 million in its latest funding round, valuing the company at $2.5 billion. That’s all for today, but AI is moving fast; subscribe.

The article presents an equation of state (EoS) for fluid and solid phases using artificial neural networks. This EoS accurately models thermophysical properties and predicts phase transitions, including the critical and triple points. This approach offers a unified way to understand different states of matter.
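As an illustration of the general idea only (the paper's actual architecture, inputs, and training data are not specified here), the sketch below shows a small network mapping density and temperature to a Helmholtz free energy, with pressure obtained by automatic differentiation so the resulting equation of state stays thermodynamically consistent. Names such as free_energy and pressure are assumptions for this example.

```python
import torch
import torch.nn as nn

# Hypothetical architecture; the published EoS network and its training differ.
model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                      nn.Linear(64, 64), nn.Tanh(),
                      nn.Linear(64, 1))

def free_energy(rho, T):
    """Helmholtz free energy per particle A(rho, T) predicted by the network."""
    inp = torch.stack([rho, T], dim=-1)
    return model(inp).squeeze(-1)

def pressure(rho, T):
    """Thermodynamically consistent pressure: P = rho^2 * dA/drho at fixed T."""
    rho = rho.clone().requires_grad_(True)
    A = free_energy(rho, T)
    dA_drho, = torch.autograd.grad(A.sum(), rho, create_graph=True)
    return rho**2 * dA_drho

rho = torch.linspace(0.1, 1.2, 50)
T = torch.full_like(rho, 1.5)
P = pressure(rho, T)  # after training, loops or flat regions in P(rho) isotherms hint at phase transitions
```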

A team from Princeton University has successfully used artificial intelligence (AI) to solve equations that control the quantum behavior of individual atoms and molecules to replicate the early stages of ice formation. The simulation shows how water molecules transition into solid ice with quantum accuracy.

Roberto Car, Princeton’s Ralph W. *31 Dornte Professor in Chemistry, who co-pioneered the approach of simulating molecular behaviors based on the underlying quantum laws more than 35 years ago, said, “In a sense, this is like a dream come true. Our hope then was that eventually, we would be able to study systems like this one. Still, it was impossible without further conceptual development, and that development came via a completely different field, that of artificial intelligence and data science.”

Modeling the early stages of freezing water, known as ice nucleation, could increase the precision of climate and weather modeling, as well as of other processes like flash-freezing food. The new approach could help track the activity of hundreds of thousands of atoms over periods thousands of times longer than in earlier studies, albeit still just fractions of a second.
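Schematically (this is not the Princeton group's code), a machine-learned force model slots into an otherwise standard molecular dynamics loop in place of an expensive quantum calculation, which is what makes the much longer simulated times affordable. In the toy sketch below, ml_forces stands in for a trained neural-network potential.

```python
import numpy as np

def ml_forces(positions):
    """Placeholder for a neural-network interatomic potential trained on quantum data.
    Here: a toy harmonic restoring force so the example runs on its own."""
    return -positions

def velocity_verlet(positions, velocities, masses, dt, n_steps, force_fn):
    """Standard velocity-Verlet integration driven by any force model."""
    forces = force_fn(positions)
    for _ in range(n_steps):
        velocities += 0.5 * dt * forces / masses[:, None]
        positions += dt * velocities
        forces = force_fn(positions)
        velocities += 0.5 * dt * forces / masses[:, None]
    return positions, velocities

rng = np.random.default_rng(1)
pos = rng.normal(size=(100, 3))   # 100 toy atoms
vel = np.zeros_like(pos)
mass = np.ones(100)
pos, vel = velocity_verlet(pos, vel, mass, dt=1e-3, n_steps=1000, force_fn=ml_forces)
```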

A new study has been published in Nature Communications, presenting the first comprehensive atlas of allele-specific DNA methylation across 39 primary human cell types. The study was led by Ph.D. student Jonathan Rosenski under the guidance of Prof. Tommy Kaplan from the School of Computer Science and Engineering and Prof. Yuval Dor from the Faculty of Medicine at the Hebrew University of Jerusalem and Hadassah Medical Center.

Using machine learning algorithms and deep whole-genome bisulfite sequencing on freshly isolated and purified cell populations, the study unveils a detailed landscape of genetic and epigenetic regulation that could reshape our understanding of gene expression and disease.

A key achievement of the research is identifying differences between the two alleles and, in some cases, demonstrating that these differences result from genomic imprinting, meaning that it is not the sequence (genetics) that matters, but rather whether the allele is inherited from the mother or the father.
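As a purely hypothetical illustration of how allele-specific methylation can show up in read-level data (not the study's actual pipeline), the sketch below clusters simulated bisulfite reads over a region into two groups by their methylation pattern; a large gap between the clusters' mean methylation is the kind of signal an imprinted locus would produce.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy read-by-CpG methylation matrix: 1 = methylated, 0 = unmethylated.
# At an imprinted locus, reads from one parental allele are mostly methylated
# and reads from the other are mostly unmethylated.
rng = np.random.default_rng(42)
maternal = (rng.random((50, 8)) < 0.9).astype(int)   # mostly methylated reads
paternal = (rng.random((50, 8)) < 0.1).astype(int)   # mostly unmethylated reads
reads = np.vstack([maternal, paternal])

# Group reads into two clusters; a wide gap in mean methylation between the
# clusters is one signature of allele-specific methylation.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reads)
for k in (0, 1):
    print(f"cluster {k}: mean methylation = {reads[labels == k].mean():.2f}")
```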

However, as with much of quantum physics, this “language”—the interaction between spins—is extraordinarily complex. While it can be described mathematically, solving the equations exactly is nearly impossible, even for relatively simple chains of just a few spins. Not exactly ideal conditions for turning theory into reality…

A model becomes reality

Researchers at Empa’s nanotech@surfaces laboratory have now developed a method that allows many spins to “talk” to each other in a controlled manner – and that also enables the researchers to “listen” to them, i.e. to understand their interactions. Together with scientists from the International Iberian Nanotechnology Laboratory and the Technical University of Dresden, they were able to precisely create an archetypal chain of electron spins and measure its properties in detail. Their results have now been published in the renowned journal Nature Nanotechnology.

Physicists have long attempted to find a single theory that unites quantum mechanics and general relativity.

This has been very tricky because quantum mechanics focuses on the unpredictable nature of particles at microscopic scales, whereas general relativity explains gravity as the curvature of spacetime caused by massive objects.

The two theories describe forces operating on very different scales. Ginestra Bianconi employed an interesting approach to deal with this challenge. She proposes an entropic action in which spacetime, instead of being a fixed background, works like a quantum operator, acting on quantum states and determining how they change over time.

NVIDIA may have just revolutionized computing forever with the launch of DIGITS, the world’s first personal AI supercomputer. By harnessing the power of GPU-accelerated deep learning—the same technology that drives top-tier high-performance computing (HPC) clusters—DIGITS shrinks massive supercomputing capabilities into a desktop-friendly system.

This compact yet powerful platform enables data scientists, researchers, and developers to rapidly train, test, and refine complex neural networks using NVIDIA’s state-of-the-art GPUs and software ecosystem. Built for deep learning, machine learning, and big data analytics, DIGITS seamlessly integrates tensor cores, parallel processing, and accelerated computing into a single, plug-and-play solution.

Researchers from the Department of Physics have managed to experimentally develop a new magnetic state: a magneto-ionic vortex or “vortion.” The research, published in Nature Communications, allows for an unprecedented level of control of magnetic properties at the nanoscale and at room temperature, and opens new horizons for the development of advanced magnetic devices.

The use of Big Data has multiplied the energy demand in information technologies. Generally, to store information, systems use electric currents to write data, which dissipates power by heating the devices. Controlling magnetic memories with voltage, instead of electric currents, can minimize this energy expenditure.

One way to achieve this is by using magneto-ionic materials, which allow for the manipulation of their magnetic properties by adding or removing ions through changes in the polarity of the applied voltage. So far, most studies in this area have focused on continuous films, rather than on controlling properties at the nanometric scale in discrete “bits,” essential for high-density data storage.