Engineering researchers at Lehigh University have discovered that sand can actually flow uphill.

The team’s findings were published today in the journal Nature Communications. A corresponding video shows what happens when torque and an attractive force are applied to each grain: the grains flow uphill, up walls, and up and down stairs.

“After using equations that describe the flow of granular materials,” says James Gilchrist, the Ruth H. and Sam Madrid Professor of Chemical and Biomolecular Engineering in Lehigh’s P.C. Rossin College of Engineering and Applied Science and one of the authors of the paper, “we were able to conclusively show that these particles were indeed moving like a granular material, except they were flowing uphill.”

A team of researchers in Japan claims to have figured out a way to translate the clucking of chickens with the use of artificial intelligence.

As detailed in a yet-to-be-peer-reviewed preprint, the team led by University of Tokyo professor Adrian David Cheok, who has previously studied sex robots, came up with a “system capable of interpreting various emotional states in chickens, including hunger, fear, anger, contentment, excitement, and distress” by using a “cutting-edge AI technique we call Deep Emotional Analysis Learning.”

They say the technique is “rooted in complex mathematical algorithms” and can even be used to adapt to the ever-changing vocal patterns of chickens, meaning that it only gets better at deciphering “chicken vocalizations” over time.

Quantum behavior is a strange, fragile thing that hovers on the edge of reality, between a world of possibility and a Universe of absolutes. In that mathematical haze lies the potential of quantum computing: the promise of devices that could quickly solve problems that would take classical computers too long to process.

For now, quantum computers are confined to cooled environments close to absolute zero (−273 degrees Celsius), where particles are less likely to tumble out of their critical quantum states.

Breaking through this temperature barrier by developing materials that still exhibit quantum properties at room temperature has long been a goal of quantum computing. Though the low temperatures help keep the particles’ properties from collapsing out of their useful fog of possibility, the bulk and expense of the cooling equipment limit the technology’s potential and its ability to be scaled up for general use.

The API-AI nexus isn’t just for tech enthusiasts; its influence has widespread real-world implications. Consider the healthcare sector, where APIs can allow diagnostic AI algorithms to access patient medical records while adhering to privacy regulations. In the financial sector, advanced APIs can connect risk-assessment AIs to real-time market data. In education, APIs can provide the data backbone for AI algorithms designed to create personalized, adaptive learning paths.
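To make the healthcare example concrete, here is a minimal sketch of the pattern described above: an API layer that checks the caller's authorization before handing patient records to a diagnostic model. The endpoint path, scope name, and the `api_client`/`model` interfaces are illustrative assumptions, not any real vendor's API.

```python
# Hypothetical sketch: an API layer that enforces access scopes before a
# diagnostic AI model ever sees patient data. The endpoint, scope name, and
# client/model interfaces are illustrative assumptions.

class PermissionDenied(Exception):
    pass

def fetch_patient_records(api_client, patient_id, granted_scopes):
    """Return a patient's records only if the caller holds the required scope."""
    if "records:read" not in granted_scopes:
        raise PermissionDenied("caller lacks the scope required by policy")
    # api_client.get stands in for whatever HTTP client the service uses
    return api_client.get(f"/patients/{patient_id}/records")

def run_diagnostic(model, api_client, patient_id, granted_scopes):
    """Feed only authorized data to the diagnostic model."""
    records = fetch_patient_records(api_client, patient_id, granted_scopes)
    return model.predict(records)
```

The same gatekeeping pattern applies to the finance and education examples: the API, not the model, decides what data the AI is allowed to see.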

However, this fusion of AI and APIs also raises critical questions about data privacy, ethical use and governance. As we continue to knit together more aspects of our digital world, these concerns will need to be addressed to foster a harmonious and responsible AI-API ecosystem.

We stand at the crossroads of a monumental technological paradigm shift. As AI continues to advance, APIs are evolving in parallel to unlock and amplify this potential. If you’re in the realm of digital products, the message is clear: The future is not just automated; it’s API-fied. Whether you’re a developer, a business leader or an end user, this new age promises unprecedented levels of interaction, personalization and efficiency, but it’s up to us to navigate it responsibly.

Kevin Slagle, Quantum 7, 1113 (2023).

Although tensor networks are powerful tools for simulating low-dimensional quantum physics, tensor network algorithms are very computationally costly in higher spatial dimensions. We introduce $\textit{quantum gauge networks}$: a different kind of tensor network ansatz for which the computation cost of simulations does not explicitly increase for larger spatial dimensions. We take inspiration from the gauge picture of quantum dynamics, which consists of a local wavefunction for each patch of space, with neighboring patches related by unitary connections. A quantum gauge network (QGN) has a similar structure, except the Hilbert space dimensions of the local wavefunctions and connections are truncated. We describe how a QGN can be obtained from a generic wavefunction or matrix product state (MPS). All $2k$-point correlation functions of any wavefunction for $M$ many operators can be encoded exactly by a QGN with bond dimension $O(M^k)$. In comparison, for just $k=1$, an exponentially larger bond dimension of $2^{M/6}$ is generically required for an MPS of qubits. We provide a simple QGN algorithm for approximate simulations of quantum dynamics in any spatial dimension. The approximate dynamics can achieve exact energy conservation for time-independent Hamiltonians, and spatial symmetries can also be maintained exactly. We benchmark the algorithm by simulating the quantum quench of fermionic Hamiltonians in up to three spatial dimensions.
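For readers who want a feel for the structure the abstract describes, the toy sketch below lays out QGN-style data: one truncated local wavefunction per patch of a small 1D chain, with neighboring patches related by unitary connections. It only illustrates that layout under arbitrary choices of chain length and bond dimension; it is not the paper's simulation algorithm.

```python
# Toy illustration of the QGN data layout described above: local wavefunctions
# with truncated dimension, related across neighboring patches by unitary
# connections. This is not the paper's algorithm; sizes are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n_patches, chi = 6, 8  # number of spatial patches, truncated local dimension

def random_unitary(d):
    """Random unitary from the QR decomposition of a complex Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # fix the phases of q's columns

# One local wavefunction per patch, built to be consistent with the connections.
psi = [np.zeros(chi, dtype=complex) for _ in range(n_patches)]
psi[0] = rng.normal(size=chi) + 1j * rng.normal(size=chi)
psi[0] /= np.linalg.norm(psi[0])

connections = [random_unitary(chi) for _ in range(n_patches - 1)]
for j in range(n_patches - 1):
    psi[j + 1] = connections[j] @ psi[j]  # neighboring patches related unitarily

# Sanity check: every neighboring pair is related by its unitary connection.
for j in range(n_patches - 1):
    assert np.allclose(psi[j + 1], connections[j] @ psi[j])
print("local wavefunctions:", len(psi), "connections:", len(connections))
```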

When people program new deep learning AI models (those that can focus on the right features of data by themselves), the vast majority rely on optimization algorithms, or optimizers, to ensure the models reach a high enough level of accuracy. But one of the most commonly used classes of optimizers, derivative-based optimizers, runs into trouble handling real-world applications.

In a new paper, researchers from DeepMind propose a new approach: Optimization by PROmpting (OPRO), a method that uses large language models (LLMs) as optimizers. The unique aspect of this approach is that the optimization task is defined in natural language rather than through formal mathematical definitions.

The researchers write, “Instead of formally defining the optimization problem and deriving the update step with a programmed solver, we describe the optimization problem in natural language, then instruct the LLM to iteratively generate new solutions based on the problem description and the previously found solutions.”
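In code terms, the loop the researchers describe looks roughly like the sketch below. Here `query_llm` and `score_solution` are hypothetical stand-ins for the LLM call and the task-specific evaluator; this is a paraphrase of the idea, not DeepMind's released implementation.

```python
# Rough sketch of an OPRO-style loop: the problem is described in natural
# language, and the LLM is repeatedly asked for new candidate solutions given
# the best solutions found so far. query_llm and score_solution are
# hypothetical stand-ins, not part of the paper's code.

def optimize_with_llm(problem_description, query_llm, score_solution,
                      n_steps=20, top_k=5):
    history = []  # (solution, score) pairs found so far
    for _ in range(n_steps):
        best = sorted(history, key=lambda p: p[1], reverse=True)[:top_k]
        prompt = (
            f"Task: {problem_description}\n"
            "Previously proposed solutions and their scores:\n"
            + "\n".join(f"- {sol} (score: {score:.3f})" for sol, score in best)
            + "\nPropose one new solution that improves on these. "
              "Reply with the solution only."
        )
        candidate = query_llm(prompt)                   # natural-language step
        history.append((candidate, score_solution(candidate)))
    return max(history, key=lambda p: p[1])             # best (solution, score)
```

In the paper, one showcase application of this loop is prompt optimization, where the "solutions" are candidate instructions and the score is the accuracy they achieve on a task.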

Farmers across the United States will be able to monitor their crops in real time, thanks to a novel algorithm from researchers in South Dakota State University’s Geospatial Sciences Center of Excellence.

Two years ago, Yu Shen, research assistant in the GSCE, and Xiaoyang Zhang, professor in the Department of Geography and Geospatial Sciences and co-director at the GSCE, began investigating if it would be possible to make crop monitoring more efficient.

“Previously, crop progress was monitored by visually looking at the plants,” Shen explained.

For eons, deoxyribonucleic acid (DNA) has served as a sort of instruction manual for life, providing not just templates for a vast array of chemical structures but a means of managing their production.

In recent years, engineers have explored a subtly new role for the molecule’s unique capabilities: as the basis for a biological computer. Yet despite the 30 years that have passed since the first prototype, most DNA computers have struggled to process more than a few tailored algorithms.

A team of researchers from China has now come up with a DNA integrated circuit (DIC) that’s far more general purpose. Their liquid computer’s gates can form an astonishing 100 billion different circuits, each capable of running its own program, demonstrating the system’s versatility.

Here’s my new article for Aporia magazine, the final futurist story in my 4-part series for them!


Written by Zoltan Istvan.

I met my wife on Match.com 15 years ago. She didn’t have a picture on her profile, but she had written a strong description of herself. It was enough to warrant a first date, and we got married a year later.

But what if ordinary dating sites allowed users to see their potential date naked using advanced AI that could “virtually undress” that person? Let’s take it a step further. What if they gave users the option to have virtual sex with their potential date using deepfake technology, before they ever met them in person? Some of this technology is already here. And it’s prompting a lot of thorny questions – not just for dating sites but for anyone who uses the web.

Whether you like it or not, people are increasingly seeing art that was generated by computers. Everyone has an opinion about it, but researchers at the University of Vienna recently ran a small study to find out how people actually perceive computer-generated art.

In the study, led by Theresa Demmer, people were shown abstract art of black and white blocks in a grid. The art was either generated by a human artist or by a random number generator.

“For the computer-generated images, we avoided using AI or a self-learning algorithm trained on human-generated images but chose to use a very simple algorithm instead,” Demmer told the University of Vienna. “The goal of this approach was to produce…
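For a sense of how simple such a non-learning generator can be, here is a minimal sketch in the spirit of the description above; the grid size and the 50/50 black-white split are assumptions, since the study's exact parameters aren't given here.

```python
import random

def random_block_image(rows=5, cols=5, seed=None):
    """Generate a grid of black (1) and white (0) blocks with a plain RNG,
    i.e. a very simple non-AI generator of the kind the study describes.
    Grid size and the 50/50 split are illustrative assumptions."""
    rng = random.Random(seed)
    return [[rng.randint(0, 1) for _ in range(cols)] for _ in range(rows)]

if __name__ == "__main__":
    for row in random_block_image(seed=42):
        print("".join("█" if cell else "·" for cell in row))
```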

