
Nowadays, artificial neural networks have an impact on many areas of our day-to-day lives. They are used for a wide variety of complex tasks, such as driving cars, performing speech recognition (for example, Siri, Cortana, Alexa), suggesting shopping items and trends, or improving visual effects in movies (e.g., animated characters such as Thanos in Marvel's Avengers: Infinity War).

Traditionally, algorithms are handcrafted to solve complex tasks, which requires experts to spend significant time identifying the best strategies for various situations. Artificial neural networks, inspired by the interconnected neurons of the brain, can instead automatically learn a close-to-optimal solution for a given objective from data. Often, the automated learning or “training” required to obtain these solutions is “supervised” through supplementary information provided by an expert; other approaches are “unsupervised” and identify patterns in the data on their own. The mathematical theory behind artificial neural networks has evolved over several decades, yet only recently have we understood how to train them efficiently. The required calculations are very similar to those a standard video graphics card (which contains a graphics processing unit, or GPU) performs when rendering three-dimensional scenes in video games.
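To make the idea of supervised training concrete, here is a minimal sketch in plain NumPy: a tiny network learns the XOR function from four labeled examples by gradient descent. The task, network size, and learning rate are illustrative choices, not anything taken from the article.

```python
import numpy as np

# Toy supervised task: learn XOR from four labeled examples.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
Xb = np.hstack([X, np.ones((4, 1))])     # append a constant bias input

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 8))             # (input + bias) -> hidden weights
W2 = rng.normal(size=(8, 1))             # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(10_000):
    # Forward pass: predictions from the current weights.
    h = sigmoid(Xb @ W1)
    p = sigmoid(h @ W2)
    # Backward pass: gradients of the squared error via the chain rule.
    d2 = (p - y) * p * (1 - p)
    d1 = (d2 @ W2.T) * h * (1 - h)
    # Gradient-descent update: this loop is the "training".
    W2 -= 0.5 * h.T @ d2
    W1 -= 0.5 * Xb.T @ d1

print(p.round(2).ravel())  # should approach [0, 1, 1, 0]
```

The forward and backward passes in the loop are exactly the kind of dense matrix arithmetic that GPUs accelerate, which is why graphics cards became the workhorse of neural network training.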

Within a week, many world leaders went from downplaying the seriousness of coronavirus to declaring a state of emergency. Even the most capable of nations seem simultaneously confused and exasperated, with delayed responses revealing incompetence and inefficiency the world over.

This raises the question: why is it so difficult for us to comprehend the scale of what an unmitigated global pandemic could do? The answer likely relates to how we process abstract concepts like exponential growth. Part of the reason we have struggled so much to apply basic math to our practical environment is that humans think linearly. But like much of technology, biological systems such as viruses can grow exponentially.
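To see how quickly linear intuition fails, here is a small Python comparison. The starting point of 100 cases, the steady daily increment, and the 3-day doubling time are illustrative assumptions, not epidemiological figures.

```python
# Linear intuition vs. exponential reality, both starting from 100 cases:
# "100 new cases per day" vs. "cases double every 3 days".
linear = exponential = 100.0
for day in range(1, 31):
    linear += 100                  # additive growth: steady increments
    exponential *= 2 ** (1 / 3)    # multiplicative growth: x2 every 3 days
    if day % 10 == 0:
        print(f"day {day}: linear {linear:.0f}, exponential {exponential:.0f}")
# By day 30, the linear count reaches 3,100; the exponential one ~102,400.
```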

As we scramble to contain and fight the pandemic, we’ve turned to technology as our saving grace. In doing so, we’ve effectively hit a “fast-forward” button on many tech trends that were already in place. From remote work and virtual events to virus-monitoring big data, technologies that were perhaps only familiar to a fringe tech community are now entering center stage—and as tends to be the case with wartime responses, these changes are likely here to stay.

Mathematicians from the California Institute of Technology have solved an old problem related to a mathematical process called a random walk.

The team, which also worked with a colleague from Israel’s Ben-Gurion University, solved the problem in a rush after having a realization one evening. Lead author Omer Tamuz studies both economics and mathematics, using probability theory and ergodic theory as the link—a progressive and blended approach that this year’s Abel Prize-winning mathematicians helped to trailblaze.
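The article does not spell out the specific problem the team solved, but for readers new to the process itself, here is a minimal sketch of the simplest case, a symmetric random walk on the integers:

```python
import random

def random_walk(steps, seed=42):
    """Simple symmetric random walk on the integers:
    each step moves +1 or -1 with equal probability."""
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

print(random_walk(10))  # one sample path, e.g. [0, 1, 0, -1, ...]
```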

The Royal Society is to create a network of disease modelling groups amid academic concern about the nation’s reliance on a single group of epidemiologists at Imperial College London, whose predictions have dominated government policy, including the current lockdown.

It is to bring in modelling experts from fields as diverse as banking, astrophysics and the Met Office to build new mathematical representations of how the coronavirus epidemic is likely to spread across the UK — and how the lockdown can be ended.

The first public signs of academic tensions over Imperial’s domination of the debate came when Sunetra Gupta, professor of theoretical epidemiology at Oxford University, published a paper suggesting that some of Imperial’s key assumptions could be wrong.

Governments across the world are relying on mathematical projections to help guide decisions in this pandemic. Computer simulations account for only a fraction of the data analyses that modelling teams have performed in the crisis, notes Neil Ferguson of Imperial College London, but they are an increasingly important part of policymaking. As he and other modellers warn, however, much about how SARS-CoV-2 spreads is still unknown and must be estimated or assumed, and that limits the precision of forecasts. An earlier version of the Imperial model, for instance, estimated that SARS-CoV-2 would be about as severe as influenza in necessitating the hospitalization of those infected. That turned out to be incorrect.


How epidemiologists rushed to model the coronavirus pandemic.
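None of the excerpts above describe the models’ internals, so as an illustration only, here is a minimal sketch of the classic SIR compartmental model, a common starting point for epidemic projections. The population and rate parameters are invented for the example and are not Imperial’s.

```python
# Minimal SIR compartmental model, integrated with simple Euler steps.
# Parameters are illustrative: beta is the daily transmission rate,
# gamma the daily recovery rate (so beta / gamma is the basic
# reproduction number, here 3.0).
def sir(population, infected0, beta, gamma, days, dt=0.1):
    s, i, r = population - infected0, float(infected0), 0.0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / population * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

# Roughly the UK population, seeded with 1,000 infections (assumptions).
s, i, r = sir(population=66e6, infected0=1000, beta=0.3, gamma=0.1, days=120)
print(f"susceptible={s:.0f} infected={i:.0f} recovered={r:.0f}")
```

Real policy models layer far more structure on top of this: age groups, households, schools and workplaces, hospital capacity, and the behavioural changes that interventions such as a lockdown force.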

Using the same technology that allows high-frequency signals to travel on regular phone lines, researchers tested sending extremely high-frequency, 200 GHz signals through a pair of copper wires. The result is a link that can move data at rates of terabits per second, significantly faster than currently available channels.
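The article doesn’t show the arithmetic behind “terabits per second,” but the Shannon-Hartley law gives a feel for how bandwidth and signal quality bound the achievable data rate. The bandwidth and signal-to-noise figures below are illustrative assumptions, not measurements from the paper.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley upper bound on error-free data rate, in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers only: 100 GHz of usable bandwidth and a
# 30 dB signal-to-noise ratio already put the bound at ~1 Tbit/s.
bandwidth_hz = 100e9
snr_linear = 10 ** (30 / 10)
print(f"{shannon_capacity(bandwidth_hz, snr_linear) / 1e12:.2f} Tbit/s")
```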

While the technology to disentangle multiple, parallel signals moving through a cable already exists, thanks to signal processing methods developed by John Cioffi, the inventor of the digital subscriber line (DSL), questions remained about how effective these ideas would be at higher frequencies.
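The article doesn’t detail those methods, but in DSL-style vectoring the disentangling step is commonly modeled as undoing a linear mixing of the parallel signals once the crosstalk (channel) matrix has been estimated. Here is a toy NumPy sketch under that assumption; the channel matrix, noise level, and symbols are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: 4 parallel lines; received = H @ sent + noise, where the
# off-diagonal entries of H represent crosstalk between wire pairs.
n_lines = 4
H = np.eye(n_lines) + 0.2 * rng.normal(size=(n_lines, n_lines))
sent = rng.choice([-1.0, 1.0], size=n_lines)          # one symbol per line
received = H @ sent + 0.01 * rng.normal(size=n_lines)

# "Disentangling": with the channel matrix estimated (e.g. from known
# training signals), undo the mixing by solving the linear system.
recovered = np.linalg.solve(H, received)
print(np.sign(recovered) == sent)   # all True when the noise is small
```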

To test the transmission of signals at higher frequencies, the authors of a paper published this week in Applied Physics Letters used experimental measurements and mathematical modeling to characterize the input and output signals in a pair of copper wires.