
This Machine Learning Paper Presents a General Data Generation Process for Non-Stationary Time Series Forecasting

Time series forecasting, a cornerstone challenge in machine learning, has driven advances in several domains. However, because time series data are inherently non-stationary, forecasting models struggle to generalize under distribution shifts that change over time. Two main families of techniques have been proposed to address this, based on assumptions about inter-instance and intra-instance temporal distribution shifts; both work by separating stationary from non-stationary dependencies. While existing approaches reduce the impact of temporal distribution shift, they are overly prescriptive: without known environment labels, not every sequence instance or segment can be assumed stationary.

Before the stationary and non-stationary components can be learned over time, one must first identify when the temporal distribution shift occurs. Under a non-stationarity assumption on the observations, the latent environments and the stationary/non-stationary variables can, in theory, be identified.
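To make the "identify when the shift occurs" step concrete, here is a minimal sketch (not the paper's method): a two-sample z-test between adjacent sliding windows flags points where the series' mean distribution appears to change. The window size, threshold, and synthetic data are all illustrative assumptions.

```python
import numpy as np

def detect_shifts(series, window=50, z_thresh=4.0):
    """Flag indices where the mean of the current window drifts sharply
    from the preceding window -- a crude proxy for a temporal
    distribution shift."""
    shifts = []
    for t in range(2 * window, len(series)):
        prev = series[t - 2 * window : t - window]
        curr = series[t - window : t]
        # Standard error of the difference between the two window means
        se = np.sqrt(prev.var() / window + curr.var() / window)
        if se > 0 and abs(curr.mean() - prev.mean()) / se > z_thresh:
            shifts.append(t)
    return shifts

# A series that is stationary, then jumps to a new regime at t = 200
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])
print(detect_shifts(series)[:1])  # first flagged index lands shortly after t = 200
```

Real methods in this line of work infer *latent* environments rather than thresholding a statistic, but the sketch shows why detection must precede disentangling stationary from non-stationary parts.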


The Fermi Paradox: Absent Megastructures

The great mystery of where all the aliens are in our vast Universe contemplates ancient interstellar civilizations building enormous megastructures that rival worlds or even stars in the immensity… and asks why we can’t see these giant alien artifacts.

David Brin on Event Horizon with John Michael Godier: • A.I. Wars, The Fermi Paradox and Grea…
This Week in Space with Rod Pyle: • Alien Megastructures — Isaac Arthur a…


Credits:
The Fermi Paradox: Absent Megastructures.
Science \& Futurism with Isaac Arthur.
Episode 352, July 21, 2022
Written, Produced \& Narrated by Isaac Arthur.

Engineers collaborate with ChatGPT4 to design brain-inspired chips

Johns Hopkins electrical and computer engineers are pioneering a new approach to creating neural network chips—neuromorphic accelerators that could power energy-efficient, real-time machine intelligence for next-generation embodied systems like autonomous vehicles and robots.

Electrical and computer engineering graduate student Michael Tomlinson and undergraduate Joe Li—both members of the Andreou Lab—used natural language prompts and ChatGPT4 to produce detailed instructions to build a spiking neural network chip: one that operates much like the human brain.

Through step-by-step prompts to ChatGPT4, starting with mimicking a single biological neuron and then linking more to form a network, they generated a full chip design that could be fabricated.
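The "single biological neuron, then a network" progression can be illustrated with a leaky integrate-and-fire model, the standard building block of spiking networks. This is a generic textbook sketch, not the team's actual hardware description; the threshold, leak factor, and drive values are arbitrary assumptions.

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron: the membrane potential
    leaks toward zero, accumulates input, and emits a spike (then resets)
    whenever it crosses the threshold."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i * dt          # leak, then integrate the input
        if v >= threshold:             # threshold crossing -> spike
            spikes.append(1)
            v = 0.0                    # reset after the spike
        else:
            spikes.append(0)
    return spikes

# Constant drive of 0.3 per step: the neuron charges up and fires periodically
print(lif_neuron([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Wiring many such units together, with each neuron's spikes feeding the inputs of others, yields the kind of spiking network that neuromorphic chips implement directly in silicon.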

Gödel’s Incompleteness Theorem and the Limits of AI

Gödel’s incompleteness theorems are two theorems of mathematical logic that demonstrate the inherent limitations of every formal axiomatic system capable of modelling basic arithmetic.

The first incompleteness theorem: No consistent formal system capable of modelling basic arithmetic can be used to prove all truths about arithmetic.

In other words, no matter how complex a system of mathematics is, there will always be some statements about numbers that cannot be proved or disproved within the system.
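In symbols, one common modern formulation of the first theorem (the Gödel–Rosser form, stated here over a base theory of basic arithmetic as an illustrative assumption) reads:

```latex
% For every consistent, recursively axiomatizable theory T containing
% basic arithmetic, there is a sentence G_T that T neither proves nor refutes:
T \text{ consistent and recursively axiomatizable},\; T \supseteq \text{basic arithmetic}
\;\Longrightarrow\;
\exists\, G_T :\quad T \nvdash G_T \quad\text{and}\quad T \nvdash \lnot G_T
```

The sentence \(G_T\) is true of the natural numbers but unprovable within \(T\) itself, which is exactly the gap the prose above describes.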

How AI and high-performance computing are speeding up scientific discovery

Computing has already accelerated scientific discovery. Now scientists say a combination of advanced AI with next-generation cloud computing is turbocharging the pace of discovery to speeds unimaginable just a few years ago.

Microsoft and the Pacific Northwest National Laboratory (PNNL) in Richland, Washington, are collaborating to demonstrate how this acceleration can benefit chemistry and materials science – two scientific fields pivotal to finding energy solutions that the world needs.

Scientists at PNNL are testing a new battery material that was found in a matter of weeks, not years, as part of the collaboration with Microsoft to use advanced AI and high-performance computing (HPC), a type of cloud-based computing that combines large numbers of computers to solve complex scientific and mathematical tasks.
