
Your weekly news from the AI & Machine Learning world.

OUTLINE:
0:00 — Introduction.
0:25 — AI reads brain signals to predict what you’re thinking.
3:00 — Closed-form solution for neuron interactions.
4:15 — GPT-4 rumors.
6:50 — Cerebras supercomputer.
7:45 — Meta releases metagenomics atlas.
9:15 — AI advances in theorem proving.
10:40 — Better diffusion models with expert denoisers.
12:00 — BLOOMZ & mT0.
13:05 — ICLR reviewers going mad.
21:40 — Scaling Transformer inference.
22:10 — Infinite nature flythrough generation.
23:55 — Blazing fast denoising.
24:45 — Large-scale AI training with MultiRay.
25:30 — arXiv to include Hugging Face Spaces.
26:10 — Multilingual Diffusion.
26:30 — Music source separation.
26:50 — Multilingual CLIP.
27:20 — Drug response prediction.
27:50 — Helpful Things.

ERRATA:
HF did not acquire Spaces; they launched Spaces themselves and supported Gradio from the start. They later acquired Gradio.

References:
AI reads brain signals to predict what you’re thinking.
https://mind-vis.github.io/

Brain-Machine Interface Device Predicts Internal Speech

Closed-form solution for neuron interactions.

https://github.com/raminmh/CfC/blob/main/torch_cfc.py
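
For context, the closed-form continuous-time (CfC) model replaces an ODE solver with an explicit gating formula, roughly x(t) = σ(−f(x, I)·t) ⊙ g(x, I) + (1 − σ(−f(x, I)·t)) ⊙ h(x, I). Below is a minimal PyTorch sketch of that update, assuming the formula from the CfC paper; the class name, layer shapes, and nonlinearities are illustrative, not the repo's actual API (see torch_cfc.py for that).

```python
# Minimal sketch of a closed-form continuous-time (CfC) cell,
# following the gating formula from the CfC paper. Names and
# dimensions are illustrative; see torch_cfc.py for the real code.
import torch
import torch.nn as nn

class CfCCellSketch(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # Three small heads computing f, g, h from the concatenated
        # hidden state and input, as in the closed-form solution.
        self.f = nn.Linear(input_size + hidden_size, hidden_size)
        self.g = nn.Linear(input_size + hidden_size, hidden_size)
        self.h = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, hidden, t):
        # t: elapsed time per sample, shape (batch, 1)
        z = torch.cat([x, hidden], dim=-1)
        gate = torch.sigmoid(-self.f(z) * t)
        # Closed form: interpolate between g and h with a
        # time-dependent sigmoid gate instead of solving an ODE.
        return gate * torch.tanh(self.g(z)) + (1.0 - gate) * torch.tanh(self.h(z))

# Usage: one step on random data.
cell = CfCCellSketch(input_size=8, hidden_size=16)
x = torch.randn(4, 8)
hidden = torch.zeros(4, 16)
t = torch.ones(4, 1) * 0.5
hidden = cell(x, hidden, t)
```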

GPT-4 rumors.
https://thealgorithmicbridge.substack.com/p/gpt-4-rumors-fro…ket_reader.

NASA’s Discover supercomputer simulated the extreme conditions of the distant cosmos.

A team of scientists from NASA’s Goddard Space Flight Center used the U.S. space agency’s Center for Climate Simulation (NCCS) Discover supercomputer to run 100 simulations of jets emerging from supermassive black holes.

The scientists set out to better understand these jets — massive beams of energetic particles shooting out into the cosmos — as they play a crucial role in the evolution of the universe.

Scientists from the Dutch Institute for Fundamental Energy Research (DIFFER) have created a database of 31,618 molecules that could potentially be used in future redox-flow batteries. These batteries hold great promise for energy storage. Among other things, the researchers used artificial intelligence and supercomputers to identify the molecules’ properties. Today, they publish their findings in the journal Scientific Data.

In recent years, chemists have designed hundreds of molecules that could potentially be useful in flow batteries for energy storage. It would be wonderful, researchers from DIFFER in Eindhoven (the Netherlands) imagined, if the properties of these molecules were quickly and easily accessible in a database. The problem, however, is that for many molecules the properties are not known. Examples of molecular properties are redox potential and water solubility. Those are important since they are related to the power generation capability and energy density of redox flow batteries.

To find out the still-unknown properties of the molecules, the researchers performed four steps. First, they used a computer and smart algorithms to create thousands of virtual variants of two types of molecules. These molecule families, the quinones and the aza-aromatics, are good at reversibly accepting and donating electrons. That is important for batteries. The researchers fed the computer the backbone structures of 24 quinones and 28 aza-aromatics plus five different chemically relevant side groups. From that, the computer created 31,618 different molecules.
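
The enumeration step is essentially combinatorial: each backbone exposes some number of substitution sites, and every site can carry one of the side groups (or stay bare). A toy sketch of that idea follows; the backbone names, site counts, and side groups are placeholders, not the actual DIFFER chemistry.

```python
# Toy sketch of combinatorial molecule enumeration: every substitution
# site on a backbone gets one of the side groups (or stays bare).
# Backbones, site counts, and side groups are placeholders.
from itertools import product

side_groups = ["H", "OH", "NH2", "COOH", "SO3H", "CH3"]  # "H" = unsubstituted

def enumerate_variants(backbone_name, n_sites):
    """Yield every assignment of side groups to the backbone's sites."""
    for assignment in product(side_groups, repeat=n_sites):
        yield (backbone_name, assignment)

backbones = {"quinone_A": 2, "quinone_B": 4, "aza_aromatic_A": 3}
variants = [v for name, sites in backbones.items()
            for v in enumerate_variants(name, sites)]
# 6^2 + 6^4 + 6^3 = 36 + 1296 + 216 = 1548 toy variants
print(len(variants))
```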

A multi-institution research team has developed an optical chip that can train machine learning hardware. Their research is published today in Optica.

Machine learning applications have skyrocketed to $165 billion annually, according to a recent McKinsey report. But before a machine can perform intelligent tasks such as recognizing the details of an image, it must be trained. Training modern-day artificial intelligence (AI) systems like Tesla's Autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure.

This surging AI “appetite” leaves an ever-widening gap between computer hardware and demand for AI. Photonic integrated circuits, or simply optical chips, have emerged as a possible solution to deliver higher computing performance, as measured by the number of operations performed per second per watt used, or TOPS/W. However, though they’ve demonstrated improved core operations in machine intelligence used for data classification, photonic chips have yet to improve the actual front-end learning and machine training process.
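
For concreteness, TOPS/W is just throughput divided by power draw. A quick back-of-envelope calculation, with made-up numbers rather than measurements of any real chip:

```python
# Back-of-envelope TOPS/W: tera-operations per second per watt.
# The figures below are illustrative, not measurements of any chip.
ops_per_second = 400e12   # 400 tera-operations per second
power_watts = 250         # board power draw

tops_per_watt = (ops_per_second / 1e12) / power_watts
print(f"{tops_per_watt:.2f} TOPS/W")  # 1.60 TOPS/W
```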

AI News Timestamps:
0:00 New AI Robot Dog Beats Human Soccer Skills.
2:34 Breakthrough Humanoid Robotics & AI Tech.
5:21 Google AI Makes HD Video From Text.
8:41 New OpenAI DALL-E Robotics.
11:31 Elon Musk Reveals Tesla Optimus AI Robot.
16:49 Machine Learning Driven Exoskeleton.
19:33 Google AI Makes Video Game Objects From Text.
22:12 Breakthrough Tesla AI Supercomputer.
25:32 Underwater Drone Humanoid Robot.
29:19 Breakthrough Google AI Edits Images With Text.
31:43 New Deep Learning Tech With Light Waves.
34:50 Nvidia General Robot Manipulation AI.
36:31 Quantum Computer Breakthrough.
38:00 In-Vitro Neural Network Plays Video Games.
39:56 Google DeepMind AI Discovers New Matrix Multiplication Algorithms.
45:07 New Meta Text To Video AI.
48:00 Bionic Tech Feels In Virtual Reality.
53:06 Quantum Physics AI.
56:40 Soft Robotics Gripper Learns.
58:13 New Google NLP Powered Robotics.
59:48 Ionic Chips For AI Neural Networks.
1:02:43 Machine Learning Interprets Brain Waves & Reads Mind.

At the time, all this was theoretical. But last week, the company announced they’d linked 16 CS-2s together into a world-class AI supercomputer.

Meet Andromeda

The new machine, called Andromeda, has 13.5 million cores capable of speeds over an exaflop (one quintillion operations per second) at 16-bit half precision. Due to the unique chip at its core, Andromeda isn't easily compared to supercomputers running on more traditional CPUs and GPUs, but Feldman told HPCwire that Andromeda is roughly equivalent to Argonne National Laboratory's Polaris supercomputer, which ranks 17th fastest in the world, according to the latest Top500 list.
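
Those headline figures imply a modest per-core rate, which is the point of the wafer-scale design: many simple cores rather than a few fast ones. Rough arithmetic, using only the numbers quoted above:

```python
# Rough per-core throughput implied by the quoted Andromeda figures:
# one exaflop (1e18 ops/s at FP16) spread over 13.5 million cores.
total_flops = 1e18        # "over an exaflop" at 16-bit half precision
n_cores = 13.5e6          # 13.5 million cores

per_core = total_flops / n_cores
print(f"{per_core / 1e9:.0f} GFLOPS per core")  # ~74 GFLOPS per core
```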


A supercomputer, providing massive amounts of computing power to tackle complex challenges, is typically out of reach for the average enterprise data scientist. However, what if you could use cloud resources instead? That’s the rationale that Microsoft Azure and Nvidia are taking with this week’s announcement designed to coincide with the SC22 supercomputing conference.

Nvidia and Microsoft announced that they are building a “massive cloud AI computer.” The supercomputer in question, however, is not an individually named system like Frontier at Oak Ridge National Laboratory or Perlmutter, the world’s fastest artificial intelligence (AI) supercomputer. Rather, the new AI supercomputer is a set of capabilities and services within Azure, powered by Nvidia technologies, for high-performance computing (HPC) uses.

Could energy efficiency be quantum computers’ greatest strength yet?

Quantum computers have attracted considerable interest of late for their potential to crack, in a few hours, problems that would take the age of the universe (i.e., tens of billions of years) on the best supercomputers. Their real-life applications range from drug and materials design to solving complex optimization problems. They are, therefore, primarily intended for scientific and industrial research.

Traditionally, “quantum supremacy” is sought from the point of view of raw computing power: we want to calculate (much) faster.

However, the question of their energy consumption could also now warrant research, with current supercomputers sometimes consuming as much electricity as a small town (which could, in fact, limit the growth of their computing power). Information technologies, for their part, accounted for 11% of global electricity consumption in 2020.

Why focus on the energy consumption of quantum computers?

Since a quantum computer can solve problems in a few hours that a supercomputer might take several tens of billions of years to crack, it is natural to expect that it will consume much less energy. However, building such powerful quantum computers will require solving many scientific and technological challenges, potentially over one to several decades of research.

Nuclear fusion researchers have created a machine learning algorithm to detect and track the plasma blobs that build up inside a tokamak, with applications in predicting plasma disruptions, diagnosing plasma via spectroscopy and tomography, and tracking turbulence inside the fusion reactor. Cerebras has built a new AI supercomputer with over 13.5 million processor cores and over 1 exaflop of compute power. And a new study presents an innovative neuro-computational model of the human brain which could lead to the creation of conscious AI or artificial general intelligence (AGI).
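
As a rough illustration of what detecting plasma blobs might look like computationally, here is a generic thresholding-plus-connected-components sketch; this is a textbook approach for illustration, not the researchers' actual algorithm.

```python
# Generic blob detection on a 2D plasma-density frame using
# thresholding plus connected-component labeling. Illustrative
# only; not the method from the paper.
import numpy as np
from scipy import ndimage

def detect_blobs(frame, threshold):
    """Return (centroid, size) for each above-threshold region."""
    mask = frame > threshold
    labels, n = ndimage.label(mask)  # connected components
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return list(zip(centroids, sizes))

# Usage: a synthetic frame with two bright spots.
frame = np.zeros((64, 64))
frame[10:14, 10:14] = 1.0
frame[40:45, 50:54] = 1.0
print(detect_blobs(frame, threshold=0.5))
```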

AI News Timestamps:
0:00 Breakthrough AI Runs A Nuclear Fusion Reactor.
3:07 New AI Supercomputer.
6:19 New Brain Model For Conscious AI.

#ai #ml #nuclear