
Europe’s fastest supercomputer just connected to a quantum computer in Finland — here’s why

The merged computing power can give rise to faster and more accurate machine learning applications.

Last month, LUMI, the fastest supercomputer in Europe, was connected to HELMI, Finland’s first quantum computer, a five-qubit system operational since 2021. This makes Finland the first country in Europe to have created such a hybrid system, and one of only a few worldwide to have done so.

LUMI is famous: the supercomputer ranks third in the latest Top500 list of the world’s fastest supercomputers and can carry out 309 petaflops. LUMI, too, became operational in 2021.

VTT Technical Research Centre of Finland worked with CSC and Aalto University, within the Finnish Quantum Computing Infrastructure framework, to make the connection between the computers, according to a release.

Fractal parallel computing, a geometry-inspired productivity booster

When making a purchase in a shopping app, we may quickly browse the recommendation list and concede that the machine does know something about us, or at least is learning to. As an effective emerging technology, machine learning (ML) has become nearly pervasive, with applications ranging from everyday apps to supercomputing.

Dedicated ML computers are thus being developed at various scales, but their productivity is somewhat limited: the workload and development cost are largely concentrated in their software stacks, which need to be developed or reworked on an ad hoc basis to support every scaled model.

To solve the problem, researchers from the Chinese Academy of Sciences (CAS) proposed a parallel computing model and published their research in Intelligent Computing on Sept. 5.

Elon Musk JUST REVEALED His NEW Secret Weapon That Will CHANGE EVERYTHING!

Elon Musk JUST REVEALED the powerful Dojo supercomputer that tripped the power grid!


Elon Musk’s inventions have always taken over the internet, and this time, yet again, his latest reveal is making headlines and people are going crazy over it! Elon Musk has just revealed the powerful Dojo supercomputer that tripped the power grid!

But what exactly is this supercomputer? How does it help? What is Elon planning to do with it?



[ML News] GPT-4 Rumors | AI Mind Reading | Neuron Interaction Solved | AI Theorem Proving

Your weekly news from the AI & Machine Learning world.

OUTLINE:
0:00 — Introduction.
0:25 — AI reads brain signals to predict what you’re thinking.
3:00 — Closed-form solution for neuron interactions.
4:15 — GPT-4 rumors.
6:50 — Cerebras supercomputer.
7:45 — Meta releases metagenomics atlas.
9:15 — AI advances in theorem proving.
10:40 — Better diffusion models with expert denoisers.
12:00 — BLOOMZ & mT0.
13:05 — ICLR reviewers going mad.
21:40 — Scaling Transformer inference.
22:10 — Infinite nature flythrough generation.
23:55 — Blazing fast denoising.
24:45 — Large-scale AI training with MultiRay.
25:30 — arXiv to include Hugging Face spaces.
26:10 — Multilingual Diffusion.
26:30 — Music source separation.
26:50 — Multilingual CLIP.
27:20 — Drug response prediction.
27:50 — Helpful Things.

ERRATA:
HF did not acquire Spaces; they launched Spaces themselves and supported Gradio from the start. They later acquired Gradio.

References:
AI reads brain signals to predict what you’re thinking.
https://mind-vis.github.io/?s=09&utm_source=pocket_saves

Brain-Machine Interface Device Predicts Internal Speech

Closed-form solution for neuron interactions.

https://github.com/raminmh/CfC/blob/main/torch_cfc.py

GPT-4 rumors.
https://thealgorithmicbridge.substack.com/p/gpt-4-rumors-fro…ket_reader.

NASA uses a climate simulation supercomputer to better understand black hole jets

NASA’s Discover supercomputer simulated the extreme conditions of the distant cosmos.

A team of scientists from NASA’s Goddard Space Flight Center used the U.S. space agency’s Center for Climate Simulation (NCCS) Discover supercomputer to run 100 simulations of jets emerging from supermassive black holes.

The scientists set out to better understand these jets — massive beams of energetic particles shooting out into the cosmos — as they play a crucial role in the evolution of the universe.

Researchers publish 31,618 molecules with potential for energy storage in batteries

Scientists from the Dutch Institute for Fundamental Energy Research (DIFFER) have created a database of 31,618 molecules that could potentially be used in future redox-flow batteries. These batteries hold great promise for energy storage. Among other things, the researchers used artificial intelligence and supercomputers to identify the molecules’ properties. Today, they publish their findings in the journal Scientific Data.

In recent years, chemists have designed hundreds of molecules that could potentially be useful in flow batteries for energy storage. It would be wonderful, researchers from DIFFER in Eindhoven (the Netherlands) imagined, if the properties of these molecules were quickly and easily accessible in a database. The problem, however, is that for many molecules the properties are not known. Examples of molecular properties are redox potential and water solubility. Those are important since they are related to the power generation capability and energy density of redox flow batteries.

To find out the still-unknown properties of the molecules, the researchers performed four steps. First, they used a supercomputer and smart algorithms to create thousands of virtual variants of two types of molecules. These molecule families, the quinones and aza-aromatics, are good at reversibly accepting and donating electrons, which is important for batteries. The researchers fed the computer backbone structures of 24 quinones and 28 aza-aromatics plus five different chemically relevant side groups. From that, the computer created 31,618 different molecules.
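The enumeration step described above is essentially combinatorial: attach each allowed side group to each substitution site of each backbone. A minimal sketch of that idea in Python, with hypothetical backbone and side-group names (the real DIFFER pipeline used actual chemical structures and richer substitution patterns, which is how it reached 31,618 molecules):

```python
from itertools import product

# Hypothetical illustration: 24 quinone and 28 aza-aromatic backbones,
# decorated with five side groups, as in the DIFFER enumeration step.
backbones = [f"quinone_{i}" for i in range(24)] + [f"aza_aromatic_{i}" for i in range(28)]
side_groups = ["-OH", "-NH2", "-COOH", "-SO3H", "-F"]  # assumed example groups

def decorate(backbone, groups):
    """Name a variant by its backbone and the side groups attached to two sites."""
    return backbone + "".join(groups)

# With two substitution sites per backbone, each backbone yields 5 * 5 variants.
variants = [decorate(b, g) for b in backbones for g in product(side_groups, repeat=2)]
print(len(variants))  # 52 backbones * 25 group combinations = 1300 variants
```

Scaling the number of substitution sites and patterns per backbone is what blows this simple product up to tens of thousands of candidate molecules.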

An optical chip that can train machine learning hardware

A multi-institution research team has developed an optical chip that can train machine learning hardware. Their research is published today in Optica.

Machine learning applications have skyrocketed to $165 billion annually, according to a recent report from McKinsey. But before a machine can perform intelligent tasks such as recognizing the details of an image, it must be trained. Training modern-day artificial intelligence (AI) systems like Tesla’s Autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure.

This surging AI “appetite” leaves an ever-widening gap between computer hardware and demand for AI. Photonic integrated circuits, or simply optical chips, have emerged as a possible solution to deliver higher computing performance, as measured by the number of operations performed per second per watt used, or TOPS/W. However, though they’ve demonstrated improved core operations in machine intelligence used for data classification, photonic chips have yet to improve the actual front-end learning and machine training process.
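The TOPS/W figure of merit mentioned above is a simple ratio: throughput in tera-operations per second divided by power draw in watts. A quick illustration with made-up numbers:

```python
# Efficiency metric for accelerators: tera-operations per second per watt.
def tops_per_watt(ops_per_second: float, watts: float) -> float:
    """Convert raw ops/s and power draw into TOPS/W (tera = 1e12)."""
    return ops_per_second / watts / 1e12

# Hypothetical chip doing 4e14 operations per second at 100 W:
print(tops_per_watt(4e14, 100.0))  # 4.0 TOPS/W
```

Raising this number is the pitch for photonic chips: more operations per joule, not just more operations per second.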

Artificial Intelligence & Robotics Tech News For October 2022

AI News Timestamps:
0:00 New AI Robot Dog Beats Human Soccer Skills.
2:34 Breakthrough Humanoid Robotics & AI Tech.
5:21 Google AI Makes HD Video From Text.
8:41 New OpenAI DALL-E Robotics.
11:31 Elon Musk Reveals Tesla Optimus AI Robot.
16:49 Machine Learning Driven Exoskeleton.
19:33 Google AI Makes Video Game Objects From Text.
22:12 Breakthrough Tesla AI Supercomputer.
25:32 Underwater Drone Humanoid Robot.
29:19 Breakthrough Google AI Edits Images With Text.
31:43 New Deep Learning Tech With Light waves.
34:50 Nvidia General Robot Manipulation AI.
36:31 Quantum Computer Breakthrough.
38:00 In-Vitro Neural Network Plays Video Games.
39:56 Google DeepMind AI Discovers New Matrix Algorithms.
45:07 New Meta Text To Video AI.
48:00 Bionic Tech Feels In Virtual Reality.
53:06 Quantum Physics AI.
56:40 Soft Robotics Gripper Learns.
58:13 New Google NLP Powered Robotics.
59:48 Ionic Chips For AI Neural Networks.
1:02:43 Machine Learning Interprets Brain Waves & Reads Mind.

This AI Supercomputer Has 13.5 Million Cores—and Was Built in Just Three Days

At the time, all this was theoretical. But last week, Cerebras announced it had linked 16 CS-2s together into a world-class AI supercomputer.

Meet Andromeda

The new machine, called Andromeda, has 13.5 million cores capable of speeds over an exaflop (one quintillion operations per second) at 16-bit half precision. Due to the unique chip at its core, Andromeda isn’t easily compared to supercomputers running on more traditional CPUs and GPUs, but Cerebras CEO Andrew Feldman told HPCwire that Andromeda is roughly equivalent to Argonne National Laboratory’s Polaris supercomputer, which ranks 17th fastest in the world, according to the latest Top500 list.
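For a rough sense of scale, dividing the headline exaflop figure by the core count gives the implied per-core throughput. This is back-of-envelope arithmetic on round assumed numbers, not a Cerebras specification:

```python
# Back-of-envelope arithmetic on the headline Andromeda numbers.
total_flops = 1e18   # "over an exaflop": 10^18 operations per second
cores = 13.5e6       # 13.5 million cores
per_core = total_flops / cores
print(f"{per_core:.2e}")  # 7.41e+10, i.e. roughly 74 GFLOPS per core at FP16
```

The modest per-core figure reflects the design: Andromeda’s wafer-scale chips trade individually powerful cores for an enormous number of small ones with fast on-wafer communication.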
