
DeepMind AI invents faster algorithms to solve tough maths puzzles

Researchers at DeepMind in London have shown that artificial intelligence (AI) can find shortcuts in a fundamental type of mathematical calculation, by turning the problem into a game and then leveraging the machine-learning techniques that another of the company’s AIs used to beat human players in games such as Go and chess.

The AI discovered algorithms that break decades-old records for computational efficiency, and the team’s findings, published on 5 October in Nature, could open up new paths to faster computing in some fields.
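The calculation in question is matrix multiplication, and the classic example of such a shortcut is Strassen’s 1969 algorithm, which multiplies 2×2 matrices with seven scalar multiplications instead of the naive eight. The Python sketch below shows that classical trick for intuition only; it is not one of the AI-discovered algorithms from the paper:

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using 7 multiplications (Strassen, 1969)
    instead of the naive 8. Applied recursively to matrix blocks, this
    trick yields an O(n^2.81) algorithm; the AI searches for analogous,
    even more economical decompositions."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

# Quick check against the ordinary product:
assert strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```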

“It is very impressive,” says Martina Seidl, a computer scientist at Johannes Kepler University in Linz, Austria. “This work demonstrates the potential of using machine learning for solving hard mathematical problems.”

Ep. 102: Genetic engineering and the biological basis of intelligence. | Steve Hsu

Since the discovery of genetics, people have dreamed of being able to correct diseases, select traits in children before birth, and build better human beings. Naturally, many serious technical and ethical questions surround this endeavor. Luckily, tonight’s guest is as good a guide as we could hope to have.

Dr. Steve Hsu is Professor of Theoretical Physics and of Computational Mathematics, Science, and Engineering at Michigan State University. He has done extensive research in the field of computational genomics, and is the founder of several startups.

#geneticengineering #intelligence

Closed timelike curve

In mathematical physics, a closed timelike curve (CTC) is a world line of a material particle in spacetime, in a Lorentzian manifold, that is “closed”, returning to its starting point. This possibility was first discovered by Willem Jacob van Stockum in 1937[1] and later confirmed by Kurt Gödel in 1949,[2] who found a solution to the equations of general relativity (GR) allowing CTCs, now known as the Gödel metric; since then, other GR solutions containing CTCs have been found, such as the Tipler cylinder and traversable wormholes.
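For concreteness, the Gödel solution can be written as the line element below. This is one common convention (sign and scaling choices vary across sources), with the constant a > 0 setting the rotation scale:

```latex
% Gödel line element, one common convention; a > 0 is the rotation scale.
ds^2 = a^2 \left[ -\left( dt + e^{x}\, dz \right)^2 + dx^2 + dy^2 + \tfrac{1}{2}\, e^{2x}\, dz^2 \right]
```

The cross term between dt and dz encodes the universe’s rigid rotation, which is what tilts the light cones far from the axis and allows the closed timelike curves.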

What Are The Odds Of Alien Life? The Drake Equation


What is the Drake equation? We are talking about the odds of alien life.
Is there life out there in the Universe?
What are the chances of finding extraterrestrial life?

We don’t know the answers to a lot of questions, for example:
How many alien societies exist, and are detectable?
Even though we don’t know how to answer such a question directly, we can at least try to figure it out with a little help from our beloved… math.
First, we have to have a pretty good idea of how the universe works, of star and planetary formation, and of the conditions for life as we know it. This means we have to study and collect a lot of data. Luckily for us, we humans aren’t so bad at that. Physics, astronomy, chemistry, biology and all the natural sciences offer hints for the set of parameters that will give us an equation to calculate the number of alien societies that exist and are detectable.
Second, one has to sit down and think about which parameters should appear in the equation, and which should not.
Do you think it’s difficult? I think so.
But luckily for us, in 1961 the astronomer Frank Drake came up with a famous equation that estimates the number of transmitting societies in the Milky Way Galaxy…
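For reference, here is the equation itself in its conventional form (parameter names as standardly defined; the video’s exact notation may differ):

```latex
% Drake equation: N = expected number of detectable civilizations in the Milky Way.
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
% R_* : average rate of star formation in our Galaxy
% f_p : fraction of stars that host planetary systems
% n_e : average number of potentially life-supporting planets per such system
% f_l : fraction of those planets on which life actually appears
% f_i : fraction of life-bearing planets that develop intelligent life
% f_c : fraction of civilizations that release detectable signals into space
% L   : length of time such civilizations remain detectable
```

Multiplying the factors together turns a chain of “what fraction of…” questions into a single estimate of N.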


Some of our visual content is under an Attribution-ShareAlike license (https://creativecommons.org/licenses/) in its different versions, such as 1.0, 2.0, 3.0, and 4.0, permitting commercial sharing with attribution given in each picture accordingly in the video.

Credits: Ron Miller, Mark A. Garlick / MarkGarlick.com.

Greg Yang | Large N Limits: Random Matrices & Neural Networks | The Cartesian Cafe w/ Timothy Nguyen

Greg Yang is a mathematician and AI researcher at Microsoft Research who, for the past several years, has done incredibly original theoretical work on the understanding of large artificial neural networks. Greg received his bachelor’s degree in mathematics from Harvard University in 2018 and, while there, won the Hoopes Prize for best undergraduate thesis. He also received an Honorable Mention for the 2018 Morgan Prize for Outstanding Research in Mathematics by an Undergraduate Student and was an invited speaker at the International Congress of Chinese Mathematicians in 2019.

In this episode, we get a sample of Greg’s work, which goes under the name “Tensor Programs” and currently spans five highly technical papers. The route chosen to compress Tensor Programs into the scope of a conversational video is to place its main concepts under the umbrella of one larger, central, and time-tested idea: that of taking a large N limit. This idea occurs most famously in the Law of Large Numbers and the Central Limit Theorem, which in turn play a fundamental role in the branch of mathematics known as Random Matrix Theory (RMT). We review this foundational material and then show how Tensor Programs (TP) generalizes this classical work, offering new proofs of results in RMT. We conclude with the applications of Tensor Programs to a (rare!) rigorous theory of neural networks.
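As a small taste of the large-N theme, here is a minimal Python sketch (an illustration added here, not code from the episode): the eigenvalue histogram of a scaled N × N symmetric Gaussian matrix approaches Wigner’s semicircle law as N grows, an RMT analogue of the Law of Large Numbers.

```python
# Large-N limit in random matrix theory: the spectrum of a scaled N x N
# Wigner (symmetric Gaussian) matrix converges to the semicircle density
# rho(x) = sqrt(4 - x^2) / (2*pi) supported on [-2, 2].
import numpy as np

N = 2000
rng = np.random.default_rng(0)

G = rng.normal(size=(N, N))
W = (G + G.T) / np.sqrt(2 * N)   # off-diagonal entries now have variance 1/N

eigs = np.linalg.eigvalsh(W)     # real eigenvalues of the symmetric matrix W

# Compare the empirical spectral density with the semicircle prediction.
hist, edges = np.histogram(eigs, bins=50, range=(-2.0, 2.0), density=True)
centers = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(4.0 - centers**2) / (2.0 * np.pi)
print("max deviation from semicircle:", float(np.abs(hist - semicircle).max()))
```

Rerunning with larger N shrinks the deviation, which is exactly the “large N limit” phenomenon the episode builds on.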

Patreon: https://www.patreon.com/timothynguyen.

Part I. Introduction.
00:00:00 : Biography.
00:02:36 : Harvard hiatus 1: Becoming a DJ.
00:07:40 : I really want to make AGI happen (back in 2012).
00:09:00 : Harvard math applicants and culture.
00:17:33 : Harvard hiatus 2: Math autodidact.
00:21:51 : Friendship with Shing-Tung Yau.
00:24:06 : Landing a job at Microsoft Research: Two Fields Medalists are all you need.
00:26:13 : Technical intro: The Big Picture.
00:28:12 : Whiteboard outline.

Part II. Classical Probability Theory.
00:37:03 : Law of Large Numbers.
00:45:23 : Tensor Programs Preview.
00:47:25 : Central Limit Theorem.
00:56:55 : Proof of CLT: Moment method.
01:02:00 : Moment method explicit computations.

Part III. Random Matrix Theory.