Hydrides are created by combining rare earth metals with hydrogen and then adding nitrogen or carbon. In recent years, they have offered scientists a tantalizing “working recipe” for creating superconducting materials.

Technically speaking, rare earth metal hydrides take the form of cage-like structures called clathrates, in which the rare earth metal ions act as charge-carrier donors, supplying enough electrons to promote the dissociation of the H2 molecules. Carbon and nitrogen help stabilize the material. The bottom line is that superconductivity can occur at lower pressures.

Scientists have also employed rare earth metals other than yttrium, but the resulting compounds become superconducting at pressures or temperatures that are still impractical for applications.

An introductory lecture for MIT course 6.S094 on the basics of deep learning, including a few key ideas, subfields, and the big picture of why neural networks have inspired and energized an entire new generation of researchers. For more lecture videos on deep learning, reinforcement learning (RL), artificial intelligence (AI & AGI), and podcast conversations, visit our website or follow the TensorFlow code tutorials on our GitHub repo.
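
To give a rough idea of the kind of simple TensorFlow example the outline below points to, here is a minimal sketch assuming the standard Keras API and the MNIST digits dataset; the exact code used in the lecture may differ:

```python
# Minimal "hello world" style TensorFlow/Keras sketch (illustrative only).
import tensorflow as tf

# Load the MNIST handwritten-digit dataset and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network: flatten -> hidden ReLU layer -> softmax over 10 digits.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Standard classification setup: Adam optimizer, cross-entropy loss, accuracy metric.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train briefly, then evaluate on the held-out test set.
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```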

INFO:
Website: https://deeplearning.mit.edu
GitHub: https://github.com/lexfridman/mit-deep-learning
Slides: http://bit.ly/deep-learning-basics-slides
Playlist: http://bit.ly/deep-learning-playlist
Blog post: https://link.medium.com/TkE476jw2T

OUTLINE:
0:00 – Introduction
0:53 – Deep learning in one slide
4:55 – History of ideas and tools
9:43 – Simple example in TensorFlow
11:36 – TensorFlow in one slide
13:32 – Deep learning is representation learning
16:02 – Why deep learning (and why not)
22:00 – Challenges for supervised learning
38:27 – Key low-level concepts
46:15 – Higher-level methods
1:06:00 – Toward artificial general intelligence

What are the neurons, why are there layers, and what is the math underlying it?
Written/interactive form of this series: https://www.3blue1brown.com/topics/neural-networks
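
As a rough illustration of the math behind the question above, here is a minimal sketch of what one layer computes, assuming a sigmoid activation and layer sizes chosen only for illustration:

```python
# Minimal sketch of the arithmetic inside one layer of a neural network:
# a = sigmoid(W x + b), with n inputs and k neurons (sizes are illustrative).
import numpy as np

def sigmoid(z):
    """Squash each entry of z into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

n, k = 4, 3                       # n activations in, k neurons out
rng = np.random.default_rng(0)

x = rng.random(n)                 # activations from the previous layer
W = rng.standard_normal((k, n))   # one row of weights per neuron
b = rng.standard_normal(k)        # one bias per neuron: b_1 ... b_k

a = sigmoid(W @ x + b)            # the next layer's k activations
print(a.shape)                    # -> (3,)
```

Note that the bias vector has one entry per neuron in the next layer (k entries), which is also what the typo correction below refers to.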

Additional funding for this project provided by Amplify Partners.

Typo correction: at 14:45, the last index on the bias vector is shown as n when it should in fact be k. Thanks to the sharp-eyed viewers who caught that!

For those who want to learn more, I highly recommend Michael Nielsen's book Neural Networks and Deep Learning: https://goo.gl/Zmczdy