
In a classic episode of an old-school TV comedy called I Love Lucy, we see Lucille Ball stepping into an assembly line job at a candy factory. As the pace of the conveyor belt exceeds her ability to wrap the candy, the frenzy gets the best of her. She shoves candy into her pockets, into her hat, into her mouth—it’s a job fail.

As we know, faster doesn’t always mean better. And precision can take a big bite out of speed.

Sometimes, though, innovative minds come up with a new strategy that improves both efficiency and precision.

This video was made possible by Squarespace. Sign up with this link and get 10% off your purchase of a website or domain after your free trial! https://squarespace.com/singularity

In the last video in this series, we discussed the biologically inspired structure of deep learning neural networks and built up an abstracted model based on it. We then went through the basics of how this model is able to form representations from input data.

This video picks up right where the last one left off, as we delve deeper into the structure and mathematics of neural nets to see how they form their pattern-recognition capabilities!
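As a rough reference point for the kind of abstracted model the series builds on, here is a minimal sketch (not the video's own code) of a single fully connected layer: a weighted sum of the inputs plus a bias, passed through an activation function. The layer sizes, random weights, and the sigmoid activation are illustrative assumptions, not something taken from the video.

```python
import numpy as np

def sigmoid(z):
    # Squash each weighted sum into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # example input vector with 3 features
W = rng.normal(size=(4, 3))   # weights: 4 neurons, each connected to the 3 inputs
b = np.zeros(4)               # one bias per neuron

a = sigmoid(W @ x + b)        # weighted sums plus biases, then the activation
print(a)                      # the layer's 4 output activations
```

Stacking layers like this one, and then tuning the weights and biases, is what gives the network the pattern-recognition behavior explored in the video.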

Thank you to the patron(s) who supported this video ➤