
I think FastAI lesson 3 in "Practical Deep Learning for Coders" has one of the most intuitive buildups of gradient descent and loss that I've seen.* Lecture [1], book chapter [2].

It doesn't go into the math, but I don't think that's a bad thing for beginners.

If you want something more mathematical, 3blue1brown has a great series of videos [3] on the topic.

[1] https://www.youtube.com/watch?v=hBBOjCiFcuo&t=1932s

[2] https://github.com/fastai/fastbook/blob/master/04_mnist_basi...

[3] https://www.youtube.com/watch?v=aircAruvnKk

* I've been messing around with this stuff since 2016 and have done a few different courses, including the original Andrew Ng course.
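
To give a flavour of the kind of thing that chapter builds up to, here's a rough sketch of a manual gradient descent loop. This is just my own toy PyTorch example (fitting a line y = 3x + 2), not code from the lecture or the book, so take the details with a grain of salt:

    import torch

    # Toy data: points roughly on the line y = 3x + 2 (my own example,
    # not the book's; the chapter does something similar before MNIST).
    xs = torch.linspace(-1, 1, 50)
    ys = 3 * xs + 2 + 0.1 * torch.randn(50)

    # Two parameters (slope, intercept) we want gradient descent to recover.
    params = torch.randn(2, requires_grad=True)

    def mse_loss(preds, targets):
        # The "loss": mean squared error, one number saying how wrong we are.
        return ((preds - targets) ** 2).mean()

    lr = 0.1
    for step in range(100):
        preds = params[0] * xs + params[1]   # predict
        loss = mse_loss(preds, ys)           # measure how bad the predictions are
        loss.backward()                      # compute d(loss)/d(params)
        with torch.no_grad():
            params -= lr * params.grad       # step downhill
            params.grad.zero_()              # reset gradients for the next step

    print(params)  # should end up near (3, 2)

The whole idea is that loop: predict, measure the loss, take the gradient, nudge the parameters downhill, repeat.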



I just did the 4th chapter of the book today (04_mnist_basics). Very educational.



