[TOC]
This page collects lecture notes and web pages that explain back-propagation.
Back-propagation is a commonly used method for training neural networks.
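Before diving into the resources below, here is a minimal sketch of what back-propagation does: a one-hidden-layer network trained on XOR with plain gradient descent, where the gradients are computed by hand via the chain rule. All names and hyperparameters here are illustrative, not taken from any of the linked courses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn XOR.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Parameters for a 2-4-1 network (illustrative sizes).
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for step in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # network output
    loss = np.mean((out - y) ** 2)  # mean squared error
    losses.append(loss)

    # Backward pass: apply the chain rule layer by layer,
    # from the loss back toward the inputs.
    d_out = 2 * (out - y) / len(X)   # dL/d(out)
    d_z2 = d_out * out * (1 - out)   # through the output sigmoid
    d_W2 = h.T @ d_z2
    d_b2 = d_z2.sum(axis=0)
    d_h = d_z2 @ W2.T                # propagate error to the hidden layer
    d_z1 = d_h * h * (1 - h)         # through the hidden sigmoid
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0)

    # Gradient-descent update.
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The courses linked below derive the same chain-rule computation in much greater depth (and in vectorized, general form); this is just the smallest end-to-end version.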
- CS-231N: Convolutional Neural Networks for Visual Recognition
    - Stanford CS-231N (Spring 2017) Syllabus
    - I posted the link to the Spring 2017 version since it has YouTube videos.
    - Backpropagation is covered in Lecture 4.
- CS-224N: Natural Language Processing with Deep Learning
    - Stanford CS-224N (Winter 2019 Link)
    - I posted the link to the Winter 2019 version since it has YouTube videos.
    - Backpropagation is covered in Lecture 4.
- CS-131: Computer Vision: Foundations and Applications
    - Stanford CS-131 Fall 2019
    - More general course; backpropagation is covered in Lectures 19 and 20.
- Review of "The Matrix Calculus You Need for Deep Learning", led by Joseph Catanzarite
    - Set of three review lectures with great mathematical detail.
    - YouTube Playlist
- 3Blue1Brown Neural Network playlist
    - This is amazing!
    - Essence of Linear Algebra playlist
    - Multivariable functions and Multivariable Calculus
- Deep Learning cheat sheet by Shervine for Stanford CS-229
    - This is amazing!