The main ideas behind Backpropagation are super simple, but there are tons of details when it comes time to implement it. This video shows how to optimize three parameters in a Neural Network simultaneously and introduces some Fancy Notation.
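To give you a taste of what that looks like, here is a minimal Python sketch (this is not code from the video; the parameter names w3, w4, and b3 and the toy data are assumptions for illustration) of using Gradient Descent and the Chain Rule to optimize two weights and a bias at the same time:
```python
# A minimal sketch (assumed names, toy data) of optimizing three
# parameters -- two weights (w3, w4) and a bias (b3) -- simultaneously
# with Gradient Descent and the chain rule.
import numpy as np

# Toy data: fixed hidden-node activations y1, y2 and observed values (all assumed).
y1 = np.array([0.0, 0.6, 1.0])        # activation of hidden node 1
y2 = np.array([1.0, 0.4, 0.0])        # activation of hidden node 2
observed = np.array([0.0, 1.0, 0.0])  # observed outputs

w3, w4, b3 = 0.1, -0.1, 0.0           # initial guesses
learning_rate = 0.1

for step in range(1000):
    predicted = w3 * y1 + w4 * y2 + b3
    residual = observed - predicted
    # Chain rule: d(SSR)/d(w3) = sum over the data of -2 * residual * y1, etc.
    d_ssr_d_w3 = np.sum(-2 * residual * y1)
    d_ssr_d_w4 = np.sum(-2 * residual * y2)
    d_ssr_d_b3 = np.sum(-2 * residual)
    # Take one Gradient Descent step for all three parameters at once.
    w3 -= learning_rate * d_ssr_d_w3
    w4 -= learning_rate * d_ssr_d_w4
    b3 -= learning_rate * d_ssr_d_b3

print(w3, w4, b3)
```
Note that every step evaluates all three derivatives at the current parameter values and then updates all three parameters together, which is the simultaneous optimization the video walks through.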
NOTE: This StatQuest assumes that you already know the main ideas behind Backpropagation: • Neural Networks Pt. 2: Backpropagation Mai...
...and that also means you should be familiar with...
Neural Networks: • The Essential Main Ideas of Neural Networks
The Chain Rule: • The Chain Rule, Clearly Explained!!!
Gradient Descent: • Gradient Descent, Step-by-Step
LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: sebastianraschka.com/faq/docs/backprop-arbitrary.h…
For a complete index of all the StatQuest videos, check out:
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Patreon: www.patreon.com/statquest
...or...
YouTube Membership: youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
statquest.org/statquest-store/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
twitter.com/joshuastarmer
0:00 Awesome song and introduction
3:01 Derivatives do not change when we optimize multiple parameters
6:28 Fancy Notation
10:51 Derivatives with respect to two different weights (see the chain rule sketch below)
15:02 Gradient Descent for three parameters
17:19 Fancy Gradient Descent Animation
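As a taste of the Fancy Notation from 6:28 and the derivatives from 10:51, here is one way the Chain Rule can be written for the derivative of the Sum of the Squared Residuals (SSR) with respect to a single weight (a sketch only; the symbols w_3, y_1, and Predicted are assumptions and may not match the video's exact notation):
```latex
% Assuming Predicted_i = w_3 y_{1,i} + w_4 y_{2,i} + b_3
\frac{d\,\mathrm{SSR}}{d\,w_3}
  = \sum_{i=1}^{n}
      \frac{d\,\mathrm{SSR}}{d\,\mathrm{Predicted}_i}
      \times
      \frac{d\,\mathrm{Predicted}_i}{d\,w_3}
  = \sum_{i=1}^{n} -2\,(\mathrm{Observed}_i - \mathrm{Predicted}_i)\, y_{1,i}
```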
#StatQuest #NeuralNetworks #Backpropagation