
briannaflynn/DL-PyTorch-Notes


DL-PyTorch-Notes

Notes from diving deep into implementing PyTorch for Deep Learning: General Concepts with Code Examples

Goes over:

  • Why good initialization matters - and why we want activations to have a mean close to 0 and a standard deviation close to 1
  • Variance, Standard Deviation, and Mean Absolute Deviation - when to use each and why
  • Activation explosion
  • Basic training loop
  • Convolutions
  • Callbacks and why to use them
  • Math essentials
  • Hooks (PyTorch's version of callbacks)
  • When and why to use Softmax vs Binomial when interpreting activations
  • Batch Normalization
  • Exponentially weighted moving averages
  • Small Batch / Batch of 1 Normalization Techniques

And more!
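As a quick taste of the spread statistics covered above (variance, standard deviation, and mean absolute deviation), here is a minimal sketch of their definitions in plain Python; the function names are illustrative, not taken from the notes themselves:

```python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    # Average of squared deviations from the mean.
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def std(xs):
    # Square root of variance; same units as the data.
    return variance(xs) ** 0.5

def mad(xs):
    # Mean absolute deviation: average of |x - mean|.
    # Squaring (variance/std) weights outliers more heavily than MAD does.
    m = mean(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

data = [1.0, 2.0, 3.0, 4.0]
print(variance(data))  # 1.25
print(mad(data))       # 1.0
```

The squaring in variance is why a single extreme activation can dominate it, which is part of the "when to use each" discussion.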
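On interpreting activations, the softmax vs. binomial distinction can be sketched in plain Python (a hedged illustration, not code from the notes): softmax produces one probability distribution across mutually exclusive classes, while a per-activation sigmoid treats each output as an independent probability, which suits multi-label problems:

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(x):
    # "Binomial" view: each activation is its own yes/no probability,
    # so the outputs need not sum to 1.
    return 1.0 / (1.0 + math.exp(-x))

probs = softmax([1.0, 2.0, 3.0])
print(sum(probs))  # softmax outputs always sum to 1
```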
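The exponentially weighted moving average topic ties into Batch Normalization, which tracks running mean and variance across batches this way. A minimal sketch (parameter names like `beta` are a common convention, assumed here rather than quoted from the notes):

```python
def ewma(xs, beta=0.9):
    # avg_t = beta * avg_{t-1} + (1 - beta) * x_t
    # Bias correction divides by (1 - beta**t) to compensate
    # for initializing the average at zero.
    avg = 0.0
    corrected = []
    for t, x in enumerate(xs, start=1):
        avg = beta * avg + (1 - beta) * x
        corrected.append(avg / (1 - beta ** t))
    return corrected

# On a constant signal, the bias-corrected average recovers
# the true value immediately at every step.
print(ewma([1.0, 1.0, 1.0]))
```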
