DL-PyTorch-Notes

Notes from diving deep into PyTorch for Deep Learning: General Concepts with Code Examples

Goes over:

  • Why good initialization matters - and why we want activations to have a mean close to 0 and a standard deviation close to 1
  • Variance, Standard Deviation, and Mean Absolute Deviation - when to use each and why
  • Activation explosion - what it is, why it's bad, and how to prevent it
  • Basic training loop
  • Convolutions
  • Callbacks and why to use them
  • Math essentials
  • Hooks (PyTorch's version of callbacks)
  • When and why to use Softmax vs Binomial when interpreting activations
  • Hyperparameter scheduling - Why treating a model differently at different stages of the training 'life cycle' often leads to better performance
  • Batch Normalization
  • Exponentially weighted moving averages
  • Use of epsilon as a hyperparameter
  • Small Batch / Batch of 1 Normalization Techniques

And more!
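The first bullets above lend themselves to a quick numerical demo. Here is a minimal sketch (plain NumPy, no PyTorch required; the depth and width are arbitrary illustrative choices, not values from the notes) of why initialization matters: naive unit-variance weights blow activations up layer by layer, while scaling each matrix by 1/sqrt(fan_in) keeps the mean near 0 and the standard deviation near 1.

```python
import numpy as np

rng = np.random.default_rng(0)
width, depth = 512, 50
x = rng.standard_normal((512, width))  # a batch of unit-variance inputs

# Naive N(0, 1) weights: the std multiplies by roughly sqrt(width) at
# every layer, so activations explode within a few dozen layers.
a = x
for _ in range(depth):
    a = a @ rng.standard_normal((width, width))
print(f"naive init  -> std {a.std():.2e}")   # astronomically large

# Scaling each weight matrix by 1/sqrt(fan_in) (Xavier/Kaiming-style)
# keeps the activations well-behaved: mean near 0, std near 1.
a = x
for _ in range(depth):
    a = a @ (rng.standard_normal((width, width)) / np.sqrt(width))
print(f"scaled init -> mean {a.mean():+.2f}, std {a.std():.2f}")
```

This is the same effect the notes measure with hooks inside a real network; the matrix product is just the linear-layer stack with the nonlinearities stripped out.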

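Similarly, the exponentially weighted moving average that batch norm uses for its running statistics, and the role of epsilon in the normalization step, fit in a few lines. This is a hedged illustration only: the momentum of 0.1, the batch shapes, and the distribution are assumptions chosen to mirror common defaults, not values taken from the notes.

```python
import numpy as np

def update_running(running, batch_stat, momentum=0.1):
    """One EWMA step: nudge the running estimate toward the new batch statistic."""
    return (1 - momentum) * running + momentum * batch_stat

rng = np.random.default_rng(0)

# Track running mean/variance across batches, batch-norm style.
run_mean, run_var = 0.0, 1.0
for _ in range(200):
    batch = rng.normal(loc=3.0, scale=2.0, size=64)
    run_mean = update_running(run_mean, batch.mean())
    run_var = update_running(run_var, batch.var())

# At inference time, normalize with the running stats; epsilon keeps the
# division stable if the running variance ever gets close to zero.
eps = 1e-5
x = rng.normal(loc=3.0, scale=2.0, size=64)
x_hat = (x - run_mean) / np.sqrt(run_var + eps)
print(run_mean, run_var)          # converge toward 3.0 and 4.0
print(x_hat.mean(), x_hat.std())  # roughly 0 and 1
```

The EWMA is why batch norm behaves sensibly at inference even with a batch of 1: the statistics come from the smoothed running estimates, not from the (degenerate) current batch.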