Notes from diving deep into deep learning with PyTorch: General Concepts with Code Examples
Goes over:
- Why good initialization matters, and why we want activations to have a mean close to 0 and a standard deviation close to 1
- Variance, Standard Deviation, and Mean Absolute Deviation - when to use each and why
- Activation explosion
- Basic training loop
- Convolutions
- Callbacks and why to use them
- Math essentials
- Hooks (PyTorch's version of callbacks)
- When and why to use softmax vs. binomial when interpreting activations
- Batch Normalization
- Exponentially weighted moving averages
- Small Batch / Batch of 1 Normalization Techniques
And more!
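As a taste of the kind of concept covered, here is a minimal pure-Python sketch of the exponentially weighted moving average from the list above, assuming the standard update rule `avg = beta * avg + (1 - beta) * x` (the same form BatchNorm uses for its running mean and variance); the choice of `beta=0.9` is illustrative, not from the notes.

```python
def ewma(values, beta=0.9):
    """Exponentially weighted moving average.

    Each step blends the running average with the newest value:
        avg = beta * avg + (1 - beta) * x
    Higher beta -> smoother, slower-moving average.
    """
    avg = values[0]          # seed with the first observation
    out = [avg]
    for x in values[1:]:
        avg = beta * avg + (1 - beta) * x
        out.append(avg)
    return out

# A constant series stays constant; a step input moves only 10% per step.
print(ewma([1.0, 1.0, 1.0]))
print(ewma([0.0, 1.0, 1.0]))
```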