- Common ML problems, overview of (a) supervised, (b) unsupervised, and (c) reinforcement learning
- ML terminology, linear regression, training & loss, gradient descent, stochastic gradient descent
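
To make the terminology concrete, here is a minimal NumPy sketch (with illustrative toy data, not from the course notebooks) of a linear model, its MSE training loss, and one gradient descent update; stochastic gradient descent would compute the same update on a random mini-batch.

```python
import numpy as np

# Toy data: y = 3x + 2 plus noise (purely illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3 * x + 2 + 0.1 * rng.normal(size=100)

w, b = 0.0, 0.0                         # model parameters (weights)
learning_rate = 0.1                     # a hyperparameter

y_pred = w * x + b                      # the model's predictions
loss = np.mean((y_pred - y) ** 2)       # training loss: mean squared error

# Gradient of the loss with respect to each parameter.
grad_w = 2 * np.mean((y_pred - y) * x)
grad_b = 2 * np.mean(y_pred - y)

# One gradient descent step; stochastic gradient descent would use
# a random mini-batch of the data instead of the full data set.
w -= learning_rate * grad_w
b -= learning_rate * grad_b
print(loss, w, b)
```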
- Linear regression using the normal equation - NumPy implementation (see the sketch below)
  To understand the mathematics underlying the normal equation, read the following materials:
  - Chapter 4 Numerical Computation, Section 4.3 Gradient-Based Optimization
  - Chapter 5 Machine Learning Basics, Subsection 5.1.4 Example: Linear Regression
  - Additional materials: proof of the convexity of MSE and computation of the gradient of MSE
  - Colab notebook for solving linear regression using the normal equation
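
A minimal NumPy sketch of the normal-equation solution theta = (X^T X)^{-1} X^T y on synthetic data (the data and variable names are illustrative, not the notebook's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 2, size=(n, 1))
y = 4 + 3 * x + rng.normal(size=(n, 1))   # true parameters: intercept 4, slope 3

X = np.hstack([np.ones((n, 1)), x])       # add a column of ones for the intercept

# Normal equation: theta = (X^T X)^{-1} X^T y, computed with a linear solve
# instead of an explicit matrix inverse for numerical stability.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta.ravel())                      # approximately [4, 3]
```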
- Effect of learning rate on gradient descent (see the sketch below)
  - Colab notebook for experimenting with different learning rates
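
A small illustrative sketch of the effect, using gradient descent on the one-dimensional loss f(x) = x^2 rather than the notebook's setup: a very small learning rate makes slow progress, a moderate one converges, and a too-large one diverges.

```python
import numpy as np

def gradient_descent(learning_rate, steps=30, x0=5.0):
    """Minimize f(x) = x^2 (gradient 2x) starting from x0."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * 2 * x
    return x

for lr in [0.01, 0.1, 0.5, 1.01]:
    print(f"learning rate {lr:>5}: final x = {gradient_descent(lr):.4g}")
# 0.01 moves slowly toward 0, 0.1 and 0.5 converge, 1.01 overshoots and diverges.
```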
- Linear regression using gradient descent - NumPy implementation (see the sketch below)
  - Colab notebook for solving linear regression using gradient descent
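
A minimal NumPy sketch of batch gradient descent for linear regression on synthetic data (hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 2, size=(n, 1))
y = 4 + 3 * x + rng.normal(size=(n, 1))

X = np.hstack([np.ones((n, 1)), x])       # bias column + feature
theta = np.zeros((2, 1))
learning_rate = 0.1

for epoch in range(1000):
    error = X @ theta - y
    gradient = 2 / n * X.T @ error        # gradient of the MSE
    theta -= learning_rate * gradient

print(theta.ravel())                      # close to the normal-equation solution [4, 3]
```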
- Overview of TensorFlow and Keras
- Keras examples
  - Colab notebook for solving linear regression on an artificial data set
  - Colab notebook for loading and exploring the MNIST digits data set
  - Colab notebook for classifying MNIST digits with dense layers and analyzing model performance (see the sketch after this list)
  - Colab notebook for classifying MNIST fashion items with dense layers and analyzing model performance
  - Colab notebook for displaying the CIFAR10 data set
  - Try to use dense layers to classify the CIFAR10 images.
  - Colab notebook for predicting fuel efficiency using the Auto MPG data set
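
A minimal Keras sketch of the dense-layer MNIST classifier mentioned above (layer sizes and training settings are illustrative, not necessarily the notebook's):

```python
from tensorflow import keras

# Load and preprocess MNIST: flatten 28x28 images and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(512, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, batch_size=128)
test_loss, test_acc = model.evaluate(x_test, y_test)
print("test accuracy:", test_acc)
```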
- Generalization, overfitting, splitting data into train & test sets
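
A tiny NumPy sketch of a random train/test split (the 80/20 ratio and helper name are illustrative):

```python
import numpy as np

def train_test_split(X, y, test_fraction=0.2, seed=0):
    """Shuffle the indices, then hold out the first test_fraction of them."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(len(X))
    n_test = int(len(X) * test_fraction)
    test_idx, train_idx = indices[:n_test], indices[n_test:]
    return X[train_idx], X[test_idx], y[train_idx], y[test_idx]

X = np.arange(20).reshape(10, 2)
y = np.arange(10)
X_train, X_test, y_train, y_test = train_test_split(X, y)
print(X_train.shape, X_test.shape)        # (8, 2) (2, 2)
```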
- Validation
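
In Keras, a validation set can be held out directly in `fit` via `validation_split`; a minimal sketch on synthetic data (model and hyperparameters are illustrative):

```python
import numpy as np
from tensorflow import keras

# Synthetic binary classification data, purely for illustration.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 20)).astype("float32")
y = (x[:, 0] + x[:, 1] > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Keras holds out the last 20% of the data as a validation set and reports the
# validation loss and accuracy after every epoch; a rising validation loss while
# the training loss keeps falling is the classic sign of overfitting.
history = model.fit(x, y, epochs=10, batch_size=32, validation_split=0.2)
print(history.history.keys())
```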
- Logistic regression, gradients for squared error and binary cross-entropy loss functions
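
A NumPy sketch of the binary cross-entropy case: for the sigmoid model p = sigma(Xw), the gradient of the loss reduces to X^T (p - y) / N (the synthetic data and step size are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n = 500
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, 2))])   # bias + 2 features
true_w = np.array([0.5, 2.0, -1.0])
y = (rng.uniform(size=n) < sigmoid(X @ true_w)).astype(float)

w = np.zeros(3)
learning_rate = 0.5
for _ in range(2000):
    p = sigmoid(X @ w)                      # predicted probabilities
    # Binary cross-entropy: -mean(y*log(p) + (1-y)*log(1-p));
    # its gradient with respect to w simplifies to X^T (p - y) / n.
    grad = X.T @ (p - y) / n
    w -= learning_rate * grad

print(w)                                    # roughly recovers true_w
```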
- Mathematical formulation of artificial neural networks, backpropagation
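
A compact NumPy sketch of the forward pass and the backpropagation pass for a one-hidden-layer network trained with squared error (the toy data, sizes, and tanh activation are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) on [-pi, pi].
x = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(x)

# Parameters of a 1 -> 16 -> 1 network with tanh hidden activation.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros((1, 16))
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros((1, 1))
lr = 0.05

for step in range(5000):
    # Forward pass.
    z1 = x @ W1 + b1
    h = np.tanh(z1)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: apply the chain rule layer by layer.
    d_yhat = 2 * (y_hat - y) / len(x)          # dL/dy_hat
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0, keepdims=True)
    d_h = d_yhat @ W2.T
    d_z1 = d_h * (1 - np.tanh(z1) ** 2)        # tanh'(z) = 1 - tanh(z)^2
    dW1 = x.T @ d_z1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final MSE:", loss)
```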
- More Keras examples
- Deep learning for computer vision (CNN)
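
A minimal Keras sketch of a small convolutional network for 10-class image classification, e.g. on 28x28 grayscale inputs such as MNIST (the architecture is illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Train with model.fit(x_train, y_train, ...) where x_train has shape (n, 28, 28, 1).
```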
- Deep learning for text and sequences (RNN, LSTM)
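
A minimal Keras sketch of an LSTM classifier over integer-encoded text sequences (vocabulary size, sequence length, and layer sizes are illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

vocab_size = 10000      # number of distinct tokens (illustrative)
max_len = 200           # padded sequence length (illustrative)

model = keras.Sequential([
    keras.Input(shape=(max_len,)),
    layers.Embedding(input_dim=vocab_size, output_dim=32),  # token ids -> vectors
    layers.LSTM(32),                                        # summarize the sequence
    layers.Dense(1, activation="sigmoid"),                  # e.g. sentiment probability
])
model.compile(optimizer="rmsprop", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# Train with model.fit(padded_sequences, labels, ...) on data such as IMDB reviews.
```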
- Generative deep learning (VAE, GAN)
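
As a taste of the generative models, a minimal Keras sketch of the two networks of a GAN for 28x28 grayscale images (the architecture is illustrative; the training loop, which alternates discriminator and generator updates, is omitted):

```python
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 64   # size of the random noise vector fed to the generator (illustrative)

# Generator: noise vector -> 28x28 image with pixel values in [0, 1].
generator = keras.Sequential([
    keras.Input(shape=(latent_dim,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(28 * 28, activation="sigmoid"),
    layers.Reshape((28, 28, 1)),
])

# Discriminator: image -> probability that it is real rather than generated.
discriminator = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

generator.summary()
discriminator.summary()
# Training alternates: update the discriminator on real vs. generated images,
# then update the generator so the discriminator labels its samples as "real".
```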
Tools, additional materials
- Google's machine learning materials
  Nice visualizations of neural networks.