Tutorial Website: http://ufldl.stanford.edu/wiki/index.php/UFLDL_Tutorial
Sparse Autoencoder vectorized implementation, learning/visualizing features on MNIST data
- ex_1_sparse_autoencoder.py
- ex_2_sparse_autoencoder_vect.py
- load_MNIST.py: Load MNIST images
- sample_images.py: Load sample images for testing the sparse autoencoder
- gradient.py: Functions to compute & check cost and gradient
- display_network.py: Display visualized features
- sparse_autoencoder.py: Sparse autoencoder cost & gradient functions
- train.py: Train sparse autoencoder on MNIST data and visualize learned features
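As a rough illustration of what the sparse autoencoder cost involves (this is a minimal sketch, not the implementation in sparse_autoencoder.py; the function name and signature are hypothetical), the vectorized cost combines a squared-error reconstruction term, an L2 weight-decay term, and a KL-divergence sparsity penalty on the mean hidden activations:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparse_autoencoder_cost(W1, b1, W2, b2, data,
                            sparsity=0.05, beta=3.0, lam=1e-4):
    """Sketch of a vectorized sparse-autoencoder cost.
    data: (n_features, n_examples), values in [0, 1]."""
    m = data.shape[1]
    a2 = sigmoid(W1 @ data + b1[:, None])   # hidden-layer activations
    a3 = sigmoid(W2 @ a2 + b2[:, None])     # reconstruction
    rho_hat = a2.mean(axis=1)               # mean activation per hidden unit
    kl = np.sum(sparsity * np.log(sparsity / rho_hat)
                + (1 - sparsity) * np.log((1 - sparsity) / (1 - rho_hat)))
    return ((0.5 / m) * np.sum((a3 - data) ** 2)          # reconstruction
            + 0.5 * lam * (np.sum(W1**2) + np.sum(W2**2))  # weight decay
            + beta * kl)                                    # sparsity penalty
```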
Implement PCA, PCA whitening & ZCA whitening
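The whitening steps can be sketched as follows (a minimal version, assuming zero-mean-per-feature preprocessing; the function name is hypothetical): compute the covariance, take its SVD, rescale the rotated data by the regularized singular values for PCA whitening, then rotate back with U for ZCA whitening:

```python
import numpy as np

def pca_zca_whiten(X, eps=1e-5):
    """PCA and ZCA whitening. X: (n_features, n_examples)."""
    X = X - X.mean(axis=1, keepdims=True)      # zero-mean each feature
    sigma = X @ X.T / X.shape[1]               # covariance matrix
    U, S, _ = np.linalg.svd(sigma)             # eigenvectors / eigenvalues
    X_pca_white = np.diag(1.0 / np.sqrt(S + eps)) @ U.T @ X
    X_zca_white = U @ X_pca_white              # rotate back into input space
    return X_pca_white, X_zca_white
```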
Classify MNIST digits via softmax regression (multinomial logistic regression)
- ex_4_softmax_regression.py: Classify MNIST digits
- softmax.py: Softmax regression cost & gradient functions
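The core computation in softmax regression can be sketched like this (a simplified stand-in for softmax.py, not its actual code; the name and signature are hypothetical). Note the max-subtraction trick for numerical stability and the weight-decay term that makes the cost strictly convex:

```python
import numpy as np

def softmax_cost_grad(theta, X, y, lam=1e-4):
    """Softmax regression cost and gradient.
    theta: (n_classes, n_features); X: (n_features, m); y: (m,) int labels."""
    m = X.shape[1]
    scores = theta @ X
    scores -= scores.max(axis=0)            # prevent exp overflow
    probs = np.exp(scores)
    probs /= probs.sum(axis=0)              # class probabilities per column
    cost = (-np.log(probs[y, np.arange(m)]).mean()
            + 0.5 * lam * np.sum(theta ** 2))
    indicator = np.zeros_like(probs)        # 1{y == j} indicator matrix
    indicator[y, np.arange(m)] = 1.0
    grad = -(indicator - probs) @ X.T / m + lam * theta
    return cost, grad
```

The gradient can be verified numerically by finite differences, as in gradient.py.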
Classify MNIST digits via the self-taught learning paradigm: learn features with a sparse autoencoder using digits 5-9 as unlabelled examples, then train softmax regression on digits 0-4 as labelled examples
- (stl_exercise.py): Classify MNIST digits via self-taught learning
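The key step of self-taught learning is replacing the raw pixels with the trained autoencoder's hidden-layer activations before fitting the classifier. A minimal sketch (the function name is hypothetical, and W1/b1 stand for the first-layer parameters learned on the unlabelled digits):

```python
import numpy as np

def extract_features(W1, b1, data):
    """Map raw inputs to learned features: the hidden-layer (sigmoid)
    activations of a sparse autoencoder trained on unlabelled data.
    data: (n_features, n_examples) -> (n_hidden, n_examples)."""
    return 1.0 / (1.0 + np.exp(-(W1 @ data + b1[:, None])))
```

The returned feature matrix is then used in place of the raw images as input to softmax regression.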
Stacked sparse autoencoder for MNIST digit classification
- (stacked_autoencoder.py): Stacked autoencoder cost & gradient functions
- (stacked_ae_exercise.py): Classify MNIST digits
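At prediction time, a stacked autoencoder is just a chain of pretrained encoder layers with a softmax classifier on top. A minimal sketch (function and variable names are hypothetical, not those of stacked_autoencoder.py):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stacked_ae_predict(stack, softmax_theta, data):
    """Forward pass through a stacked autoencoder.
    stack: list of (W, b) pairs, each a pretrained encoder layer;
    softmax_theta: (n_classes, n_top_hidden) classifier weights;
    data: (n_features, n_examples). Returns predicted labels."""
    a = data
    for W, b in stack:                      # greedy layer-wise features
        a = sigmoid(W @ a + b[:, None])
    return (softmax_theta @ a).argmax(axis=0)
```

In the exercise this forward pass is followed by fine-tuning of the whole network via backpropagation.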
Learn features on 8x8 patches of 96x96 STL-10 color images via linear decoder (sparse autoencoder with linear activation function in output layer)
- (linear_decoder_exercise.py)
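The only change from the earlier sparse autoencoder is the output layer: a linear activation lets reconstructions take values outside (0, 1), which suits whitened color-image patches. A minimal cost sketch (hypothetical name and signature; the sparsity term is omitted here for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def linear_decoder_cost(W1, b1, W2, b2, data, lam=3e-3):
    """Autoencoder cost with a *linear* output layer.
    data: (n_features, n_examples), e.g. ZCA-whitened patches."""
    m = data.shape[1]
    a2 = sigmoid(W1 @ data + b1[:, None])   # sigmoid hidden layer
    a3 = W2 @ a2 + b2[:, None]              # linear output layer (no squashing)
    return ((0.5 / m) * np.sum((a3 - data) ** 2)
            + 0.5 * lam * (np.sum(W1**2) + np.sum(W2**2)))
```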
Classify 64x64 STL-10 images using features learned via the linear decoder (previous section) and convolutional neural networks
- (cnn.py): Convolutional neural networks. Convolution & pooling functions
- (cnn_exercise.py): Classify STL-10 images
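The convolve and pool steps can be sketched in pure NumPy (a simplified stand-in for cnn.py, not its actual code; names are hypothetical): slide each learned feature over the image with a "valid" window, then average non-overlapping regions of the resulting activation map:

```python
import numpy as np

def conv2_valid(image, kernel):
    """'Valid' 2-D cross-correlation of one feature over one image."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def mean_pool(act, pool_dim):
    """Non-overlapping mean pooling over pool_dim x pool_dim regions."""
    h, w = act.shape
    ph, pw = h // pool_dim, w // pool_dim
    return (act[:ph * pool_dim, :pw * pool_dim]
            .reshape(ph, pool_dim, pw, pool_dim)
            .mean(axis=(1, 3)))
```

In the exercise these maps are computed for every feature/image pair and the pooled activations feed a softmax classifier.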