
Stanford Unsupervised Feature Learning and Deep Learning Tutorial

Tutorial Website: http://ufldl.stanford.edu/wiki/index.php/UFLDL_Tutorial

Sparse Autoencoder

Vectorized sparse autoencoder implementation; learn and visualize features on MNIST data
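
A minimal NumPy sketch of the sparse autoencoder cost, assuming a single sigmoid hidden layer, squared-error reconstruction, weight decay, and a KL-divergence sparsity penalty on the mean hidden activations; the parameter names here are illustrative, not the repo's:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparse_autoencoder_cost(W1, b1, W2, b2, x, rho=0.01, beta=3.0, lam=1e-4):
    """Cost of a one-hidden-layer sparse autoencoder (illustrative sketch).

    x: (n_features, n_examples) data matrix.
    rho: target mean activation, beta: sparsity weight, lam: weight decay.
    """
    m = x.shape[1]
    a2 = sigmoid(W1 @ x + b1[:, None])        # hidden activations
    a3 = sigmoid(W2 @ a2 + b2[:, None])       # reconstruction of the input
    rho_hat = a2.mean(axis=1)                 # mean activation per hidden unit

    recon = 0.5 / m * np.sum((a3 - x) ** 2)                       # reconstruction error
    decay = 0.5 * lam * (np.sum(W1 ** 2) + np.sum(W2 ** 2))       # weight decay
    kl = np.sum(rho * np.log(rho / rho_hat)                       # sparsity penalty
                + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))
    return recon + decay + beta * kl
```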

Preprocessing: PCA & Whitening

Implement PCA, PCA whitening & ZCA whitening
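
A short NumPy sketch of PCA, PCA whitening, and ZCA whitening on zero-mean data, with the usual epsilon regularizer added to the eigenvalues (function and variable names are illustrative):

```python
import numpy as np

def pca_whiten(x, epsilon=1e-5):
    """x: (n_features, n_examples), assumed zero-mean per feature."""
    sigma = x @ x.T / x.shape[1]              # covariance matrix
    U, S, _ = np.linalg.svd(sigma)            # eigenvectors U, eigenvalues S
    x_rot = U.T @ x                           # PCA: rotate into the eigenbasis
    x_pca_white = x_rot / np.sqrt(S[:, None] + epsilon)   # rescale to unit variance
    x_zca_white = U @ x_pca_white             # ZCA: rotate back to input space
    return x_rot, x_pca_white, x_zca_white
```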

Softmax Regression

Classify MNIST digits via softmax regression (multinomial logistic regression)
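
A compact NumPy sketch of the softmax cost and gradient with weight decay (shapes and names are illustrative):

```python
import numpy as np

def softmax_cost(theta, x, y, num_classes, lam=1e-4):
    """theta: (num_classes, n_features), x: (n_features, m), y: (m,) int labels."""
    m = x.shape[1]
    scores = theta @ x
    scores -= scores.max(axis=0)              # stabilize before exponentiating
    probs = np.exp(scores)
    probs /= probs.sum(axis=0)                # class probabilities per example

    log_likelihood = np.log(probs[y, np.arange(m)])
    cost = -log_likelihood.mean() + 0.5 * lam * np.sum(theta ** 2)

    indicator = np.eye(num_classes)[:, y]     # one-hot labels, shape (K, m)
    grad = -(indicator - probs) @ x.T / m + lam * theta
    return cost, grad
```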

Self-Taught Learning and Unsupervised Feature Learning

Classify MNIST digits via the self-taught learning paradigm: learn features with a sparse autoencoder using digits 5-9 as unlabelled examples, then train softmax regression on digits 0-4 as labelled examples (a pipeline sketch follows the file list below)

  • (stl_exercise.py): Classify MNIST digits via self-taught learning
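
A minimal sketch of the self-taught learning flow, assuming the encoder weights (W1, b1) of an already-trained sparse autoencoder; the step list in the comments is illustrative, not the exercise script itself:

```python
import numpy as np

def feed_forward_features(W1, b1, x):
    """Map raw inputs to learned hidden activations (the self-taught step)."""
    return 1.0 / (1.0 + np.exp(-(W1 @ x + b1[:, None])))

# Illustrative flow (variable names hypothetical):
#   1. Train a sparse autoencoder on unlabelled digits 5-9 -> W1, b1.
#   2. train_feats = feed_forward_features(W1, b1, train_images_0_4)
#   3. Train softmax regression on (train_feats, train_labels_0_4).
#   4. test_feats = feed_forward_features(W1, b1, test_images_0_4)
#   5. Predict with the trained softmax model and measure accuracy.
```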

Building Deep Networks for Classification (Stacked Sparse Autoencoder)

Stacked sparse autoencoder for MNIST digit classification (a prediction-pass sketch follows the file list)

  • (stacked_autoencoder.py): Stacked autoencoder cost & gradient functions
  • (stacked_ae_exercise.py): Classify MNIST digits
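
A small NumPy sketch of the stacked-autoencoder prediction pass: each autoencoder's encoder weights act as one sigmoid layer, with a softmax classifier on top (the stack structure is illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stacked_ae_predict(stack, softmax_theta, x):
    """stack: list of (W, b) encoder weights, one per autoencoder layer.
    softmax_theta: (num_classes, top_layer_size). x: (n_features, m)."""
    a = x
    for W, b in stack:                        # feedforward through each encoder layer
        a = sigmoid(W @ a + b[:, None])
    scores = softmax_theta @ a                # linear softmax scores
    return np.argmax(scores, axis=0)          # predicted class per example
```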

Linear Decoders with Autoencoders

Learn features on 8x8 patches of 96x96 STL-10 color images via a linear decoder (a sparse autoencoder with a linear activation function in the output layer); see the sketch below

  • (linear_decoder_exercise.py)
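
The only structural change from the sparse autoencoder is the linear output layer, which also removes the activation-derivative factor from the output error term; a minimal sketch of that difference (names illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def linear_decoder_forward(W1, b1, W2, b2, x):
    """Sparse autoencoder with a linear (identity) output layer."""
    a2 = sigmoid(W1 @ x + b1[:, None])        # hidden layer stays sigmoid
    a3 = W2 @ a2 + b2[:, None]                # output layer is linear
    delta3 = a3 - x                           # no f'(z3) factor in the output error
    return a2, a3, delta3
```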

Working with Large Images (Convolutional Neural Networks)

Classify 64x64 STL-10 images using features learned via the linear decoder (previous section) and convolutional neural networks (a convolution & pooling sketch follows the file list)

  • (cnn.py): Convolutional neural network convolution & pooling functions
  • (cnn_exercise.py): Classify STL-10 images
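
A sketch of the two building blocks named above: convolving a learned feature over an image and mean-pooling the resulting feature map, using scipy.signal.convolve2d (feature shapes and names are illustrative):

```python
import numpy as np
from scipy.signal import convolve2d

def convolve_feature(image, W_feat, b_feat):
    """Convolve one learned feature (e.g. 8x8) over a 2-D image, then sigmoid.

    convolve2d flips its kernel, so pre-flip the feature to get
    cross-correlation, matching how the feature was learned on upright patches.
    """
    response = convolve2d(image, np.flipud(np.fliplr(W_feat)), mode='valid')
    return 1.0 / (1.0 + np.exp(-(response + b_feat)))

def mean_pool(feature_map, pool_dim):
    """Average non-overlapping pool_dim x pool_dim regions of a feature map."""
    rows, cols = feature_map.shape
    out = np.zeros((rows // pool_dim, cols // pool_dim))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = feature_map[i * pool_dim:(i + 1) * pool_dim,
                                j * pool_dim:(j + 1) * pool_dim]
            out[i, j] = patch.mean()
    return out
```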
