Jiyu-Kim/DeepNeuralNetworks


📌 Introduction

This repository is a collection of lab practices in which I implemented activation layers, machine learning models, and deep learning models during the Introduction to Deep Neural Networks course.

💡 What I learned

  • Machine Learning and Deep Neural Network Basics
  • Overfitting and Model Generalization
  • Evaluation Protocol and Metrics
  • Backpropagation Algorithm
  • Optimization
  • Convolutional Neural Networks (LeNet-5, AlexNet, VGGNet, GoogLeNet, ResNet)
  • Recurrent Neural Networks (LSTM, Seq2Seq, Attention Mechanism, BERT, Transformer)
  • Deep Generative Models
  • Generative Adversarial Networks
  • How to implement activation layers and ML algorithms with NumPy
  • How to implement models with PyTorch

💻 Contents

  • DNN_HW1
    • Implement Linear Regression with the NumPy library. (models/LinearRegression.py)
    • Implement Logistic Regression with the NumPy library. (models/LogisticRegression.py)
    • Tune the number of training epochs and the learning rate to minimize MSE on the 'Graduate' and 'Concrete' datasets.
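The core of such a NumPy linear-regression implementation can be sketched as plain gradient descent on the MSE loss. This is a minimal illustration with my own function name and default hyperparameters; the actual models/LinearRegression.py may be organized differently:

```python
import numpy as np

def fit_linear_regression(X, y, lr=0.01, epochs=1000):
    """Fit y ~ X @ w + b by gradient descent on the MSE loss."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        pred = X @ w + b
        err = pred - y
        # Gradients of MSE = mean((pred - y)^2) w.r.t. w and b
        w -= lr * (2.0 / n) * (X.T @ err)
        b -= lr * (2.0 / n) * err.sum()
    return w, b
```

The `lr` and `epochs` arguments are exactly the two knobs the assignment asks to tune.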

  • DNN_HW2
    • Implement activation layers (sigmoid, ReLU, tanh) with the NumPy library. (Answer.py)
    • Implement a fully-connected layer (FCLayer) with the NumPy library. (Answer.py)
    • Implement the softmax layer with the NumPy library. (Answer.py)
    • Report test accuracy on MNIST using three different activation functions with a given DNN architecture and parameters.
    • [Random Search on FashionMNIST] Optimize the model architecture (# of hidden layers, # of hidden nodes, # of epochs, learning rate, etc.) to achieve the best results on FashionMNIST.
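To illustrate what such layer implementations typically look like, here is a sketch of sigmoid/ReLU forward-backward passes and a numerically stable softmax in NumPy. Class and method names are my own and not necessarily those used in Answer.py:

```python
import numpy as np

class Sigmoid:
    def forward(self, x):
        self.out = 1.0 / (1.0 + np.exp(-x))
        return self.out

    def backward(self, dout):
        # d sigmoid / dx = sigmoid * (1 - sigmoid), cached from forward
        return dout * self.out * (1.0 - self.out)

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, dout):
        # Gradient passes through only where the input was positive
        return dout * self.mask

def softmax(x):
    # Subtract the row-wise max for numerical stability
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```

Caching the forward output (or input mask) is what makes the backward pass cheap, which is the pattern the backpropagation assignments exercise.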

  • DNN_HW3
    • Implement a Multi-Layer Perceptron classifier with the PyTorch framework. (model/MLP_classifier.py)
    • Implement a Multi-Layer Perceptron regressor with the PyTorch framework. (model/MLP_regressor.py)
    • [Random Search on House] Optimize the model architecture (# of hidden layers, # of hidden nodes, # of epochs, learning rate, etc.) to achieve the best results on the House dataset.
    • [Random Search on FashionMNIST] Optimize the model architecture (# of hidden layers, # of hidden nodes, # of epochs, learning rate, etc.) to achieve the best results on FashionMNIST.
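The random-search procedure used throughout these assignments can be sketched in plain Python: sample configurations at random and keep the one with the best validation score. The ranges below are illustrative placeholders, not the ones from the homework:

```python
import random

def sample_config(rng):
    """Draw one hyperparameter configuration at random (ranges are illustrative)."""
    return {
        "n_hidden_layers": rng.choice([1, 2, 3]),
        "hidden_size": rng.choice([64, 128, 256]),
        "epochs": rng.choice([10, 20, 50]),
        "lr": 10 ** rng.uniform(-4, -1),  # log-uniform learning rate
    }

def random_search(evaluate, n_trials=20, seed=0):
    """Keep the configuration whose validation score is highest."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample_config(rng)
        score = evaluate(cfg)  # e.g. validation accuracy after training
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

Sampling the learning rate log-uniformly is the usual choice, since useful values span several orders of magnitude.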

  • DNN_HW4
    • Implement AlexNet with the PyTorch framework. (models/ALexNet.py)
    • Implement ResNet with the PyTorch framework. (models/ResNet.py)
    • [Random Search on MNIST] Optimize the model architecture (# of hidden layers, # of hidden nodes, # of epochs, learning rate, etc.) to achieve the best results on MNIST.
    • [Random Search on FashionMNIST] Optimize the model architecture (# of hidden layers, # of hidden nodes, # of epochs, learning rate, etc.) to achieve the best results on FashionMNIST.
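The defining idea of ResNet is the residual (skip) connection, where a block learns a correction added to its input. A minimal PyTorch block in that style might look like this; it is a sketch of the idea, not the contents of models/ResNet.py:

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """A residual block: out = relu(F(x) + x), with F two conv-BN stages."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)  # skip connection keeps gradients flowing
```

Because the 3x3 convolutions use padding 1 and the channel count is unchanged, the identity shortcut can be added without any projection.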

  • project
    • Conduct semi-supervised learning for image classification.
    • Build a machine learning model for image classification where only a small portion of the data is labeled and the rest is unlabeled.
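One common semi-supervised strategy for this setting is pseudo-labeling: train on the labeled subset, then iteratively add unlabeled samples whose predictions are confident. The project's actual method is not described in this README, so the following is a generic, model-agnostic sketch:

```python
import numpy as np

def pseudo_label(model_fit, model_predict_proba, X_l, y_l, X_u,
                 threshold=0.95, rounds=3):
    """Generic pseudo-labeling loop (one common semi-supervised strategy).

    model_fit(X, y) -> model; model_predict_proba(model, X) -> class probabilities.
    """
    X_train, y_train = X_l.copy(), y_l.copy()
    for _ in range(rounds):
        model = model_fit(X_train, y_train)
        proba = model_predict_proba(model, X_u)
        conf = proba.max(axis=1)
        keep = conf >= threshold  # only adopt confident predictions
        if not keep.any():
            break
        X_train = np.vstack([X_train, X_u[keep]])
        y_train = np.concatenate([y_train, proba[keep].argmax(axis=1)])
        X_u = X_u[~keep]  # remaining unlabeled pool
    return model_fit(X_train, y_train)
```

The confidence threshold trades coverage against the risk of reinforcing wrong labels, which is the central difficulty of pseudo-labeling.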

🔧 Tech Stack

  • Language: Python
  • Framework: NumPy, PyTorch


