multilayer_perceptron_network

A basic implementation of multilayer perceptron networks, with the goal of implementing as many of the MLP concepts covered in my NN class as possible.

Structure

Each layer should be a class that inherits from a base class called Layer.

A layer can be of type: hidden or recurrent.

A hidden layer is a layer that is fully connected to the previous layer and the next layer.

Hidden layers can use one of the following activations: sigmoid, tanh, relu, or softmax.
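
A minimal sketch of how this hierarchy might look in Python with NumPy; the class names, the `forward` method, and the `ACTIVATIONS` table are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

class Layer:
    """Base class; every concrete layer implements forward()."""
    def forward(self, x):
        raise NotImplementedError

class HiddenLayer(Layer):
    """Fully connected layer with a selectable activation."""

    ACTIVATIONS = {
        "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
        "tanh": np.tanh,
        "relu": lambda z: np.maximum(0.0, z),
        # subtract the max before exponentiating for numerical stability
        "softmax": lambda z: np.exp(z - z.max()) / np.exp(z - z.max()).sum(),
    }

    def __init__(self, n_in, n_out, activation="relu"):
        self.W = np.random.randn(n_out, n_in) * 0.01  # weights from the previous layer
        self.b = np.zeros(n_out)
        self.activation = self.ACTIVATIONS[activation]

    def forward(self, x):
        # affine transform of the previous layer's output, then the nonlinearity
        return self.activation(self.W @ x + self.b)
```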

A recurrent layer is a layer that is fully connected to the previous layer and also feeds back into itself, receiving its own hidden state from the previous time step.

Recurrent layers can be of type: lstm, gru, or rnn.
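
Continuing the sketch above, a plain RNN cell illustrates the feedback connection: the output depends on the previous layer's activations and on the layer's own hidden state from the previous time step; LSTM and GRU variants would add gating on top of the same idea. As before, the names and initialization are assumptions for illustration:

```python
class RecurrentLayer(Layer):
    """Vanilla RNN cell: fully connected to the previous layer and to
    its own hidden state from the previous time step."""

    def __init__(self, n_in, n_hidden):
        self.W_x = np.random.randn(n_hidden, n_in) * 0.01      # previous layer -> hidden
        self.W_h = np.random.randn(n_hidden, n_hidden) * 0.01  # hidden -> hidden (feedback)
        self.b = np.zeros(n_hidden)
        self.h = np.zeros(n_hidden)  # hidden state carried across time steps

    def forward(self, x):
        # combine fresh input with the stored state, then update the state
        self.h = np.tanh(self.W_x @ x + self.W_h @ self.h + self.b)
        return self.h
```

A forward pass over a sequence would simply call `forward` once per time step, letting `self.h` carry information between steps.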
