A basic implementation of multilayer perceptron networks, with the intent of implementing as many of the MLP concepts covered in my NN class as possible.
Each layer should be a class that inherits from a base class called Layer.
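As a rough sketch, that base class might expose a common forward/backward interface; the method names here are assumptions for illustration, not something fixed by the spec:

```python
class Layer:
    """Base class that all layer types inherit from (hypothetical interface)."""

    def forward(self, x):
        # Compute this layer's output for input x.
        raise NotImplementedError

    def backward(self, grad_out):
        # Given the gradient of the loss w.r.t. this layer's output,
        # store parameter gradients and return the gradient w.r.t. the input.
        raise NotImplementedError
```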
A layer can be of type: hidden or recurrent.
A hidden layer is a layer that is fully connected to the previous layer and the next layer.
Hidden layers can be of type: sigmoid, tanh, relu, or softmax.
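For example, a sigmoid hidden layer could be sketched roughly like this, building on the `Layer` sketch above; the class name, weight initialization, and batch-first input shape are all illustrative assumptions:

```python
import numpy as np

class SigmoidLayer(Layer):
    """Fully connected layer with a sigmoid activation (illustrative sketch)."""

    def __init__(self, n_in, n_out):
        # Small random weights and zero biases.
        self.W = np.random.randn(n_in, n_out) * 0.01
        self.b = np.zeros(n_out)

    def forward(self, x):
        # x: array of shape (batch, n_in); cache it for the backward pass.
        self.x = x
        self.out = 1.0 / (1.0 + np.exp(-(x @ self.W + self.b)))
        return self.out

    def backward(self, grad_out):
        # d(sigmoid)/dz = out * (1 - out)
        dz = grad_out * self.out * (1.0 - self.out)
        self.dW = self.x.T @ dz        # gradient w.r.t. weights
        self.db = dz.sum(axis=0)       # gradient w.r.t. biases
        return dz @ self.W.T           # gradient w.r.t. the layer input
```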
A recurrent layer is a layer that is fully connected to the previous layer and to its own hidden state from the previous timestep.
Recurrent layers can be of type: lstm, gru, or rnn.
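A minimal sketch of the plain rnn variant is below, again assuming the `Layer` base class from above; lstm and gru follow the same feedback pattern but add gating on top of it, and the weight names here are my own illustrative choices:

```python
import numpy as np

class RNNLayer(Layer):
    """Vanilla (Elman) recurrent layer: each step sees the current input
    and the hidden state produced at the previous timestep (illustrative)."""

    def __init__(self, n_in, n_hidden):
        self.W_x = np.random.randn(n_in, n_hidden) * 0.01      # input -> hidden
        self.W_h = np.random.randn(n_hidden, n_hidden) * 0.01  # hidden -> hidden
        self.b = np.zeros(n_hidden)

    def forward(self, xs):
        # xs: array of shape (timesteps, n_in); returns all hidden states.
        h = np.zeros_like(self.b)
        states = []
        for x_t in xs:
            h = np.tanh(x_t @ self.W_x + h @ self.W_h + self.b)
            states.append(h)
        return np.stack(states)  # shape (timesteps, n_hidden)
```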