Built a neural network (NN) using only NumPy for efficient array manipulation.
The NN consists of an input layer (784 neurons), one hidden layer (100 neurons), and an output layer (10 neurons).
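A minimal sketch of how parameters for this 784-100-10 architecture could be initialized in NumPy; the variable names (W1, b1, W2, b2) and the 1/sqrt(fan-in) scaling are illustrative assumptions, not details taken from the repository.

```python
import numpy as np

# Layer sizes from the description above: 784 -> 100 -> 10.
INPUT_SIZE, HIDDEN_SIZE, OUTPUT_SIZE = 784, 100, 10

def init_params(seed=0):
    """Randomly initialize weights and biases (names are illustrative)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (HIDDEN_SIZE, INPUT_SIZE)) * np.sqrt(1.0 / INPUT_SIZE)
    b1 = np.zeros((HIDDEN_SIZE, 1))
    W2 = rng.normal(0.0, 1.0, (OUTPUT_SIZE, HIDDEN_SIZE)) * np.sqrt(1.0 / HIDDEN_SIZE)
    b2 = np.zeros((OUTPUT_SIZE, 1))
    return W1, b1, W2, b2
```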
Implemented tanh and softmax as the activation functions for the hidden and output layers, respectively.
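One possible NumPy implementation of these two activations; the tanh derivative helper is an assumption added here because the backward-pass sketch below needs it.

```python
import numpy as np

def tanh(z):
    # Hidden-layer activation: squashes values into (-1, 1).
    return np.tanh(z)

def tanh_derivative(a):
    # Derivative written in terms of the activation itself: 1 - tanh(z)^2.
    return 1.0 - a ** 2

def softmax(z):
    # Output-layer activation: subtracting the per-column max keeps exp() numerically stable.
    shifted = z - np.max(z, axis=0, keepdims=True)
    exp_z = np.exp(shifted)
    return exp_z / np.sum(exp_z, axis=0, keepdims=True)
```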
Used both forward and backward propagation to train the neural network.
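A sketch of what the two passes could look like for this architecture, reusing the helpers above; the column-per-example layout, the one-hot labels Y, and all function names are assumptions for illustration rather than the repository's exact code.

```python
import numpy as np

def forward(X, W1, b1, W2, b2):
    # X has shape (784, batch_size); each column is a flattened image.
    Z1 = W1 @ X + b1
    A1 = tanh(Z1)
    Z2 = W2 @ A1 + b2
    A2 = softmax(Z2)          # (10, batch_size): class probabilities
    return Z1, A1, Z2, A2

def backward(X, Y, W2, A1, A2):
    # Y is one-hot encoded with shape (10, batch_size).
    m = X.shape[1]
    dA2 = A2 - Y              # gradient of the (halved) squared loss w.r.t. A2
    # Chain through the softmax Jacobian to get the gradient w.r.t. Z2.
    dZ2 = A2 * (dA2 - np.sum(dA2 * A2, axis=0, keepdims=True))
    dW2 = dZ2 @ A1.T / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * tanh_derivative(A1)   # chain through tanh
    dW1 = dZ1 @ X.T / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2
```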
Calculated the squared loss between expected and predicted values.
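For reference, the batch-averaged squared loss could be computed as below; the 1/2 factor and the averaging over the batch are conventions assumed here, not confirmed by the repository.

```python
import numpy as np

def squared_loss(A2, Y):
    # Mean over the batch of 1/2 * sum of squared differences per example.
    m = Y.shape[1]
    return 0.5 * np.sum((A2 - Y) ** 2) / m
```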
Applied gradient descent to optimize the network parameters.
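A plain gradient-descent update consistent with that description; the learning-rate value is a placeholder, not one taken from the repository.

```python
def gradient_descent_step(params, grads, learning_rate=0.1):
    # Vanilla gradient descent: move each parameter against its gradient.
    W1, b1, W2, b2 = params
    dW1, db1, dW2, db2 = grads
    W1 = W1 - learning_rate * dW1
    b1 = b1 - learning_rate * db1
    W2 = W2 - learning_rate * dW2
    b2 = b2 - learning_rate * db2
    return W1, b1, W2, b2
```

Repeating the forward pass, backward pass, and this update over many iterations is what trains the network to the reported accuracy.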
Accuracy reaches 97.5%.