Merge pull request mbadry1#166 from vernikagupta/master
Added Perceptron
mbadry1 authored Jul 8, 2019
2 parents c6f2c68 + e1d6e26 commit 3a19e14
Showing 1 changed file with 10 additions and 1 deletion.
11 changes: 10 additions & 1 deletion 1- Neural Networks and Deep Learning/Readme.md
Expand Up @@ -69,7 +69,16 @@ Here are the course summary as its given on the course [link](https://www.course
### What is a (Neural Network) NN?

- Single neuron == linear regression
- Single neuron == linear regression without applying an activation (perceptron)
- Basically, a single neuron computes the weighted sum of its inputs (W.T*X), and in a perceptron we set a threshold on that sum to predict the output: if the weighted sum crosses the threshold the perceptron fires, and if not it doesn't.
- A perceptron can take real-valued or boolean inputs.
- Note that when w⋅x + b = 0 the perceptron outputs 0.
- A disadvantage of the perceptron is that it only outputs binary values, and a small change in a weight or bias can flip the output completely. We need a system whose output changes only slightly for a small change in weight and bias. This is where the sigmoid function comes into the picture.
- If we replace the perceptron's hard threshold with a sigmoid function, then a small change in the parameters makes only a slight change in the output.
- e.g. a perceptron outputs 0; you slightly change a weight and bias and the output jumps to 1, even though the desired output is 0.7. With a sigmoid, the same slight change in weight and bias moves the output smoothly toward 0.7.
- If we apply a sigmoid activation function, then a single neuron acts as logistic regression.
- We can understand the difference between the perceptron and the sigmoid function by looking at the sigmoid function's graph.
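- The contrast above can be sketched in a few lines of Python. This is a minimal illustration (not from the course): `perceptron` and `sigmoid_neuron` are hypothetical helper names, and the example weights/inputs are made up to show the hard jump vs. the smooth change.

  ```python
  import numpy as np

  def perceptron(w, x, b):
      # Hard threshold: fires (outputs 1) only when the weighted sum crosses 0.
      return 1 if np.dot(w, x) + b > 0 else 0

  def sigmoid_neuron(w, x, b):
      # Smooth output in (0, 1): a small change in w or b moves it only slightly.
      z = np.dot(w, x) + b
      return 1.0 / (1.0 + np.exp(-z))

  w = np.array([0.5, -0.5])
  x = np.array([1.0, 1.0])   # weighted sum w.T @ x is exactly 0 here

  # A tiny change in the bias flips the perceptron's output completely...
  print(perceptron(w, x, b=0.1))    # fires: 1
  print(perceptron(w, x, b=-0.1))   # doesn't fire: 0

  # ...while the sigmoid neuron's output barely moves.
  print(round(sigmoid_neuron(w, x, b=0.1), 3))    # ~0.525
  print(round(sigmoid_neuron(w, x, b=-0.1), 3))   # ~0.475
  ```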

- Simple NN graph:
- ![](Images/Others/01.jpg)
- Image taken from [tutorialspoint.com](http://www.tutorialspoint.com/)
