difference between perceptron and sigmoid
vernikagupta authored Jul 5, 2019
1 parent 4ffa960 commit e1d6e26
Showing 1 changed file with 6 additions and 1 deletion.
1- Neural Networks and Deep Learning/Readme.md (7 changes: 6 additions & 1 deletion)
@@ -70,9 +70,14 @@ Here is the course summary as it is given on the course [link](https://www.course
### What is a (Neural Network) NN?

- Single neuron == linear regression without applying an activation function (perceptron).
- Basically, a single neuron computes a weighted sum of its inputs (`W.T*X`), and in a perceptron we set a threshold on that sum to predict the output: if the weighted sum crosses the threshold, the perceptron fires (outputs 1); if not, it outputs 0.
- A perceptron can take real-valued or boolean inputs.
- Actually, when w⋅x + b = 0, the perceptron outputs 0.
- The disadvantage of the perceptron is that it only outputs binary values, and a small change in the weights and bias can flip the output entirely. We need a system that changes its output only slightly in response to a small change in the weights and bias. This is where the sigmoid function comes into the picture.
- If we replace the perceptron's hard threshold with a sigmoid function, a small change in the weights and bias produces only a small change in the output.
- e.g., with a perceptron, output = 0; you slightly change the weights and bias and the output flips to 1, even when the right answer is something in between like 0.7. With a sigmoid, the same slight change in weights and bias moves the output gradually, e.g. toward 0.7.
- If we apply a sigmoid activation function, then a single neuron acts as logistic regression.
- We can understand the difference between the perceptron and the sigmoid function by looking at the graph of the sigmoid function: a smooth S-curve instead of a hard step (see the sketch below).
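
A minimal sketch of this contrast, using made-up weights, inputs, and bias chosen so the weighted sum sits just at the threshold (these values are illustrative, not from the course):

```python
import numpy as np

def perceptron(w, x, b):
    # Hard threshold: fires (1) only when the weighted sum crosses 0.
    return 1 if np.dot(w, x) + b > 0 else 0

def sigmoid_neuron(w, x, b):
    # Smooth activation: output varies continuously with w, x, and b.
    return 1 / (1 + np.exp(-(np.dot(w, x) + b)))

x = np.array([1.0, 2.0])
w = np.array([0.4, -0.3])   # w.x = -0.2
b = 0.19                    # w.x + b = -0.01, just below the threshold

print(perceptron(w, x, b))             # 0
print(perceptron(w, x, b + 0.02))      # 1, a tiny nudge flips the output
print(sigmoid_neuron(w, x, b))         # ~0.4975
print(sigmoid_neuron(w, x, b + 0.02))  # ~0.5025, same nudge, tiny change
```

The same 0.02 nudge to the bias flips the perceptron from 0 to 1, but moves the sigmoid neuron's output by only about 0.005.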

- Simple NN graph:
  - ![](Images/Others/01.jpg)