Commit 1a0ce88

Author: Mofan Zhou
Committed: Jul 30, 2016

    update theano TUT

1 parent c589787

File tree

1 file changed: +2, -2 lines changed

theanoTUT/theano7_activation_function.py

+2 -2

@@ -11,8 +11,8 @@
 
 The activation functions include but not limited to softplus, sigmoid, relu, softmax, elu, tanh...
 
-For the hidden layer, we could use relu, tanh...
-For classification problems, we could use softplus or softmax for the output layer.
+For the hidden layer, we could use relu, tanh, softplus...
+For classification problems, we could use sigmoid or softmax for the output layer.
 
 For regression problems, we could use a linear function for the output layer.
 
 """
