Update the ReLU Link (tensorflow#498)
Dan Hurt found that the ReLU link was not pointing to the correct place.
ematejska authored Jul 6, 2020
1 parent d211f96 · commit 910c442
Showing 1 changed file with 1 addition and 1 deletion.
docs/site/tutorials/model_training_walkthrough.ipynb: 1 addition & 1 deletion
@@ -541,7 +541,7 @@
    "id": "fK0vrIRv_tcc"
   },
   "source": [
-   "The activation function determines the output shape of each node in the layer. These non-linearities are important—without them the model would be equivalent to a single layer. There are many available activations, but [ReLU](https://www.tensorflow.org/swift/api_docs/Functions#/s:10TensorFlow4reluyAA0A0VyxGAESFRzAA0aB6ScalarRzlF) is common for hidden layers.\n",
+   "The activation function determines the output shape of each node in the layer. These non-linearities are important—without them the model would be equivalent to a single layer. There are many available activations, but [ReLU](https://www.tensorflow.org/swift/api_docs/Functions#relu_:) is common for hidden layers.\n",
    "\n",
    "The ideal number of hidden layers and neurons depends on the problem and the dataset. Like many aspects of machine learning, picking the best shape of the neural network requires a mixture of knowledge and experimentation. As a rule of thumb, increasing the number of hidden layers and neurons typically creates a more powerful model, which requires more data to train effectively."
   ]
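For context, the notebook cell this diff touches introduces the hidden layers of the walkthrough's model and the relu activation the corrected link documents. Below is a minimal Swift for TensorFlow sketch of that kind of model, using relu for the hidden layers; the struct name and layer sizes here are illustrative and not taken from this commit.

    import TensorFlow

    // A model in the style the walkthrough describes: two relu hidden layers
    // followed by a linear output layer. Sizes are illustrative only.
    struct IrisModel: Layer {
        var layer1 = Dense<Float>(inputSize: 4, outputSize: 10, activation: relu)
        var layer2 = Dense<Float>(inputSize: 10, outputSize: 10, activation: relu)
        var layer3 = Dense<Float>(inputSize: 10, outputSize: 3)

        @differentiable
        func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
            // Apply the layers in sequence; relu keeps the hidden layers non-linear.
            return input.sequenced(through: layer1, layer2, layer3)
        }
    }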
