Commit

Content update
Content update at line 607
dotslash21 authored Aug 2, 2019
1 parent 0e1bab5 commit e02a3cb
Showing 1 changed file with 1 addition and 1 deletion.
2- Improving Deep Neural Networks/Readme.md
@@ -604,7 +604,7 @@ Implications of L2-regularization on:
6. Learning rate decay.
7. Regularization lambda.
8. Activation functions.
-9. Adam `beta1` & `beta2`.
+9. Adam `beta1`, `beta2` & `epsilon`.
- It's hard to decide which hyperparameter is the most important; it depends a lot on your problem.
- One way to tune is to sample a grid of `N` hyperparameter settings and then try all the combinations on your problem.
- Try random values: don't use a grid.
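The "random values, not a grid" advice above can be sketched as a minimal random-search loop. The search space below (parameter names, ranges, and the number of trials) is an illustrative assumption, not something specified in these notes:

```python
import random

# Hypothetical search space: each hyperparameter gets its own sampling rule.
# The learning rate is drawn on a log scale (10^-4 .. 10^0), since a linear
# draw would spend most trials on the largest values.
def sample_hyperparameters():
    return {
        "learning_rate": 10 ** random.uniform(-4, 0),
        "beta1": random.uniform(0.85, 0.95),
        "hidden_units": random.choice([32, 64, 128, 256]),
    }

# Random search: N independent draws instead of an N-point grid, so every
# trial tests a fresh value of every hyperparameter.
random.seed(0)
trials = [sample_hyperparameters() for _ in range(5)]
for t in trials:
    print(t)
```

With a 5-point grid over one axis, only 5 distinct learning rates ever get tried; with 5 random draws, every trial also explores new values of `beta1` and `hidden_units` at the same time.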
