Merge pull request mbadry1#159 from WillyChen123/master
A correction in avoidable bias
mbadry1 authored Feb 4, 2019
2 parents 6337cd4 + 336c6e9 commit ec89910
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions 3- Structuring Machine Learning Projects/Readme.md
@@ -187,7 +187,7 @@ Here are the course summary as its given on the course [link](https://www.course
- In the left example, because the human level error is 1% then we have to focus on the **bias**.
- In the right example, because the human level error is 7.5% then we have to focus on the **variance**.
- The human-level error as a proxy (estimate) for Bayes optimal error. Bayes optimal error is always less (better), but human-level in most cases is not far from it.
- - You can't do better then Bayes error unless you are overfitting.
+ - You can't do better than Bayes error unless you are overfitting.
- `Avoidable bias = Training error - Human (Bayes) error`
- `Variance = Dev error - Training error`
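The two formulas above can be sketched in Python. This is a minimal illustration, not part of the commit; the error percentages are assumed example numbers in the spirit of the notes' "focus on bias" case (human-level 1%):

```python
def avoidable_bias(training_error, human_error):
    # Gap between training error and the human-level (Bayes) proxy, in percent.
    return training_error - human_error

def variance(dev_error, training_error):
    # Gap between dev-set error and training error, in percent.
    return dev_error - training_error

# Assumed illustrative numbers: human 1%, training 8%, dev 10%.
print(avoidable_bias(8.0, 1.0))  # 7.0 -> large avoidable bias, focus there
print(variance(10.0, 8.0))       # 2.0 -> variance is the smaller problem
```

Whichever gap is larger tells you which problem to attack first.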

@@ -230,7 +230,7 @@ Here are the course summary as its given on the course [link](https://www.course
- Find better NN architecture/hyperparameters search.
4. If **variance** is large you have these options:
- Get more training data.
- - Regularization (L2, Dropout, data augumentation).
+ - Regularization (L2, Dropout, data augmentation).
- Find better NN architecture/hyperparameters search.


@@ -252,7 +252,7 @@ Here are the course summary as its given on the course [link](https://www.course

| Image | Dog | Great Cats | blurry | Instagram filters | Comments |
| ----- | --- | ---------- | ------ | ----------------- | -------- |
- | 1 | ✓ | | | ✓ | Pitbul |
+ | 1 | ✓ | | | ✓ | Pitbull |
| 2 | ✓ | | ✓ | ✓ | |
| 3 | | | | | Rainy day at zoo |
| 4 | | ✓ | | | |
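An error-analysis table like this is essentially a tally sheet. The Python sketch below (hypothetical data mirroring the table's categories, not from the commit) counts how often each category appears among mislabeled dev examples, which is how you decide what to prioritize:

```python
from collections import Counter

# Hypothetical audit of mislabeled dev-set images, one dict per table row.
errors = [
    {"image": 1, "tags": ["dog", "instagram filters"], "comment": "Pitbull"},
    {"image": 2, "tags": ["dog", "blurry", "instagram filters"], "comment": ""},
    {"image": 3, "tags": [], "comment": "Rainy day at zoo"},
    {"image": 4, "tags": ["great cats"], "comment": ""},
]

# Tally each category across all flagged examples.
counts = Counter(tag for row in errors for tag in row["tags"])
for tag, n in counts.most_common():
    print(f"{tag}: {n}/{len(errors)} ({100 * n / len(errors):.0f}%)")
```

The categories with the highest counts bound how much your overall error could improve by fixing them, so they are the best candidates to work on.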
@@ -283,7 +283,7 @@ Here are the course summary as its given on the course [link](https://www.course
- Apply the same process to your dev and test sets to make sure they continue to come from the same distribution.
- Consider examining examples your algorithm got right as well as ones it got wrong. (Not always done if you reached a good accuracy)
- Train and (dev/test) data may now come from a slightly different distributions.
- - It's very important to have dev and test sets to come from the same distribution. But it could be OK for a train set to come from slighly other distribution.
+ - It's very important to have dev and test sets to come from the same distribution. But it could be OK for a train set to come from slightly other distribution.

### Build your first system quickly, then iterate

