mbadry1 committed Feb 3, 2018
1 parent c9f5e7d commit 65d3266
Showing 2 changed files with 12 additions and 1 deletion.
Binary file added 5- Sequence Models/Images/19.png
13 changes: 12 additions & 1 deletion 5- Sequence Models/Readme.md
@@ -267,7 +267,18 @@ Here are the course summary as its given on the course [link](https://www.course
| ... | 0 | newVal |
| was | 1 (I don't need it anymore) | newerVal |
| full | .. | .. |
- Drawing of the GRU:
- ![](Images/19.png)
- Drawings like the ones in http://colah.github.io/posts/2015-08-Understanding-LSTMs/ are popular and make it easier to understand GRUs and LSTMs, but Andrew Ng finds it better to look at the equations.
- Because the update gate U is usually a very small number like 0.00001, GRUs don't suffer from the vanishing gradient problem.
- In the equation, U ≈ 0 makes C<sup><t></sup> = C<sup><t-1></sup> in a lot of cases, so the memory cell is carried forward almost unchanged across many time steps.
- Shapes:
- a<sup><t></sup> shape is (NoOfHiddenNeurons, 1); the concatenated input [a<sup><t-1></sup>, x<sup><t></sup>] has shape (NoOfHiddenNeurons + n<sub>x</sub>, 1)
- C<sup><t></sup> is the same as a<sup><t></sup>
- C<sup>~<t></sup> is the same as a<sup><t></sup>
- U<sup><t></sup> is also the same dimensions of a<sup><t></sup>
- The multiplications in the equations are element-wise.
- What has been described so far is the simplified GRU unit. Let's now describe the full one.
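- The simplified GRU step above (candidate memory C~<sup><t></sup>, update gate U, and the element-wise update) can be sketched in NumPy. This is a minimal illustration, not the course's code; the weight and bias names (`Wc`, `bc`, `Wu`, `bu`) are assumed for the example:

  ```python
  import numpy as np

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  def gru_step_simplified(c_prev, x_t, Wc, bc, Wu, bu):
      """One step of the simplified GRU.

      c_prev: (n_h, 1) previous memory cell (equal to a<t-1>)
      x_t:    (n_x, 1) current input
      Wc, Wu: (n_h, n_h + n_x) weight matrices
      bc, bu: (n_h, 1) biases
      """
      concat = np.vstack([c_prev, x_t])      # shape (n_h + n_x, 1)
      c_tilde = np.tanh(Wc @ concat + bc)    # candidate memory C~<t>
      gate_u = sigmoid(Wu @ concat + bu)     # update gate U, in (0, 1)
      # Element-wise update: when U ~ 0, c_t ~ c_prev (memory is kept)
      c_t = gate_u * c_tilde + (1 - gate_u) * c_prev
      return c_t
  ```

  Note how a strongly negative gate pre-activation drives U toward 0, so the old memory cell passes through nearly unchanged, which is exactly why gradients don't vanish over long sequences.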

### Long Short Term Memory (LSTM)

Expand Down
