Typo fixes (keras-team#12030)
* Typo fixes
eyalzk authored and gabrieldemarmiesse committed Jan 12, 2019
1 parent fd537c7 commit 0418141
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion examples/lstm_seq2seq.py
@@ -17,7 +17,7 @@
 - A decoder LSTM is trained to turn the target sequences into
     the same sequence but offset by one timestep in the future,
     a training process called "teacher forcing" in this context.
-    Is uses as initial state the state vectors from the encoder.
+    It uses as initial state the state vectors from the encoder.
     Effectively, the decoder learns to generate `targets[t+1...]`
     given `targets[...t]`, conditioned on the input sequence.
 - In inference mode, when we want to decode unknown input sequences, we:
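For context, the hunk above sits in the docstring of the seq2seq example. A minimal sketch of the teacher-forcing setup that docstring describes, using the Keras functional API, follows; the token counts and latent dimension are illustrative assumptions, not values quoted from the script:

    from keras.models import Model
    from keras.layers import Input, LSTM, Dense

    num_encoder_tokens = 71   # assumed vocabulary sizes, for illustration only
    num_decoder_tokens = 93
    latent_dim = 256

    # Encoder: keep only the final state vectors (h, c), discard the outputs.
    encoder_inputs = Input(shape=(None, num_encoder_tokens))
    _, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)
    encoder_states = [state_h, state_c]

    # Decoder: initialised with the encoder states and fed the target
    # sequence; trained to emit the same sequence offset by one timestep
    # ("teacher forcing").
    decoder_inputs = Input(shape=(None, num_decoder_tokens))
    decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
    decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                         initial_state=encoder_states)
    decoder_outputs = Dense(num_decoder_tokens,
                            activation='softmax')(decoder_outputs)

    model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
    model.compile(optimizer='rmsprop', loss='categorical_crossentropy')

At inference time the same trained layers are rewired into separate encoder and decoder models to decode unknown input sequences, as the rest of the docstring goes on to describe.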
2 changes: 1 addition & 1 deletion keras/callbacks.py
@@ -633,7 +633,7 @@ class ModelCheckpoint(Callback):
     """Save the model after every epoch.

     `filepath` can contain named formatting options,
-    which will be filled the value of `epoch` and
+    which will be filled with the values of `epoch` and
     keys in `logs` (passed in `on_epoch_end`).

     For example: if `filepath` is `weights.{epoch:02d}-{val_loss:.2f}.hdf5`,
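The hunk above touches the `ModelCheckpoint` docstring. A short runnable sketch of the formatting behaviour it describes; the toy model and data below are hypothetical, purely for illustration:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.callbacks import ModelCheckpoint

    # Toy model and data, purely for illustration.
    model = Sequential([Dense(1, input_shape=(4,))])
    model.compile(optimizer='sgd', loss='mse')
    x = np.random.random((100, 4))
    y = np.random.random((100, 1))

    # `epoch` and the keys in `logs` (here `val_loss`, produced by the
    # validation split) fill the named fields at the end of each epoch,
    # yielding files such as weights.03-0.42.hdf5.
    checkpoint = ModelCheckpoint('weights.{epoch:02d}-{val_loss:.2f}.hdf5')
    model.fit(x, y, validation_split=0.2, epochs=3, callbacks=[checkpoint])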
