
Commit: typo
Co-Authored-By: Mike J Innes <[email protected]>
DhairyaLGandhi and MikeInnes authored Oct 10, 2019
1 parent 623ee2c commit a558784
Showing 1 changed file with 1 addition and 1 deletion.
docs/src/training/optimisers.md: 1 addition & 1 deletion

````diff
@@ -94,7 +94,7 @@ Flux internally calls on this function via the `update!` function. It shares the
 
 ## Composing Optimisers
 
-Flux defines a special kind of optimiser called simply as `Optimiser` which takes in a arbitrary optimisers as input. Its behaviour is similar to the usual optimisers, but differs in that it acts by calling the optimsers listed in it sequentially. Each optimiser produces a modified gradient
+Flux defines a special kind of optimiser called simply as `Optimiser` which takes in a arbitrary optimisers as input. Its behaviour is similar to the usual optimisers, but differs in that it acts by calling the optimisers listed in it sequentially. Each optimiser produces a modified gradient
 that will be fed into the next, and the resultant update will be applied to the parameter as usual. A classic use case is where adding decays is desirable. Flux defines some basic decays including `ExpDecay`, `InvDecay` etc.
 
 ```julia
````
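The Julia block that follows this paragraph in the file is truncated in the diff view above. As a minimal sketch of the composition pattern the paragraph describes, assuming the `Optimiser`, `ExpDecay`, and `ADAM` types from Flux's `Flux.Optimise` module of that era (the specific optimisers and hyperparameters are illustrative, not the file's actual example):

```julia
using Flux
using Flux.Optimise: Optimiser, ExpDecay, ADAM

# Chain two optimisers: each one transforms the gradient in turn.
# ExpDecay scales it by an exponentially decaying factor, then ADAM
# turns the result into the actual parameter update.
opt = Optimiser(ExpDecay(), ADAM(0.001))
```

The composed `opt` can then be passed to `Flux.train!` or `update!` in place of a single optimiser.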
