Commit

Update transformer.md (d2l-ai#2252)
azimjonn authored Aug 17, 2022
1 parent 3f82bce commit 7bfdd40
Showing 1 changed file with 1 addition and 1 deletion.
@@ -13,7 +13,7 @@ Notably,
self-attention
enjoys both parallel computation and
the shortest maximum path length.
- Therefore natually,
+ Therefore naturally,
it is appealing to design deep architectures
by using self-attention.
Unlike earlier self-attention models
