Sequence-to-sequence models with an attention mechanism are deep learning models that have achieved strong results in tasks such as machine translation, text summarization, and image captioning. Google Translate also uses the same attention mechanism for translation.
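As a rough sketch of the idea (not code from this repository), a single decoder step of scaled dot-product attention can be written in NumPy. The function names and tensor shapes below are illustrative assumptions: the decoder query attends over the encoder's hidden states, and the softmax weights form the attention distribution used to build a context vector.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(query, keys, values):
    """Scaled dot-product attention for one decoder step (illustrative sketch).

    query:  (d,)    current decoder hidden state
    keys:   (T, d)  encoder hidden states
    values: (T, d)  encoder hidden states (often identical to keys)
    """
    # Similarity of the query to each source position, scaled by sqrt(d).
    scores = keys @ query / np.sqrt(query.shape[0])   # shape (T,)
    # Attention distribution over the T source positions (sums to 1).
    weights = softmax(scores)                         # shape (T,)
    # Context vector: weighted sum of encoder states.
    context = weights @ values                        # shape (d,)
    return context, weights
```

In a full seq2seq model the context vector is concatenated with the decoder state to predict the next output token; variants such as Bahdanau (additive) attention replace the dot product with a small learned feed-forward scorer.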
gyanpra/attention-mechanism