gionanide/Neural_Machine_Translation

Neural Machine Translation using LSTMs and an attention mechanism. Two models were implemented: one without attention, which bridges the encoder and decoder with a RepeatVector layer, and one that uses an encoder-decoder architecture with an attention mechanism. Minimal sketches of both approaches are given below.
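The first approach can be summarised with a minimal Keras sketch along the following lines. This is an illustrative reconstruction, not the repository's actual code: the vocabulary sizes, sequence lengths and layer widths are placeholder values, and tokenisation/padding of the parallel corpus is assumed to happen elsewhere.

```python
# Approach 1: LSTM encoder-decoder WITHOUT attention, using RepeatVector.
# Illustrative sketch only; all sizes below are assumed placeholders.
from tensorflow.keras import layers, models

SRC_VOCAB = 5000   # assumed source vocabulary size
TGT_VOCAB = 5000   # assumed target vocabulary size
SRC_LEN = 20       # assumed padded source sentence length
TGT_LEN = 20       # assumed padded target sentence length
EMB, HID = 128, 256

src = layers.Input(shape=(SRC_LEN,))
x = layers.Embedding(SRC_VOCAB, EMB)(src)
# The encoder LSTM compresses the whole source sentence into one fixed vector.
context = layers.LSTM(HID)(x)
# RepeatVector copies that single context vector once per target timestep,
# which is how the decoder receives its input when there is no attention.
x = layers.RepeatVector(TGT_LEN)(context)
x = layers.LSTM(HID, return_sequences=True)(x)
out = layers.TimeDistributed(layers.Dense(TGT_VOCAB, activation="softmax"))(x)

model = models.Model(src, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Because the only link between encoder and decoder is that single repeated context vector, this variant tends to struggle on longer sentences, which is the usual motivation for adding attention.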
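The second approach, an encoder-decoder with attention, could look roughly like the sketch below. Again this is a hedged reconstruction: it uses the built-in Keras Attention layer (dot-product attention) with teacher forcing on the decoder side, which may differ in detail from the attention variant implemented in the repository, and all names and sizes are placeholders.

```python
# Approach 2: encoder-decoder with attention (trained with teacher forcing).
# Illustrative sketch only; layer choices and sizes are assumed, not taken
# from the repository.
from tensorflow.keras import layers, models

SRC_VOCAB = 5000
TGT_VOCAB = 5000
SRC_LEN = 20
TGT_LEN = 20
EMB, HID = 128, 256

# Encoder: keep every timestep's hidden state so the decoder can attend to them.
enc_in = layers.Input(shape=(SRC_LEN,), name="source_tokens")
enc_emb = layers.Embedding(SRC_VOCAB, EMB)(enc_in)
enc_seq, state_h, state_c = layers.LSTM(HID, return_sequences=True,
                                        return_state=True)(enc_emb)

# Decoder: consumes the target sequence shifted right (teacher forcing) and
# starts from the encoder's final states.
dec_in = layers.Input(shape=(TGT_LEN,), name="target_tokens_shifted")
dec_emb = layers.Embedding(TGT_VOCAB, EMB)(dec_in)
dec_seq = layers.LSTM(HID, return_sequences=True)(dec_emb,
                                                  initial_state=[state_h, state_c])

# Attention: each decoder state queries all encoder states (dot-product scores),
# yielding one context vector per target timestep instead of a single
# fixed vector for the whole sentence.
context = layers.Attention()([dec_seq, enc_seq])
merged = layers.Concatenate()([dec_seq, context])
out = layers.TimeDistributed(layers.Dense(TGT_VOCAB, activation="softmax"))(merged)

model = models.Model([enc_in, dec_in], out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Note that this graph is for training only: at inference time the shifted target sequence is not available, so translation requires a step-by-step decoding loop (greedy or beam search) that feeds each predicted token back into the decoder.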