# Neural_Machine_Translation

Neural Machine Translation using LSTMs and an attention mechanism. Two models were implemented: one without attention, using a repeat-vector layer to feed the encoder's final state to every decoder step, and one using an encoder-decoder architecture with an attention mechanism.
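The attention step of the second model can be sketched roughly as follows. This is a minimal NumPy illustration of additive (Bahdanau-style) attention, not the repository's actual code: the weight matrices `Wa`, `Ua` and vector `va`, the dimensions, and the function name are all assumptions made for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(encoder_states, decoder_state, Wa, Ua, va):
    """Compute a context vector from encoder states and one decoder state.

    Scores follow the additive form: e_t = va . tanh(Wa s + Ua h_t),
    where s is the decoder state and h_t the t-th encoder state.
    (Illustrative only; parameter names are hypothetical.)
    """
    scores = np.array(
        [va @ np.tanh(Wa @ decoder_state + Ua @ h) for h in encoder_states]
    )
    weights = softmax(scores)                         # attention distribution over time steps
    context = (weights[:, None] * encoder_states).sum(axis=0)  # weighted sum of encoder states
    return context, weights

# Toy example with random states and parameters
rng = np.random.default_rng(0)
T, d = 5, 8                                   # 5 encoder time steps, hidden size 8
encoder_states = rng.standard_normal((T, d))
decoder_state = rng.standard_normal(d)
Wa = rng.standard_normal((d, d))
Ua = rng.standard_normal((d, d))
va = rng.standard_normal(d)

context, weights = additive_attention(encoder_states, decoder_state, Wa, Ua, va)
```

In a full decoder, `context` would be concatenated with the decoder input (or state) at each step before predicting the next target token, and the `weights` show which source positions the model attends to.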