Machine Translation (with Attention mechanism)

In this assignment I explore different strategies for building and training a language translator. I use Seq2Seq learning to convert sequences from English to Hebrew, and I also include techniques such as bi-directional learning and the attention mechanism, which serve as building blocks of advanced transformer-based NLP models such as GPT and Llama.

Language Choice and Details

I initially wanted to build a model to translate English to Tamil, my native language, so that I could check the translations easily. However, because only 207 sentence pairs were available for Tamil on https://www.manythings.org/anki/ (the website used in my class), I picked Hebrew instead.

Though Hebrew has been a language of fascination for me for a while now, there are a few other important reasons I picked it for learning machine translation, chief among them its right-to-left script and its lexical distance from English (see the map below).

Lexical distance map of European languages (image). Image source: https://alternativetransport.wordpress.com/2015/05/05/34/

Strategies Used:

  • Used the Hebrew Tokenizer (github.com/YontiLevin/Hebrew-Tokenizer) to parse and clean Hebrew text, since the canonical normalization used for English does not work for Hebrew (see the tokenization sketch after this list)
  • Converted text to integer sequences, reversing the sequence list for Hebrew
  • Models: Seq2Seq, bi-directional Seq2Seq (Bi-LSTM), and Seq2Seq with an attention layer (see the model sketch after this list)
  • Tuning: latent dimensions, training epochs, dropout, activation functions
  • Sampling: greedy and multinomial with different temperatures (see the sampling sketch after this list)
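A minimal sketch of the tokenization and sequence-conversion step, assuming the hebrew_tokenizer package from the repository linked above (its ht.tokenize generator yields (group, token, index, span) tuples) together with Keras preprocessing utilities. Variable names such as heb_sentences are illustrative and not taken from the actual notebook.

```python
# Sketch: tokenize Hebrew text and convert it to (reversed) padded integer sequences.
# Assumes: pip install hebrew_tokenizer tensorflow
import hebrew_tokenizer as ht
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

def clean_hebrew(sentence):
    """Keep only Hebrew tokens; ht.tokenize yields (group, token, token_num, (start, end))."""
    return " ".join(tok for grp, tok, _, _ in ht.tokenize(sentence) if grp == "HEBREW")

# heb_sentences stands in for the raw Hebrew target sentences from the dataset
heb_sentences = ["שלום עולם", "אני לומד תרגום מכונה"]
cleaned = [clean_hebrew(s) for s in heb_sentences]

tokenizer = Tokenizer(filters="")               # text is already cleaned above
tokenizer.fit_on_texts(cleaned)
sequences = tokenizer.texts_to_sequences(cleaned)

# Reverse each sequence so the right-to-left Hebrew text is fed to the model
# in the same direction as the English source
reversed_sequences = [seq[::-1] for seq in sequences]
padded = pad_sequences(reversed_sequences, padding="post")
```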
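A compact sketch of the third model variant (bi-directional encoder plus an attention layer) built from standard Keras layers. The latent dimension, vocabulary sizes, optimizer, and loss are placeholders rather than the values tuned in the notebook, and Keras's built-in Attention layer (Luong-style dot-product) stands in for whatever attention implementation the notebook uses.

```python
# Sketch: bi-directional LSTM encoder + LSTM decoder + Luong-style attention (Keras).
import tensorflow as tf
from tensorflow.keras import layers, Model

LATENT_DIM = 256                       # placeholder latent dimension (a tuned hyperparameter)
SRC_VOCAB, TGT_VOCAB = 8000, 8000      # placeholder vocabulary sizes

# Encoder: embed English tokens and run a bi-directional LSTM
enc_inputs = layers.Input(shape=(None,), name="encoder_tokens")
enc_emb = layers.Embedding(SRC_VOCAB, LATENT_DIM)(enc_inputs)
enc_seq, fwd_h, fwd_c, bwd_h, bwd_c = layers.Bidirectional(
    layers.LSTM(LATENT_DIM, return_sequences=True, return_state=True))(enc_emb)
enc_h = layers.Concatenate()([fwd_h, bwd_h])
enc_c = layers.Concatenate()([fwd_c, bwd_c])

# Decoder: embed (reversed) Hebrew tokens, initialised with the encoder states
dec_inputs = layers.Input(shape=(None,), name="decoder_tokens")
dec_emb = layers.Embedding(TGT_VOCAB, 2 * LATENT_DIM)(dec_inputs)
dec_seq, _, _ = layers.LSTM(2 * LATENT_DIM, return_sequences=True,
                            return_state=True)(dec_emb, initial_state=[enc_h, enc_c])

# Attention: each decoder timestep attends over all encoder timesteps
context = layers.Attention()([dec_seq, enc_seq])
dec_concat = layers.Concatenate()([dec_seq, context])
outputs = layers.Dense(TGT_VOCAB, activation="softmax")(dec_concat)

model = Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy")
```

Dropping the Attention/Concatenate lines recovers the plain bi-directional Seq2Seq variant, and replacing the Bidirectional wrapper with a single LSTM recovers the baseline Seq2Seq model.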
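Finally, a short sketch of the two decoding strategies mentioned above: greedy argmax versus multinomial sampling with a temperature applied to the decoder's softmax output. This is a generic illustration, not code from the notebook.

```python
# Sketch: pick the next token id from a decoder softmax distribution.
import numpy as np

def sample_next_token(probs, strategy="greedy", temperature=1.0):
    """probs: 1-D array of softmax probabilities over the target vocabulary."""
    if strategy == "greedy":
        return int(np.argmax(probs))            # always take the most likely token
    # Multinomial sampling: rescale log-probabilities by temperature, renormalise,
    # then draw one token. Lower temperature -> closer to greedy; higher -> more random.
    logits = np.log(np.asarray(probs, dtype=np.float64) + 1e-9) / temperature
    scaled = np.exp(logits - np.max(logits))
    scaled /= scaled.sum()
    return int(np.random.choice(len(scaled), p=scaled))
```

At temperature 1.0 the multinomial branch reproduces the original distribution; temperatures below 1.0 sharpen it towards greedy decoding, and temperatures above 1.0 flatten it towards uniform sampling.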

Link to Notebook

About

Seq2Seq model architectures for language translation on right-to-left (dextrosinistral) languages
