isl-mt/NMTGMinor

Transformer networks for Neural Machine Translation

This is an implementation of the Transformer model from the paper "Attention Is All You Need".

Features supported:

  • Multi-layer Transformer encoder-decoder networks
  • Single-GPU and multi-GPU training (multi-GPU support is currently outdated)
  • Model checkpointing for a better memory/speed trade-off
  • Research ideas

The code is based on several modules (Dictionary and Loss functions) of OpenNMT-py.
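The checkpointing feature above refers to activation (gradient) checkpointing, which saves memory by recomputing a layer's activations during the backward pass instead of storing them. Below is a minimal, hypothetical sketch of that trade-off using PyTorch's `torch.utils.checkpoint`; it is an illustration of the general technique, not this repository's actual implementation (the class name and parameters here are invented for the example).

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

class CheckpointedEncoder(nn.Module):
    """Illustrative Transformer encoder with optional activation checkpointing.

    Hypothetical example class, not part of NMTGMinor's API.
    """

    def __init__(self, d_model=64, n_layers=4, checkpointing=True):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            for _ in range(n_layers)
        )
        self.checkpointing = checkpointing

    def forward(self, x):
        for layer in self.layers:
            if self.checkpointing and self.training:
                # Do not store this layer's activations; recompute them
                # during backward. Saves memory at the cost of extra compute.
                x = checkpoint(layer, x, use_reentrant=False)
            else:
                x = layer(x)
        return x

enc = CheckpointedEncoder()
enc.train()
src = torch.randn(2, 10, 64, requires_grad=True)  # (batch, seq_len, d_model)
out = enc(src)
out.sum().backward()  # gradients flow through the recomputed activations
```

With checkpointing enabled, peak activation memory grows with the number of layers much more slowly, which is what makes deeper models trainable on a single GPU.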

About

A Neural Machine Translation toolkit for research purposes.
