
FCL-NAT

Code for our AAAI 2020 paper:

Fine-Tuning by Curriculum Learning for Non-Autoregressive Neural Machine Translation
Junliang Guo, Xu Tan, Linli Xu, Tao Qin, Enhong Chen, Tie-Yan Liu

Note

The code is mainly based on tensor2tensor and was tested in the following environment:

  • tensorflow == 1.4
  • tensor2tensor == 1.2.9

The core logic of our model is in tensor2tensor/model/transformer_nat_cl_word.py. We also provide a sample training script for our model in scripts/train_nat_distill_wmt_ende_cl_word_set0.sh.
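A minimal sketch of setting up the tested environment and launching the sample script (the pinned versions come from the list above; the exact TensorFlow 1.4.x patch release and the Python version it requires are assumptions, as the repository does not specify them):

```shell
# Install the tested dependency versions.
# Assumption: tensorflow 1.4.x wheels are only published for older
# Python versions (roughly 2.7 / 3.5), so use a matching interpreter.
pip install tensorflow==1.4.0 tensor2tensor==1.2.9

# Run the provided sample training script (path taken from the
# repository) from the repository root.
bash scripts/train_nat_distill_wmt_ende_cl_word_set0.sh
```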

The original version of this code was written by Zhuohan Li, whom we thank for sharing it.

Citation

@article{guo2019finetuning,
    title={Fine-Tuning by Curriculum Learning for Non-Autoregressive Neural Machine Translation},
    author={Guo, Junliang and Tan, Xu and Xu, Linli and Qin, Tao and Chen, Enhong and Liu, Tie-Yan},
    journal={arXiv preprint arXiv:1911.08717},
    year={2019}
}
