# TTT - Train, Transform, Translate

Deep Learning Fall 2019 Final Project

Group Big Brain Learning 🤔

The Universal Transformer model can be found in `universal_transformer.py`. Scripts to process the data can be found in `preprocess.py`. The script to train the model is `assignment.py`. It produces two output files that can be used to compute BLEU scores using `bleu.sh`. Note that `bleu.sh` requires the `sacrebleu` Python package, which can be installed with `pip install sacrebleu`.

Shell scripts (`.sh` files) to download the necessary data are also included.