MT/IE: Cross-lingual Open IE

Attention-based sequence-to-sequence model for cross-lingual open IE.

Summary

A TensorFlow implementation of "MT/IE: Cross-lingual Open Information Extraction with Neural Sequence-to-Sequence Models" (EACL 2017) by Sheng Zhang, Kevin Duh, and Benjamin Van Durme.

Dependencies

  • Python 2.7
  • TensorFlow r0.12 or later (install example below)
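
If a matching TensorFlow build is not already available, a pinned release can be fetched with pip. This is a minimal sketch, assuming pip is set up for the Python 2.7 interpreter; whether newer 1.x releases also satisfy "or later" for this code is not confirmed here:

pip install tensorflow==0.12.1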

Train

We provide a small toy dataset (10K) to play with. To start training on this dataset, simply run:

./run.sh
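
run.sh is a thin wrapper around the training entry point. Judging from the evaluation command below, training on the toy data presumably corresponds to something like the following invocation (an assumption; check run.sh for the exact flags and data paths it passes):

python -m mt_ie --do_decode=False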

Evaluate

After training for a while, you can start evaluation by running:

python -m mt_ie --do_decode=True

Note: multi-bleu.perl from mosesdecoder is included for your convenience.
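
To score the decoded output against a reference, the standard multi-bleu.perl invocation applies; the file names below are placeholders for illustration, not files shipped with the repository:

perl multi-bleu.perl reference.txt < decoded_output.txt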
