
Releases: huji-nlp/tupa

TUPA v1.4.0

05 Aug 15:00

Add support for BERT embeddings and a BERT multilingual pre-trained model.

TUPA v1.3.10

27 Jun 10:53
37d21b3
Update version to 1.3.10

TUPA v1.3.9

25 Jun 08:07
c66ff7f

Use this command to download all pre-trained models:

wget https://github.com/huji-nlp/tupa/releases/download/v1.3.9/{sparse-1.3.9.tar.gz,sparse-1.3.9-fr.tar.gz,sparse-1.3.9-de.tar.gz,ucca-amr-bilstm-1.3.9.tar.gz,ucca-amr-dm-bilstm-1.3.9.tar.gz,ucca-amr-dm-ud-bilstm-1.3.9.tar.gz,ucca-amr-ud-bilstm-1.3.9.tar.gz,ucca-bilstm-1.3.9.tar.gz,ucca-dm-bilstm-1.3.9.tar.gz,ucca-dm-ud-bilstm-1.3.9.tar.gz,ucca-ud-bilstm-1.3.9.tar.gz,ucca-bilstm-1.3.9-fr.tar.gz,ucca-ud-bilstm-1.3.9-fr.tar.gz,ucca-bilstm-1.3.9-de.tar.gz,ucca-ud-bilstm-1.3.9-de.tar.gz}

TUPA v1.3.6

29 Aug 07:43
e82155f
Update version to 1.3.6

TUPA v1.3.3

30 Jun 17:37
f75c0a2
  • Bug fixes.

This is the version used for the experiments in the following paper:

@InProceedings{hershcovich2018multitask,
  author    = {Hershcovich, Daniel  and  Abend, Omri  and  Rappoport, Ari},
  title     = {Multitask Parsing Across Semantic Representations},
  booktitle = {Proc. of ACL},
  year      = {2018},
  pages     = {373--385},
  url       = {https://aclweb.org/anthology/P18-1035}
}

TUPA v1.3.2

10 May 14:21
d6f48b5
Update instructions for reproducing experiments with bash scripts

TUPA v1.2

31 Aug 17:21
  • Support for CoNLL-U and SDP formats.
  • Identify format per passage without requiring the -f option.
  • Fixes to AMR parsing; add Label as a separate transition to label nodes (rather than doing it immediately upon creation).
  • Support for multi-task training, with a shared BiLSTM input encoding but separate MLP (and labels) per format.
  • Save model hyper-parameters to a .json file rather than Pickle. Models trained with TUPA v1.1 are no longer compatible.
  • Save perceptron weights to a separate Pickle file with a .data suffix.
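The serialization change in the last two points can be sketched as follows. This is a minimal illustration, not TUPA's actual code; the file names and hyper-parameter fields are hypothetical. The point of the split is that JSON hyper-parameters stay human-readable and loadable across code changes, while only the learned weights go through Pickle.

```python
import json
import pickle

# Hypothetical hyper-parameters: stored as human-readable JSON,
# which remains loadable even if the training code's classes change.
hyperparams = {"layers": 2, "layer_dim": 500, "word_dim": 200}
with open("model.json", "w") as f:
    json.dump(hyperparams, f)

# Learned weights still go to a binary Pickle file with a .data suffix.
weights = {"some-feature->some-label": 0.5}
with open("model.data", "wb") as f:
    pickle.dump(weights, f)

# Loading a model reads both files back.
with open("model.json") as f:
    loaded_params = json.load(f)
with open("model.data", "rb") as f:
    loaded_weights = pickle.load(f)
```

Because the hyper-parameter format changed, models pickled whole by TUPA v1.1 cannot be read back this way, hence the compatibility break noted above.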

TUPA v1.1

13 Jul 16:38

This release introduces a number of changes from the version used in the ACL 2017 paper, v1.0.

  • Upgrade to DyNet v2.0: this entails a different format for model files, so models trained with TUPA v1.0 are no longer compatible.
  • Easier installation via a simple pip install tupa or python setup.py install.
  • Preliminary support for AMR parsing, by specifying the -f amr option.
  • Functioning demo server (available online).
  • Various bug fixes.

TUPA v1.0

13 Jul 11:19

This is the version used for the experiments in the following paper:

@InProceedings{hershcovich2017a,
  author    = {Hershcovich, Daniel  and  Abend, Omri  and  Rappoport, Ari},
  title     = {A Transition-Based Directed Acyclic Graph Parser for {UCCA}},
  booktitle = {Proc. of ACL},
  year      = {2017},
  pages     = {1127--1138},
  url       = {http://aclweb.org/anthology/P17-1104}
}

(Models repaired with sed -i '1s/::archive 14/::archive 12/' models/*.model, see https://stackoverflow.com/a/47194301/223267)
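The sed invocation above rewrites the first occurrence of "::archive 14" to "::archive 12" on the first line only of each model file (the "1" address restricts the substitution to line 1). A minimal Python equivalent, as a sketch with a hypothetical file name:

```python
from pathlib import Path

def repair_model(path):
    """Mirror sed -i '1s/::archive 14/::archive 12/':
    substitute only on the file's first line, first match only."""
    lines = Path(path).read_text().splitlines(keepends=True)
    if lines:
        lines[0] = lines[0].replace("::archive 14", "::archive 12", 1)
    Path(path).write_text("".join(lines))

# Hypothetical model file demonstrating the repair:
model = Path("example.model")
model.write_text("header ::archive 14\nbody ::archive 14\n")
repair_model(model)
# Only the first line is changed; later lines are left untouched.
```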