
bert-event-extraction

Pytorch Solution of Event Extraction Task using BERT on ACE 2005 corpus

Prerequisites

  1. Prepare the ACE 2005 dataset.

  2. Use nlpcl-lab/ace2005-preprocessing to preprocess the ACE 2005 dataset into the same format as data/sample.json, then place the output in the data directory as follows (a quick format check is sketched after this list):

    ├── data
    │     └── test.json
    │     └── dev.json
    │     └── train.json
    │...
    
  3. Install the packages.

    pip install torch==1.0.0 pytorch_pretrained_bert==0.6.1 numpy
    
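A quick way to confirm that the preprocessed files match the expected layout is to load one split and inspect its fields. This is a minimal sketch, assuming each file is a JSON array of sentence-level records as in data/sample.json; the field names mentioned in the comments come from the nlpcl-lab/ace2005-preprocessing output and should be checked against data/sample.json.

    import json

    # Minimal format check: assumes each split is a JSON array of
    # sentence-level records, as produced by ace2005-preprocessing.
    with open("data/train.json") as f:
        data = json.load(f)

    print(len(data), "records")
    # Compare the fields of the first record against data/sample.json
    # (fields such as "words" and "golden-event-mentions" are expected
    # here, but treat them as assumptions, not a guarantee).
    print(sorted(data[0].keys()))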

Usage

Train

    python train.py

Evaluation

    python eval.py --model_path=latest_model.pt
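To inspect the saved checkpoint outside of eval.py, something like the sketch below can be used. It assumes latest_model.pt is a full pickled model written with torch.save(model, path); if train.py instead saves only a state_dict, the model has to be constructed first and loaded with load_state_dict.

    import torch

    # Assumption: latest_model.pt holds the whole pickled model
    # (torch.save(model, "latest_model.pt")); adjust if it is a state_dict.
    model = torch.load("latest_model.pt", map_location="cpu")
    model.eval()  # switch off dropout etc. before evaluation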

Result

Performance

Method                   Trigger Classification (%)      Argument Classification (%)
                         Precision   Recall   F1         Precision   Recall   F1
JRNN                     66.0        73.0     69.3       54.2        56.7     55.5
JMEE                     76.3        71.3     73.7       66.8        54.9     60.3
This model (BERT base)   63.4        71.1     67.7       48.5        34.1     40.0

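For reference, each F1 value is the harmonic mean of the corresponding precision and recall; for example, the JRNN trigger scores check out as follows.

    # F1 is the harmonic mean of precision and recall.
    p, r = 66.0, 73.0           # JRNN trigger classification
    f1 = 2 * p * r / (p + r)
    print(round(f1, 1))         # 69.3
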
The performance of this model on argument classification is low even though a pretrained BERT model was used. The model is currently being updated to improve performance.

Reference

  • Jointly Multiple Events Extraction via Attention-based Graph Information Aggregation (EMNLP 2018), Liu et al. [paper]
  • lx865712528's EMNLP2018-JMEE repository [github]
  • Kyubyong's bert_ner repository [github]
