R-BERT-relation-extraction


Implementation of Enriching Pre-trained Language Model with Entity Information for Relation Classification (R-BERT).
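
For orientation, below is a minimal PyTorch sketch of the R-BERT idea from the paper: the pooled [CLS] vector and the averaged hidden states of the two marked entity spans each go through a tanh activation and a fully connected layer, and their concatenation is fed to the classifier. All names here (RBERT, e1_mask, e2_mask) are illustrative and do not necessarily match the modules in this repository.

import torch
import torch.nn as nn
from transformers import BertModel

class RBERT(nn.Module):
    def __init__(self, pretrained='bert-base-uncased', num_labels=19, dropout=0.1):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained)
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(dropout)
        self.cls_fc = nn.Linear(hidden, hidden)     # fully connected layer for [CLS]
        self.entity_fc = nn.Linear(hidden, hidden)  # shared layer for both entity spans
        self.classifier = nn.Linear(hidden * 3, num_labels)

    def _span_average(self, sequence_output, span_mask):
        # span_mask: (batch, seq_len) float tensor, 1.0 on tokens inside the span
        lengths = span_mask.sum(dim=1, keepdim=True).clamp(min=1.0)
        return (sequence_output * span_mask.unsqueeze(-1)).sum(dim=1) / lengths

    def forward(self, input_ids, attention_mask, e1_mask, e2_mask):
        sequence_output, pooled_output = self.bert(
            input_ids=input_ids, attention_mask=attention_mask)[:2]
        h_cls = self.cls_fc(self.dropout(torch.tanh(pooled_output)))
        h_e1 = self.entity_fc(self.dropout(torch.tanh(
            self._span_average(sequence_output, e1_mask))))
        h_e2 = self.entity_fc(self.dropout(torch.tanh(
            self._span_average(sequence_output, e2_mask))))
        # Concatenate [CLS] + entity 1 + entity 2 representations and classify.
        return self.classifier(self.dropout(torch.cat([h_cls, h_e1, h_e2], dim=-1)))

num_labels=19 corresponds to the 19 classes of SemEval-2010 Task 8 (9 directed relations plus Other).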

Environment Requirements

  • python 3.6.9
  • pytorch 1.5.1+cu92
  • transformers 2.11.0
  • tqdm 4.40.1
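
One way to reproduce this environment with pip (a hypothetical sketch; the +cu92 build of pytorch is served from the PyTorch wheel index, and any compatible CUDA build should also work):

pip install torch==1.5.1+cu92 -f https://download.pytorch.org/whl/torch_stable.html
pip install transformers==2.11.0 tqdm==4.40.1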

Device

  • TITAN Xp
  • CUDA Version 9.0.176

Data

  • SemEval-2010 Task 8 dataset (as indicated by the official task scorer used in the eval step below).
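
For reference, a minimal sketch of the entity-marker preprocessing described in the paper: the SemEval <e1>/<e2> tags are replaced by the special markers '$' and '#' around the two entities (mark_entities is an illustrative name; the repository's actual preprocessing may differ).

import re

def mark_entities(sentence):
    # Wrap the first entity in '$' markers and the second in '#' markers.
    sentence = re.sub(r'<e1>(.*?)</e1>', r'$ \1 $', sentence)
    sentence = re.sub(r'<e2>(.*?)</e2>', r'# \1 #', sentence)
    return sentence

print(mark_entities('The <e1>author</e1> wrote a <e2>novel</e2>.'))
# The $ author $ wrote a # novel #.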

Usage

  1. Download the pre-trained BERT model and put it into the resource folder.
  2. Run the following command to start the program.
python run.py \
    --batch_size=16 \
    --max_len=128 \
    --lr=2e-5 \
    --epoch=5 \
    --dropout=0.1

More details can be seen with python run.py -h; a hypothetical sketch of this command-line interface follows the list below.

  3. You can use the official scorer to check the final predicted result (in the eval folder):
perl semeval2010_task8_scorer-v1.2.pl proposed_answer.txt predicted_result.txt >> result.txt
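
As referenced above, the flags in step 2 map onto a conventional argparse interface. A hypothetical sketch (the actual run.py may define more options):

import argparse

parser = argparse.ArgumentParser(description='R-BERT relation classification')
parser.add_argument('--batch_size', type=int, default=16)
parser.add_argument('--max_len', type=int, default=128, help='maximum input length')
parser.add_argument('--lr', type=float, default=2e-5)
parser.add_argument('--epoch', type=int, default=5)
parser.add_argument('--dropout', type=float, default=0.1)
args = parser.parse_args()
print(args)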

Result

The Macro-F1 of my version and the score reported in the paper are as follows:

paper     my version
0.8925    0.8906

The training log can be seen in train.log, and the official evaluation result is available in result.txt.

Note:

  • Some settings may be different from those mentioned in the paper.
  • No validation set is used during training.

Reference Link

  • Enriching Pre-trained Language Model with Entity Information for Relation Classification: https://arxiv.org/abs/1905.08284
