RuleBERT + CheckList

This document describes how to apply RuleBERT to CheckList.

Recipe

  1. Download the RuleBERT model

     bash download_model.sh

  2. Fine-tune on some CheckList rules

     python trainer.py --data-dir data/external_datasets/CheckList \
                       --model_arch models/rulebert_161 \
                       --epochs 3 \
                       --verbose \
                       --hard_rule
  3. Fine-tune on the QQP dataset

     We use a HuggingFace notebook for fine-tuning on QQP. The notebook can be found here.

     Make sure to load the RuleBERT encoder with an untrained classification (CLS) head:

     from transformers import RobertaForSequenceClassification

     # Load RuleBERT
     rulebert_cls = RobertaForSequenceClassification.from_pretrained("models/rulebert_161")

     # Load RoBERTa with a freshly initialized classification head
     roberta_cls = RobertaForSequenceClassification.from_pretrained("roberta-large")

     # Transfer the RuleBERT encoder into the new model
     roberta_cls.roberta = rulebert_cls.roberta
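The transfer above can be sanity-checked without downloading any checkpoints. The sketch below uses a tiny randomly initialized config as a stand-in for the real RuleBERT and roberta-large weights (the config values are illustrative, not from this repo), and verifies that the encoder is shared while the classification head stays fresh:

```python
from transformers import RobertaConfig, RobertaForSequenceClassification

# Tiny illustrative config (stand-in for the real checkpoints)
config = RobertaConfig(
    vocab_size=100,
    hidden_size=16,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=32,
)

rulebert_cls = RobertaForSequenceClassification(config)  # stand-in for RuleBERT
roberta_cls = RobertaForSequenceClassification(config)   # stand-in for roberta-large

# Transfer the encoder, keeping roberta_cls's untrained classification head
roberta_cls.roberta = rulebert_cls.roberta

assert roberta_cls.roberta is rulebert_cls.roberta          # encoder is shared
assert roberta_cls.classifier is not rulebert_cls.classifier  # head is fresh
```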
  4. Apply CheckList

    The repository can be found here.

License

MIT