Factual consistency checking model for abstractive summaries (NAACL-22 Findings)


MFMA

This repository provides the factual consistency checking model from our NAACL 2022 Findings paper, "Masked Summarization to Generate Factually Inconsistent Summaries for Improved Factual Consistency Checking".

1. Usage of pre-trained factual consistency checking model

MFMA is a pre-trained factual consistency checking model (a binary classifier) for abstractive summaries, trained on negative samples augmented with Mask-and-Fill with Article (MFMA).

You only need the Hugging Face transformers library to load the pre-trained model.

from transformers import AutoModelForSequenceClassification
model = AutoModelForSequenceClassification.from_pretrained("henry931007/mfma")
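Once loaded, the classifier outputs two logits per (article, summary) pair. A minimal sketch of turning those logits into a prediction, using stand-in numbers — which index means "consistent" is an assumption here, so verify against the released model:

```python
import math

# Stand-in logits, e.g. from model(**inputs).logits for one article/summary pair.
logits = [2.3, -1.1]

# Softmax over the two classes.
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

# Predicted class index; whether 0 or 1 denotes "factually consistent"
# is an assumption in this sketch -- check the model card.
pred = max(range(len(probs)), key=probs.__getitem__)
```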

2. Instructions for Training MFMA

1) Install Prerequisites

Create a Python 3.8 environment, then install the packages listed in "requirements.txt".

conda create --name msm python=3.8
conda activate msm
pip install -r requirements.txt

2) Training MFMA

python train_fb.py --mask_ratio1 $MASK_ARTICLE \
                   --mask_ratio2 $MASK_ARTICLE
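The two ratios control how much of the input is masked before the infilling model reconstructs the summary. A rough illustration of the masking step — illustrative only, since train_fb.py operates on subword tokens and its actual masking logic may differ:

```python
import random

def mask_tokens(tokens, ratio, mask_token="<mask>", seed=0):
    """Replace roughly `ratio` of the tokens with a mask token.

    Illustrative sketch of the masking step in MFMA; the real
    train_fb.py may mask spans rather than individual words."""
    rng = random.Random(seed)
    n = int(len(tokens) * ratio)
    idx = set(rng.sample(range(len(tokens)), n))
    return [mask_token if i in idx else t for i, t in enumerate(tokens)]

masked = mask_tokens("the council approved the budget on tuesday".split(), 0.5)
```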

3) Generating Negative Summaries with MFMA

python infer_fb.py --mask_ratio1 $MASK_ARTICLE \
                   --mask_ratio2 $MASK_ARTICLE

4) Training Factual Consistency Checking Model using the Data

python train_metric.py --datadir $DATA_PATH
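train_metric.py fits the binary checker on pairs of consistent and inconsistent summaries. A sketch of how such training examples could be assembled — the field names, label polarity, and the on-disk format under $DATA_PATH are assumptions, not the script's actual schema:

```python
def build_pairs(articles, references, negatives):
    """Pair each article with its reference summary (label 1, consistent)
    and an MFMA-generated negative summary (label 0, inconsistent).
    Field names and label polarity are illustrative assumptions."""
    data = []
    for art, ref, neg in zip(articles, references, negatives):
        data.append({"article": art, "summary": ref, "label": 1})
        data.append({"article": art, "summary": neg, "label": 0})
    return data

examples = build_pairs(["a1"], ["good summary"], ["hallucinated summary"])
```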

Citation

@inproceedings{lee2022mfma,
  title={Masked Summarization to Generate Factually Inconsistent Summaries for Improved Factual Consistency Checking},
  author={Hwanhee Lee and Kang Min Yoo and Joonsuk Park and Hwaran Lee and Kyomin Jung},
  booktitle={Findings of the Association for Computational Linguistics: NAACL 2022},
  month={July},
  year={2022},
}
