
VisFIS

VisFIS: Visual Feature Importance Supervision with Right-for-the-Right-Reason Objectives

Zhuofan Ying*, Peter Hase*, Mohit Bansal


Set up environment and data

Environment

Create and activate the conda environment:

conda create -n visfis python=3.6
conda activate visfis

Install the dependencies with:

pip install -r requirements.txt

Set up data

  • Download the gdrive command-line tool to path_to_gdrive; the download scripts use it to fetch the data. Alternatively, you can download the data from Google Drive manually.
  • Inside scripts/common.sh, set the PROJ_DIR variable to your project path (see the example below).
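
As a reference, the PROJ_DIR line in scripts/common.sh might look like the sketch below; this is illustrative only, and the actual file may contain other settings.

# scripts/common.sh (illustrative excerpt)
PROJ_DIR=/path/to/visfis   # absolute path to your local copy of this repository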

For CLEVR-XAI

Download the data for XAI-CP (pass path_to_gdrive as the first argument):

./scripts/download/download_xai.sh ${path_to_gdrive}

Preprocess the data:

./scripts/preprocessing/preprocessing_xai.sh

For VQA-HAT

Download the data for HAT-CP (pass path_to_gdrive as the first argument):

./scripts/download/download_vqa.sh ${path_to_gdrive}

Preprocess the data:

./scripts/preprocessing/preprocessing_vqa.sh

For GQA

Download the data for GQA-CP (pass path_to_gdrive as the first argument):

./scripts/download/download_gqa.sh ${path_to_gdrive}

Preprocess the data:

./scripts/preprocessing/preprocessing_gqa.sh

Training and Testing

  • Run the scripts in scripts/baseline/ and scripts/visfis/ to train models and compute metrics. Pass the dataset name as the first argument (one of xaicp, hatcp, or gqacp) and the GPU id as the second. For example, to reproduce the main-table results on xaicp, execute:
./scripts/baseline/baseline_updn.sh xaicp 0
./scripts/visfis/visfis_updn.sh xaicp 1
  • To train with random supervision, change the value of the --hint_type parameter in the scripts to hints_random (see the example after this list).
  • Scripts for tuning and reproducing other SOTA results can be found in scripts/all/.
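
For example, one way to switch a training script to random supervision is to rewrite the flag in a copy of the script. The copied script name below is only an example, and the script's original --hint_type value is assumed; only the flag name and hints_random come from this README.

# Copy a training script and swap its --hint_type value for hints_random.
cp scripts/visfis/visfis_updn.sh scripts/visfis/visfis_updn_random.sh
sed -i 's/--hint_type [^ ]*/--hint_type hints_random/' scripts/visfis/visfis_updn_random.sh
./scripts/visfis/visfis_updn_random.sh xaicp 0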

Data Analysis

The analysis/ directory contains R scripts that read .pkl files of metrics and conduct data analysis.
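
These scripts can be run from the command line with Rscript, for example as below; the script name here is a placeholder, so substitute the actual file names found in analysis/.

# List the available analysis scripts, then run one (the name is hypothetical).
ls analysis/*.R
Rscript analysis/read_metrics.R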

Acknowledgement

This code used resources from negative analysis of grounding, ramen, and bottom-up-attention-vqa.

Citation

If you find this code useful for your research, please consider citing:

@article{ying2022visfis,
  title={VisFIS: Visual Feature Importance Supervision with Right-for-the-Right-Reason Objectives},
  author={Ying, Zhuofan and Hase, Peter and Bansal, Mohit},
  journal={arXiv},
  year={2022}
}

