
Integrating Multi-Label Contrastive Learning with Dual Adversarial Graph Neural Networks for Cross-Modal Retrieval

This repository contains the author's implementation in PyTorch for the AAAI-21 paper "Dual Adversarial Label-aware Graph Neural Networks for Cross-modal Retrieval" and the TPAMI-22 paper "Integrating Multi-Label Contrastive Learning with Dual Adversarial Graph Neural Networks for Cross-Modal Retrieval".

Dependencies

  • Python (>=3.8)

  • PyTorch (>=1.7.1)

  • Scipy (>=1.5.2)
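
One possible way to install the Python dependencies with pip (an assumption, since no install command is given here; adjust the PyTorch build to your CUDA version as needed):

pip install "torch>=1.7.1" "scipy>=1.5.2"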

Datasets

You can download the features of the datasets from:
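
The downloaded files are pre-extracted feature matrices. As a minimal sketch of inspecting them, assuming MATLAB-style .mat files (which the Scipy dependency suggests) and hypothetical file names:

import scipy.io as sio

# Hypothetical path and keys -- the actual names depend on the downloaded package.
features = sio.loadmat("data/mirflickr_features.mat")
for key, value in features.items():
    if not key.startswith("__"):            # skip MATLAB metadata entries
        print(key, getattr(value, "shape", type(value)))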

Implementation

Here we provide the implementation of our proposed models, along with datasets. The repository is organised as follows:

  • data/ contains the necessary dataset files for NUS-WIDE, MIRFlickr, and MS-COCO;
  • models.py contains the implementation of P-GNN-CON and I-GNN-CON;

Finally, main.py puts all of the above together and can be used to execute a full training run on MIRFlickr, NUS-WIDE, or MS-COCO.

Process

  • Place the datasets in data/
  • Set the experiment parameters in main.py.
  • Train a model:
python main.py
  • Set EVAL = True in main.py to switch to evaluation (see the sketch after this list), then run:
python main.py
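
For reference, the EVAL switch follows the usual train/evaluate toggle. The sketch below is illustrative only; the name EVAL comes from main.py, but the helpers are hypothetical placeholders:

EVAL = False   # set to True in main.py to run evaluation instead of training

def train():
    # placeholder for the full training run started by `python main.py`
    print("training ...")

def evaluate():
    # placeholder for evaluating a trained checkpoint on the retrieval task
    print("evaluating ...")

if __name__ == "__main__":
    evaluate() if EVAL else train()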

Citation

If you find our work or the code useful, please consider citing our papers:

@article{Qian_Xue_Zhang_Fang_Xu_2021, 
  title={Dual Adversarial Graph Neural Networks for Multi-label Cross-modal Retrieval}, 
  volume={35}, 
  number={3}, 
  journal={Proceedings of the AAAI Conference on Artificial Intelligence}, 
  author={Qian, Shengsheng and Xue, Dizhan and Zhang, Huaiwen and Fang, Quan and Xu, Changsheng}, 
  year={2021}, 
  pages={2440-2448} 
}
@article{9815553, 
  title={Integrating Multi-Label Contrastive Learning With Dual Adversarial Graph Neural Networks for Cross-Modal Retrieval}, 
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence}, 
  author={Qian, Shengsheng and Xue, Dizhan and Fang, Quan and Xu, Changsheng}, 
  year={2022},  
  pages={1-18},  
  doi={10.1109/TPAMI.2022.3188547}
}
