jeromerony/dml_cross_entropy

Code for the paper "A unifying mutual information view of metric learning: cross-entropy vs. pairwise losses" (ECCV 2020 - Spotlight)

Requirements for the experiments

Data management

For the In-Shop dataset, you need to manually download the data from https://drive.google.com/drive/folders/0B7EVK8r0v71pVDZFQXRsMDZCX1E (at least img.zip and list_eval_partition.txt), put the files in data/InShop, and extract img.zip.
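
As a rough sketch of that setup (the download location ~/Downloads is an assumption; adjust it to wherever you saved the files):

# Hypothetical helper: move the manually downloaded In-Shop files into
# data/InShop and extract img.zip. The source directory is an assumption.
from pathlib import Path
import shutil
import zipfile

source = Path.home() / "Downloads"   # assumption: where the files were downloaded
target = Path("data/InShop")
target.mkdir(parents=True, exist_ok=True)

for name in ("img.zip", "list_eval_partition.txt"):
    shutil.move(str(source / name), str(target / name))

with zipfile.ZipFile(target / "img.zip") as archive:
    archive.extractall(target)       # extracts the image folder into data/InShop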

You can download the data and generate the train.txt and test.txt files for every dataset using the prepare_data.py script:

python prepare_data.py

This will download and prepare all the necessary data for CUB200, Cars-196 and Stanford Online Products.
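
To verify that the split files were generated, here is a minimal sketch (assuming each dataset ends up in its own folder under data/, as data/InShop does above):

# List the generated split files and how many entries each one contains.
from pathlib import Path

for name in ("train.txt", "test.txt"):
    for split_file in sorted(Path("data").glob(f"*/{name}")):
        with split_file.open() as f:
            print(split_file, "-", sum(1 for _ in f), "entries")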

Usage

This repo uses sacred to manage the experiments. To run an experiment (e.g. on CUB200):

python experiment.py with dataset.cub

You can add an observer to save the metrics and files related to the experiment by adding -F result_dir:

python experiment.py -F result_dir with dataset.cub
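
To illustrate how sacred interprets these flags, here is a minimal, self-contained sketch (a hypothetical demo.py, not part of this repo): named config scopes such as dataset.cub and key=value pairs after with override the defaults, and -F attaches a file-storage observer that saves the run's config and logged metrics under result_dir/.

# Minimal sacred sketch (hypothetical demo.py) showing how "with" overrides
# and the -F observer flag behave.
from sacred import Experiment

ex = Experiment('demo')

@ex.config
def config():
    epochs = 30  # defaults; overridable from the command line
    lr = 0.02

@ex.automain
def main(epochs, lr):
    # `python demo.py -F result_dir with epochs=100 lr=0.05` runs this with the
    # overridden values and stores the run's config and metrics under result_dir/.
    print(f"training for {epochs} epochs at lr={lr}")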

Reproducing the results of the paper

CUB200

python experiment.py with dataset.cub model.resnet50 epochs=30 lr=0.02

Cars-196

python experiment.py with dataset.cars model.resnet50 epochs=100 lr=0.05 model.norm_layer=batch

Stanford Online Products

python experiment.py with dataset.sop model.resnet50 epochs=100 lr=0.003 momentum=0.99 nesterov=True model.norm_layer=batch

In-Shop

python experiment.py with dataset.inshop model.resnet50 epochs=100 lr=0.003 momentum=0.99 nesterov=True model.norm_layer=batch

Citation

@inproceedings{boudiaf2020unifying,
  title={A unifying mutual information view of metric learning: cross-entropy vs. pairwise losses},
  author={Boudiaf, Malik and Rony, J{\'e}r{\^o}me and Ziko, Imtiaz Masud and Granger, Eric and Pedersoli, Marco and Piantanida, Pablo and {Ben Ayed}, Ismail},
  booktitle={European Conference on Computer Vision},
  pages={548--564},
  year={2020},
  organization={Springer}
}
