
acmi-lab/Latent-Label-Shift-DDFA


Unsupervised Learning under Latent Label Shift

A new approach to unsupervised learning leveraging domain structure and invariance.

Figure 1

NeurIPS 2022

Paper (author order: Roberts*, Mani*, Garg, and Lipton).

ICML 2022 SCIS Workshop

Paper (author order: Mani*, Roberts*, Garg, and Lipton).

SlidesLive Poster Session Video

Authors

Pranav Mani*1 pmani@andrew.cmu.edu

Manley Roberts*1 manleyroberts@cmu.edu, manley@abacus.ai

Saurabh Garg1 sgarg2@andrew.cmu.edu

Zachary C. Lipton1 zlipton@cmu.edu

*: denotes equal contribution. 1: Machine Learning Department, Carnegie Mellon University.

Use Instructions

  • Install a recent version of Python 3.
  • Install the dependencies: pip install -r requirements.txt
  • Download ImageNet following the instructions at https://www.image-net.org/download.php, then set 'root folder' in the ImageNet and ImageNetSubset classes in dataset.py to the root folder of your installation (one level above the train/validation split folders). Our test set is ImageNet's validation set; our validation set is split out of ImageNet's train set.
  • Details on downloading the FieldGuide dataset are at https://sites.google.com/view/fgvc6/competitions/butterflies-moths-2019. Extract the images from training.rar into '~/FieldGuideAllImagesDownload/', then run ./data_utils/create_FieldGuide_directories.ipynb to create the FieldGuide-28 and FieldGuide-2 train, val, and test directories.
  • In experiment_config.yml, replace "project" and "entity" with the appropriate project and entity for WandB.
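
The WandB step above might look like the following sketch of experiment_config.yml. Only the "project" and "entity" keys are named in the instructions; the placeholder values are hypothetical and should be replaced with your own WandB project and entity:

```yaml
# Sketch of the WandB-related keys in experiment_config.yml.
# The other keys in the real file should be left unchanged.
project: my-lls-experiments   # hypothetical WandB project name
entity: my-wandb-team         # hypothetical WandB entity (username or team)
```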

Attributions

Attributions are available in the LICENSE_ATTRIBUTION file.

About

Code and results accompanying our paper, Unsupervised Learning under Latent Label Shift, presented at NeurIPS 2022.
