
MaGNET: Uniform Sampling from Deep Generative Network Manifolds Without Retraining, ICLR 2022


Fig: Uncurated images generated via Naive and MaGNET sampling on StyleGAN2-FFHQ and BigGAN-ImageNet

Paper Link: https://arxiv.org/abs/2110.08009

ICLR Video Link: https://www.youtube.com/watch?v=0Muk7nKzOW8

Abstract: Deep Generative Networks (DGNs) are extensively employed in Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and their variants to approximate the data manifold and distribution. However, training samples are often distributed in a non-uniform fashion on the manifold, due to costs or convenience of collection. For example, the CelebA dataset contains a large fraction of smiling faces. These inconsistencies will be reproduced when sampling from the trained DGN, which is not always preferred, e.g., for fairness or data augmentation. In response, we develop MaGNET, a novel and theoretically motivated latent space sampler for any pre-trained DGN, that produces samples uniformly distributed on the learned manifold. We perform a range of experiments on various datasets and DGNs, e.g., for the state-of-the-art StyleGAN2 trained on FFHQ dataset, uniform sampling via MaGNET increases distribution precision and recall by 4.1% & 3.0% and decreases gender bias by 41.2%, without requiring labels or retraining. As uniform distribution does not imply uniform semantic distribution, we also explore separately how semantic attributes of generated samples vary under MaGNET sampling.

Google Colabs

| Methods | Dataset | Library | Colab |
| --- | --- | --- | --- |
| MaGNET-StyleGAN2 | FFHQ | TF1.15 | Link |
| MaGNET-StyleGAN3 | AFHQv2 | PyTorch 1.10 | Link |
| MaGNET-BigGAN | ImageNet | TF2.8 | Link |
| MaGNET-ProGAN | CelebAHQ | | |
| MaGNET-NVAE | MNIST | | |

Release Notes

  1. MaGNET is a plug-and-play, provable method that allows uniform sampling from the learned manifold of any generative model with piecewise affine non-linearities (e.g. LReLU, ReLU). The main contribution of the paper is an expression for the analytical density on the manifold of piecewise affine deep generative models (a minimal sampling sketch is included after the table below).
  2. For state-of-the-art models and generators with complex architectures, the direct implication of MaGNET is a significant increase in the diversity of a pretrained GAN.
  3. In Appendix F Table 1 we show that by using MaGNET sampling and naive sampling concurrently, one can increase the diversity of sample generation and improve the 1024x1024 FID<sub>FULL</sub> of StyleGAN2 config-f FFHQ at different truncations, for example:
| Truncation ψ | % MaGNET | FID<sub>FULL</sub> |
| --- | --- | --- |
| 1 | 0% | 2.74 |
| 1 | 4.1% | 2.66 |
| 0.9 | 0% | 5.05 |
| 0.9 | 20% | 4.29 |
| 0.7 | 0% | 21.34 |
| 0.7 | 100% | 19.41 |
| 0.5 | 0% | 58.33 |
| 0.5 | 100% | 54.47 |
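
As a rough illustration of what item 1 above describes, the sketch below (not the repository's code; the toy generator, dimensions, and pool size are made up) weights naively drawn latents by the change-of-volume scalar sqrt(det(JᵀJ)) of the generator Jacobian and resamples proportionally to it, which approximates uniform sampling on the learned manifold of a piecewise-affine generator.

```python
# Rough MaGNET-style sketch for a toy piecewise-affine generator (illustrative only).
import torch

d, D, N = 8, 64, 2048                            # toy latent dim, output dim, latent pool size
G = torch.nn.Sequential(                          # toy generator with piecewise-affine nonlinearity
    torch.nn.Linear(d, 128), torch.nn.LeakyReLU(0.2),
    torch.nn.Linear(128, D),
)

def volume_scalar(z):
    """Change-of-volume scalar sqrt(det(J_G(z)^T J_G(z))) at latent z."""
    J = torch.autograd.functional.jacobian(G, z)              # (D, d) Jacobian of G at z
    return torch.sqrt(torch.det(J.T @ J))

z_pool = torch.randn(N, d)                                     # naive (Gaussian) latent pool
w = torch.stack([volume_scalar(z) for z in z_pool])            # per-latent volume scalars
idx = torch.multinomial(w, num_samples=16, replacement=False)  # resample proportionally to volume
magnet_samples = G(z_pool[idx])                                # approx. uniform on the learned manifold
```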

Requirements and Usage

Since MaGNET is a plug-and-play method, we initially provide separate Google Colabs for the TensorFlow and PyTorch implementations of StyleGAN2 (TF), BigGAN (TF), ProGAN (TF) and NVAE (PyTorch). The Colab code uses precomputed volume scalars to perform MaGNET sampling and therefore has no strict library dependencies (a minimal sketch using such precomputed scalars follows the requirements below). We will also add submodules to this repo as plug-and-play examples for TensorFlow (>=1.15) and PyTorch (>=1.5), with methods that compute the volume scalars.

tensorflow-gpu>=1.15
# or
pytorch>=1.5
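
For reference, sampling with precomputed volume scalars can look roughly like the following; the file names and `batch_size` below are hypothetical placeholders, not the Colabs' actual variables.

```python
# Hypothetical example of MaGNET sampling from precomputed volume scalars
# (file and variable names are placeholders, not the Colabs' actual ones).
import numpy as np

batch_size = 16
latents = np.load("latents.npy")               # (N, d) precomputed latent pool
vol_scalars = np.load("volume_scalars.npy")    # (N,) volume scalar per latent

probs = vol_scalars / vol_scalars.sum()        # MaGNET weights: proportional to volume
idx = np.random.choice(len(latents), size=batch_size, replace=False, p=probs)
magnet_latents = latents[idx]                  # feed these to the pretrained generator
```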

Additional Materials

Citation

@inproceedings{
  humayun2022magnet,
  title={MaGNET: Uniform Sampling from Deep Generative Network Manifolds Without Retraining},
  author={Ahmed Imtiaz Humayun and Randall Balestriero and Richard Baraniuk},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=r5qumLiYwf9}
}

To request additional materials or for questions, please contact Ahmed Imtiaz Humayun at imtiaz@rice.edu. We eagerly welcome contributions; please open a pull request to add code/Colabs for models other than the ones currently in the repository.
