Wolf

Wolf is an open source library for Invertible Generative (Normalizing) Flows.

This is the code we used in the following papers:

Decoupling Global and Local Representations via Invertible Generative Flows

Xuezhe Ma, Xiang Kong, Shanghang Zhang and Eduard Hovy

ICLR 2021

MaCow: Masked Convolutional Generative Flow

Xuezhe Ma, Xiang Kong, Shanghang Zhang and Eduard Hovy

NeurIPS 2019

Requirements

  • Python >= 3.6
  • PyTorch >= 1.3.1
  • apex
  • lmdb >= 0.94
  • overrides

Installation

  1. Install NVIDIA-apex.
  2. Install PyTorch and torchvision (see the example commands below).
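
A minimal setup sketch, assuming a CUDA-enabled environment; exact package versions and the apex build flags follow NVIDIA's published instructions and may need adjusting:

pip install torch torchvision
pip install lmdb overrides
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./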

Decoupling Global and Local Representations from/for Image Generation

Switch Operation

CelebA-HQ Samples

Running Experiments

First go to the experiments directory:

cd experiments

Training a new CIFAR-10 model:

python -u train.py \
    --config configs/cifar10/glow-gaussian-uni.json \
    --epochs 15000 --valid_epochs 10 \
    --batch_size 512 --batch_steps 2 --eval_batch_size 1000 --init_batch_size 2048 \
    --lr 0.001 --beta1 0.9 --beta2 0.999 --eps 1e-8 --warmup_steps 50 --weight_decay 1e-6 --grad_clip 0 \
    --image_size 32 --n_bits 8 \
    --data_path <data path> --model_path <model path>

The hyper-parameters for other datasets are provided in the paper.

Note:

  • Config files, including refined versions of Glow and MaCow, are provided in the experiments/configs directory.
  • The argument --batch_steps is used for gradient accumulation to trade speed for memory. Each data batch is split into segments of size batch_size / (num_gpus * batch_steps); for example, with --batch_size 512, one GPU, and --batch_steps 2, each segment contains 256 images.
  • For distributed training on multiple GPUs, please use distributed.py or slurm.py, and refer to the PyTorch distributed data parallel training tutorial.
  • Please check the details of the arguments in train.py.

MaCow: Masked Convolutional Generative Flow

We also implement the MaCow model with distributed training support. To train a new MaCow model, please use the corresponding MaCow config files for different datasets.
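
As a sketch, the invocation mirrors the CIFAR-10 command above with a MaCow config swapped in; the config path below is a placeholder, and the remaining hyper-parameters (epochs, batch size, learning rate, etc.) should follow the MaCow paper:

cd experiments
python -u train.py \
    --config configs/cifar10/<macow config>.json \
    --data_path <data path> --model_path <model path>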

References

@InProceedings{decoupling2021,
    title = {Decoupling Global and Local Representations via Invertible Generative Flows},
    author = {Ma, Xuezhe and Kong, Xiang and Zhang, Shanghang and Hovy, Eduard},
    booktitle = {Proceedings of the 9th International Conference on Learning Representations (ICLR-2021)},
    year = {2021},
    month = {May},
}

@incollection{macow2019,
    title = {MaCow: Masked Convolutional Generative Flow},
    author = {Ma, Xuezhe and Kong, Xiang and Zhang, Shanghang and Hovy, Eduard},
    booktitle = {Advances in Neural Information Processing Systems 32 (NeurIPS-2019)},
    year = {2019},
    publisher = {Curran Associates, Inc.}
}
