A PyTorch implementation of WGAN with gradient penalty (WGAN-GP) and the two time-scale update rule (TTUR).
- WGAN: https://arxiv.org/abs/1701.07875
- Improved Training of Wasserstein GANs (gradient penalty): https://arxiv.org/abs/1704.00028
- GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium (TTUR): https://arxiv.org/abs/1706.08500
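The core idea of the second paper is to replace WGAN's weight clipping with a penalty on the critic's gradient norm, computed on random interpolations between real and fake samples. A minimal sketch of that penalty term (illustrative, not this repo's exact code; the function name and shapes are assumptions):

```python
import torch

def gradient_penalty(critic, real, fake, device="cpu"):
    """WGAN-GP: penalize deviation of the critic's gradient norm from 1,
    evaluated on random interpolates between real and fake batches."""
    batch_size = real.size(0)
    # One random mixing coefficient per sample, broadcast over image dims
    eps = torch.rand(batch_size, 1, 1, 1, device=device)
    interpolates = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interpolates)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interpolates,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return ((grad_norm - 1) ** 2).mean()
```

In training, this term (scaled by a coefficient, 10 in the paper) is added to the critic loss.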
- OS: Ubuntu 16.04
- Language: Python
- Packages: torch, torchvision, numpy, tensorflow (for tensorboard)
- Download CelebA from http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html (the Google Drive link)
- Among the downloaded files, find the zip archive named
img_align_celeba.zip
, which contains the training images
- Create a folder:
mkdir data/celeba/
- Unzip the archive:
unzip img_align_celeba.zip
and put the extracted images under the folder we just created: data/celeba/
Run the following command to train:
python main.py --dataset celeba --dataroot data/celeba --batch_size 64 --image_size 128 --niter 10000 --exp celeba_experiment
and monitor the logs in TensorBoard with
tensorboard --logdir .
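The TTUR part of the setup amounts to giving the critic and the generator separate optimizers with different learning rates, so the two networks update on different time scales. A minimal illustration (the networks are stand-ins and the learning rates are example values from the TTUR paper's regime, not necessarily this repo's exact settings):

```python
import torch

# Stand-in networks, just to show the optimizer wiring
generator = torch.nn.Linear(100, 3 * 128 * 128)
critic = torch.nn.Linear(3 * 128 * 128, 1)

# TTUR: the critic uses a larger learning rate than the generator.
# betas=(0.0, 0.9) follows the common WGAN-GP choice.
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4, betas=(0.0, 0.9))
opt_d = torch.optim.Adam(critic.parameters(), lr=3e-4, betas=(0.0, 0.9))
```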
The implementation borrows heavily from