
WGAN-TF2

Yet another simple implementation of GAN and Wasserstein GAN using TensorFlow 2.

Remarks

Following advice from the original GAN paper [Goo+14], we train G by maximizing ln[D(G(z))] rather than minimizing ln[1 − D(G(z))], which avoids vanishing gradients early in training when D can easily reject the generator's samples.
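
In TF2, this non-saturating loss reduces to a binary cross-entropy against an all-ones target. The sketch below is illustrative only; the `fake_logits` name and a logit-valued discriminator are assumptions, not the repo's actual code:

    import tensorflow as tf

    # Non-saturating generator loss: minimize -E[ln D(G(z))] rather than
    # E[ln(1 - D(G(z)))]. Assumes the discriminator outputs raw logits.
    bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

    def generator_loss(fake_logits):
        # Cross-entropy against an all-ones target equals -mean(ln D(G(z))).
        return bce(tf.ones_like(fake_logits), fake_logits)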

We implemented the WGAN-LP variant [PFL17] in place of the ordinary WGAN [ACB17]. The gradient penalty is computed by perturbing the concatenation of the real and fake batches with Gaussian noise.
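
A minimal TF2 sketch of such a penalty term follows; the function name, the noise scale sigma, and the penalty weight are illustrative assumptions rather than the repo's actual values:

    import tensorflow as tf

    def lp_gradient_penalty(critic, real, fake, sigma=0.1, weight=10.0):
        # Perturb the concatenation of the real and fake batches with
        # Gaussian noise (sigma is an assumed scale), then penalize critic
        # gradient norms that exceed 1, as in WGAN-LP [PFL17].
        x = tf.concat([real, fake], axis=0)
        x_hat = x + sigma * tf.random.normal(tf.shape(x))
        with tf.GradientTape() as tape:
            tape.watch(x_hat)
            scores = critic(x_hat, training=True)
        grads = tape.gradient(scores, x_hat)
        # Per-sample gradient norms; the epsilon keeps sqrt differentiable at 0.
        flat = tf.reshape(grads, [tf.shape(grads)[0], -1])
        norms = tf.sqrt(tf.reduce_sum(tf.square(flat), axis=1) + 1e-12)
        # One-sided penalty max(0, ||grad|| - 1)^2, unlike WGAN-GP's (||grad|| - 1)^2.
        return weight * tf.reduce_mean(tf.square(tf.maximum(norms - 1.0, 0.0)))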

Usage

  • To train GAN on MNIST:

    python main.py --model GAN --dataset MNIST
  • To train WGAN on CIFAR-10:

    python main.py --model WGAN --dataset CIFAR10
  • To see the available parameters and their explanations:

    python main.py --help

Demo

For the MNIST dataset, we picked 16 seeds for each model to showcase how the generators evolve. We also randomly generated 100 samples after 100 epochs of training. The (median) objective function values for GAN and WGAN are displayed at the end.

[Figures: generator evolution and 100 random MNIST samples, GAN (left) and WGAN (right)]

Likewise, here we can see how the models evolve when trained on the CIFAR-10 dataset. The generators turn out to be quite unstable on this data, so fine-tuning is needed to get decent results.

[Figures: generator evolution on CIFAR-10, GAN (left) and WGAN (right)]

Acknowledgment

The following tutorials/repos have provided immense help for this implementation:

References

  • [ACB17] Martin Arjovsky, Soumith Chintala, and Léon Bottou. Wasserstein GAN. 2017. arXiv: 1701.07875 [stat.ML].

  • [Goo+14] Ian Goodfellow et al. Generative Adversarial Nets. In: Advances in Neural Information Processing Systems 27. 2014.

  • [Gul+17] Ishaan Gulrajani et al. Improved Training of Wasserstein GANs. 2017. arXiv: 1704.00028 [cs.LG].

  • [Kod+17] Naveen Kodali et al. On Convergence and Stability of GANs. 2017. arXiv: 1705.07215 [cs.AI].

  • [PFL17] Henning Petzka, Asja Fischer, and Denis Lukovnicov. On the regularization of Wasserstein GANs. 2017. arXiv: 1709.08894 [stat.ML].

  • [Sal+16] Tim Salimans et al. Improved Techniques for Training GANs. 2016. arXiv: 1606.03498 [cs.LG].
