
BigGAN-tensorflow

Reimplementation of the Paper: Large Scale GAN Training for High Fidelity Natural Image Synthesis

Introduction

This is a simple reimplementation of the great paper (BigGAN) Large Scale GAN Training for High Fidelity Natural Image Synthesis, which can generate very realistic images. However, due to my limited hardware 😭, I only train on 32x32 images from CIFAR-10 and 64x64 images from ImageNet64. Be aware that the training procedure is quite slow.

From the paper:

Dataset

  1. 32x32 images: CIFAR-10: http://www.cs.toronto.edu/~kriz/cifar-10-matlab.tar.gz
  2. 64x64 images: ImageNet64: https://drive.google.com/open?id=1uN9O69eeqJEPV797d05ZuUmJ23kGVtfU

Download the datasets and put them into the folder 'dataset'.
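
As a reference for the expected layout, here is a minimal loader for the CIFAR-10 MATLAB batches, assuming the archive has been extracted to dataset/cifar-10-batches-mat/ (the path and the helper name are illustrative; the repository's own data pipeline may differ):

```python
import os
import numpy as np
from scipy.io import loadmat

def load_cifar10_mat(data_dir="dataset/cifar-10-batches-mat"):
    """Load the five CIFAR-10 training batches (MATLAB format) as NHWC uint8 arrays."""
    images, labels = [], []
    for i in range(1, 6):
        batch = loadmat(os.path.join(data_dir, f"data_batch_{i}.mat"))
        # 'data' is [10000, 3072] with channels first; reshape to [N, 32, 32, 3].
        x = batch["data"].reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)
        images.append(x)
        labels.append(batch["labels"].flatten())
    return np.concatenate(images), np.concatenate(labels)
```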

Architecture

Results

32x32 CIFAR-10

Configuration:

Training iterations: 100,000
Truncation threshold: 1.0

|                       | Discriminator | Generator |
| --------------------- | ------------- | --------- |
| Update steps          | 2             | 1         |
| Learning rate         | 4e-4          | 1e-4      |
| Orthogonal reg        | ✔️            | ✔️        |
| Orthogonal init       | ✔️            | ✔️        |
| Hierarchical latent   |               | ✔️        |
| Projection batchnorm  |               | ✔️        |
| Truncation threshold  |               | ✔️        |
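
The schedule above is a two-timescale setup: two discriminator updates per generator update, with separate Adam learning rates. Below is a minimal TF2-style sketch of one such training step using the hinge loss; the `generator` and `discriminator` models, the latent size `z_dim`, and the Adam betas are illustrative assumptions, not this repository's exact code.

```python
import tensorflow as tf

# Learning rates from the table; the betas follow the BigGAN paper (assumption).
d_opt = tf.keras.optimizers.Adam(4e-4, beta_1=0.0, beta_2=0.999)
g_opt = tf.keras.optimizers.Adam(1e-4, beta_1=0.0, beta_2=0.999)

def train_step(generator, discriminator, real_images, labels, z_dim=128):
    batch = tf.shape(real_images)[0]

    # Two discriminator updates per generator update (hinge loss).
    for _ in range(2):
        z = tf.random.normal([batch, z_dim])
        with tf.GradientTape() as tape:
            fake = generator([z, labels], training=True)
            d_real = discriminator([real_images, labels], training=True)
            d_fake = discriminator([fake, labels], training=True)
            d_loss = (tf.reduce_mean(tf.nn.relu(1.0 - d_real))
                      + tf.reduce_mean(tf.nn.relu(1.0 + d_fake)))
        grads = tape.gradient(d_loss, discriminator.trainable_variables)
        d_opt.apply_gradients(zip(grads, discriminator.trainable_variables))

    # One generator update.
    z = tf.random.normal([batch, z_dim])
    with tf.GradientTape() as tape:
        fake = generator([z, labels], training=True)
        g_loss = -tf.reduce_mean(discriminator([fake, labels], training=True))
    grads = tape.gradient(g_loss, generator.trainable_variables)
    g_opt.apply_gradients(zip(grads, generator.trainable_variables))
    return d_loss, g_loss
```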

Generation:

Truncation threshold = 1.0: slight mode collapse because the truncation threshold is too small (see the truncated-sampling sketch below).

Truncation threshold = 2.0.

car2plane ship2horse cat2bird
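
For context, the truncation trick samples the latent from a truncated normal: every component whose magnitude exceeds the threshold is resampled, so a smaller threshold trades diversity for fidelity, which is why 1.0 shows mode collapse here. A small NumPy sketch of such a sampler (the function is illustrative, not taken from this repository):

```python
import numpy as np

def truncated_z(batch_size, z_dim, threshold=2.0, seed=None):
    """Sample z ~ N(0, I) and resample every entry with |z| > threshold."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((batch_size, z_dim))
    out_of_range = np.abs(z) > threshold
    while out_of_range.any():
        z[out_of_range] = rng.standard_normal(out_of_range.sum())
        out_of_range = np.abs(z) > threshold
    return z
```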

64x64 ImageNet

Configuration:

Training iterations: 100,000

|                       | Discriminator | Generator |
| --------------------- | ------------- | --------- |
| Update steps          | 2             | 1         |
| Learning rate         | 4e-4          | 1e-4      |
| Orthogonal reg        | ✔️            | ✔️        |
| Orthogonal init       | ✔️            | ✔️        |
| Hierarchical latent   |               | ✔️        |
| Projection batchnorm  |               | ✔️        |
| Truncation threshold  |               | ✔️        |
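
Regarding the 'Hierarchical latent' and 'Projection batchnorm' rows: the noise vector is split into chunks that are fed to each generator block, and the class embedding (concatenated with a noise chunk) is linearly projected to per-channel batch-norm gain and bias. A minimal Keras-style sketch of such a conditional batch norm, under those assumptions (the layer and its interface are illustrative, not this repository's implementation):

```python
import tensorflow as tf

class ConditionalBatchNorm(tf.keras.layers.Layer):
    """Batch norm whose gain and bias are linear projections of a condition
    vector (class embedding concatenated with one chunk of the latent)."""

    def __init__(self, channels):
        super().__init__()
        self.bn = tf.keras.layers.BatchNormalization(center=False, scale=False)
        self.to_gamma = tf.keras.layers.Dense(channels, bias_initializer="ones")
        self.to_beta = tf.keras.layers.Dense(channels)

    def call(self, x, condition, training=False):
        gamma = self.to_gamma(condition)[:, None, None, :]  # per-sample channel gain
        beta = self.to_beta(condition)[:, None, None, :]    # per-sample channel bias
        return self.bn(x, training=training) * gamma + beta
```

In a generator block, `condition` would be something like `tf.concat([class_embedding, z_chunk], axis=-1)`, where `z_chunk` is that block's slice of the hierarchical latent.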

Iteration: 30,000 | Iteration: 60,000 | still training...

To be continued.
