
HiGAN - Semantic Hierarchy Emerges in Deep Generative Representations for Scene Synthesis

Requirements: Python 3.7.6, PyTorch 1.4.0, TensorFlow 1.14.0, CUDA 10.1, scikit-learn 0.22.1

Figure: Scene manipulation at different abstraction levels, including layout, categorical objects, and scene attributes.

Semantic Hierarchy Emerges in Deep Generative Representations for Scene Synthesis
Ceyuan Yang*, Yujun Shen*, Bolei Zhou
International Journal of Computer Vision (IJCV) 2020

In this repository, we propose an effective framework, termed HiGAN, to interpret the semantics learned by GANs for scene synthesis. It turns out that GAN models which employ layer-wise latent codes spontaneously encode semantics of different abstraction levels in the latent space in a hierarchical manner. Identifying the most relevant variation factors significantly facilitates scene manipulation.

[Paper] [Project Page] [Demo] [Colab-Church] [Colab-Bedroom]

Usage of Semantic Manipulation

A simple example of manipulating the "indoor lighting" of a bedroom:

python simple_manipulate.py stylegan_bedroom indoor_lighting

The manipulation results are saved to manipulation_results/stylegan_bedroom_indoor_lighting.html; images can be downloaded directly from that HTML page.


Users can also build their own manipulation tool with the script manipulate.py. First, a boundary list is required; see the sample below:

(indoor_lighting, w): boundaries/stylegan_bedroom/indoor_lighting_boundary.npy
(wood, w): boundaries/stylegan_bedroom/wood_boundary.npy

Execute the following command for manipulation:

LAYERS=6-11
python manipulate.py $MODEL_NAME $BOUNDARY_LIST \
    --num=10 \
    --layerwise_manipulation \
    --manipulate_layers=$LAYERS \
    --generate_html
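Under the hood, layer-wise manipulation shifts only the latent codes of the selected layers along the semantic boundary, leaving the other layers untouched. The sketch below illustrates the idea with NumPy; it is a minimal sketch, not the actual implementation in manipulate.py, and the file paths, array shapes (layer-wise codes of shape [N, num_layers, 512]), and helper names are illustrative.

import numpy as np

# Illustrative paths/shapes: layer-wise codes [N, 14, 512], boundary [1, 512].
wp_codes = np.load('stylegan_bedroom/wp.npy')
boundary = np.load('boundaries/stylegan_bedroom/indoor_lighting_boundary.npy')
boundary = boundary / np.linalg.norm(boundary)   # unit semantic direction

def layerwise_manipulate(wp, direction, layers, strength):
    # Shift only the selected layers along the direction; others stay untouched.
    wp = wp.copy()
    wp[:, layers, :] += strength * direction
    return wp

# Mimic --manipulate_layers=6-11 with a positive editing strength.
edited = layerwise_manipulate(wp_codes, boundary, slice(6, 12), 2.0)
# Feed `edited` back through the synthesis network to render the edited scenes.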

Pre-trained Models

Pre-trained GAN models: GAN Models.

Pre-trained predictors: Predictors.

Train on Your Own Data

Step-1: Synthesize images and get semantic predictions

MODEL_NAME=stylegan_bedroom
OUTPUT_DIR=stylegan_bedroom
python synthesize.py $MODEL_NAME \
    --output_dir=$OUTPUT_DIR \
    --num=500000 \
    --generate_prediction \
    --logfile_name=synthesis.log
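Conceptually, this step samples latent codes, renders the corresponding images, and scores each image with an off-the-shelf semantic predictor, saving the latent codes (w.npy) and per-image scores (attribute.npy) for the next step. The sketch below is a schematic of that loop rather than the repository's implementation; generator.map, generator.synthesize, and predictor are placeholders for the actual model wrappers.

import numpy as np

def synthesize_and_predict(generator, predictor, num=500000, batch=16):
    # Sample latent codes, render images, and score them with a semantic predictor.
    all_w, all_scores = [], []
    for _ in range(0, num, batch):
        z = np.random.randn(batch, 512)       # latent noise
        w = generator.map(z)                  # mapping network: z -> w
        images = generator.synthesize(w)      # synthesis network: w -> images
        scores = predictor(images)            # per-image semantic scores
        all_w.append(w)
        all_scores.append(scores)
    np.save('stylegan_bedroom/w.npy', np.concatenate(all_w))
    np.save('stylegan_bedroom/attribute.npy', np.concatenate(all_scores))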

Step-2: Boundary search for potential candidates (repeat for each candidate semantic)

BOUNDARY_NAME=indoor_lighting
python train_boundary.py $OUTPUT_DIR/w.npy $OUTPUT_DIR/attribute.npy \
    --score_name=$BOUNDARY_NAME \
    --output_dir=$OUTPUT_DIR \
    --logfile_name=${BOUNDARY_NAME}_training.log
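train_boundary.py searches for a separating hyperplane in the latent space for the given semantic. A common recipe in this line of work is to take the highest- and lowest-scored samples as positives and negatives, fit a linear classifier, and keep the unit normal of the resulting hyperplane as the boundary. A hedged sketch with scikit-learn (the sample ratio and attribute column are illustrative):

import numpy as np
from sklearn.svm import LinearSVC

def train_boundary(w, scores, ratio=0.02):
    # Take the highest- and lowest-scored samples as positives/negatives,
    # fit a linear SVM, and return the unit normal of the hyperplane.
    order = np.argsort(scores)
    num = int(len(scores) * ratio)            # fraction of extreme samples (illustrative)
    codes = np.concatenate([w[order[-num:]], w[order[:num]]])
    labels = np.concatenate([np.ones(num), np.zeros(num)])
    svm = LinearSVC(C=1.0, max_iter=10000).fit(codes, labels)
    direction = svm.coef_.reshape(1, -1)
    return direction / np.linalg.norm(direction)

w = np.load('stylegan_bedroom/w.npy')                     # [N, 512]
scores = np.load('stylegan_bedroom/attribute.npy')[:, 0]  # column index is illustrative
np.save('indoor_lighting_boundary.npy', train_boundary(w, scores))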

Step-3: Rescore to identify the most relevant semantics

Use the following command to conduct the layer-wise analysis and identify the relevant semantics:

BOUNDARY_LIST=stylegan_bedroom/boundary_list.txt
python rescore.py $MODEL_NAME $BOUNDARY_LIST \
    --output_dir $OUTPUT_DIR \
    --layerwise_rescoring \
    --logfile_name=rescore.log
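Rescoring quantifies how much each boundary actually controls its semantic, and at which layers: the latent codes are pushed along a boundary at a given layer range, the images are re-synthesized and re-scored by the predictor, and the change in the predicted score is recorded. Boundaries that produce large changes at specific layers correspond to the most relevant semantics and reveal their abstraction level. A schematic sketch, reusing the placeholder generator, predictor, and layerwise_manipulate from the earlier sketches; a real predictor would output all semantics, from which the column matching each boundary is compared.

import numpy as np

def rescore(generator, predictor, wp, boundaries, layer_ranges, strength=2.0):
    # Measure how much each boundary changes its semantic score when applied
    # at different layer ranges; large changes mark the relevant semantics.
    base = predictor(generator.synthesize(wp))
    results = {}
    for name, direction in boundaries.items():
        for layers in layer_ranges:           # e.g. [slice(0, 2), slice(2, 6), slice(6, 12)]
            edited = layerwise_manipulate(wp, direction, layers, strength)
            new = predictor(generator.synthesize(edited))
            results[(name, layers.start, layers.stop)] = float(np.mean(new - base))
    return results  # rank by magnitude to pick the most relevant semantics per level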

BibTeX

@article{yang2019semantic,
  title   = {Semantic hierarchy emerges in deep generative representations for scene synthesis},
  author  = {Yang, Ceyuan and Shen, Yujun and Zhou, Bolei},
  journal = {IJCV},
  year    = {2020}
}
