# ML - Generative Adversarial Network (GAN)

| Paper | Conference | Remarks |
| --- | --- | --- |
| Generative Adversarial Nets | NIPS 2014 | A new framework for estimating generative models via an adversarial process, in which two models are trained simultaneously: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than from G. The minimax objective and a training-loop sketch follow the table. |
| Improved Techniques for Training GANs | NIPS 2016 | Presents a variety of new architectural features and training procedures that apply to the generative adversarial networks (GANs) framework. |
| Tutorial - Generative Adversarial Network | NIPS Workshop 2016 | 1. Why generative modeling is a topic worth studying. 2. How generative models work, and how GANs compare to other generative models. 3. The details of how GANs work. 4. Research frontiers in GANs. 5. State-of-the-art image models that combine GANs with other methods. |
| Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks | ICLR 2016 | Introduces a class of CNNs called deep convolutional generative adversarial networks (DCGANs) with certain architectural constraints, and demonstrates that they are a strong candidate for unsupervised learning. |
| Generative Adversarial Networks: An Overview | IEEE SPM 2017 | 1. Provides an overview of GANs for the signal processing community, drawing on familiar analogies and concepts where possible. 2. Points out remaining challenges. |
| Image-to-Image Translation with Conditional Adversarial Networks | CVPR 2017 | 1. Casts conditional adversarial networks as a general-purpose solution to image-to-image translation problems. 2. The approach is effective at synthesizing photos from label maps, reconstructing objects from edge maps, and colorizing images, among other tasks. |
| Wasserstein GAN | ICML 2017 | 1. Uses the Wasserstein distance as the distance between two distributions, ensuring continuity and differentiability of the loss. 2. WGAN improves the stability of learning, mitigates problems like mode collapse, and provides meaningful learning curves useful for debugging and hyperparameter searches. The dual objective follows the table. |
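
For reference, the minimax objective from the Generative Adversarial Nets paper, where $p_{\mathrm{data}}$ is the data distribution and $p_z$ is the noise prior:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z(z)}\left[\log\left(1 - D(G(z))\right)\right]
```

Below is a minimal sketch of the alternating training loop this objective implies, assuming PyTorch. The architectures, hyperparameters, and synthetic "real" data are illustrative stand-ins, not taken from the papers above.

```python
# Minimal GAN training sketch (assumes PyTorch; models and
# hyperparameters are illustrative, not from the papers above).
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 2

# Generator G: maps noise z ~ p_z to samples meant to mimic the data.
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
# Discriminator D: outputs the probability that a sample is real.
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

def sample_real(batch):
    # Stand-in "real" data: a shifted 2-D Gaussian (a real task would
    # sample from a dataset instead).
    return torch.randn(batch, data_dim) * 0.5 + 2.0

for step in range(1000):
    batch = 64
    real = sample_real(batch)
    z = torch.randn(batch, latent_dim)
    fake = G(z)

    # Discriminator step: maximize log D(x) + log(1 - D(G(z))).
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(batch, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Generator step: the non-saturating variant from the 2014 paper,
    # which maximizes log D(G(z)) instead of minimizing log(1 - D(G(z))).
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
```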
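
For the Wasserstein GAN row, the dual form of the objective from the paper replaces the probability-estimating discriminator with a critic $D$ constrained to the set $\mathcal{D}$ of 1-Lipschitz functions:

```latex
\min_G \max_{D \in \mathcal{D}} \;
  \mathbb{E}_{x \sim p_{\mathrm{data}}}\left[D(x)\right]
  - \mathbb{E}_{z \sim p_z}\left[D(G(z))\right]
```

Note that the critic outputs an unbounded score rather than a probability; the original paper enforces the Lipschitz constraint by clipping the critic's weights to a small range after each update.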

Back to index