Improved generator objectives for GANs

Jascha Sohl-Dickstein
NIPS Workshop on Adversarial Learning (2016)

Abstract

We present a new framework for understanding GAN training as alternating between density ratio estimation and divergence minimization. This provides a new interpretation of the generator objective used in practice and explains the problem of poor sample diversity. We further derive a family of generator objectives that target arbitrary f-divergences without minimizing a lower bound, and use them to train generative image models that target either improved sample quality or greater sample diversity.
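
To make the idea concrete, here is a minimal, hedged sketch (not the authors' code; the function names ratio_from_logit, f_kl, f_reverse_kl, and generator_loss are invented for illustration). Under the standard logistic loss, an optimal discriminator's output satisfies sigmoid(logit) = p_data / (p_data + p_gen), so its logit estimates the log density ratio log(p_data/p_gen). Given that ratio estimate, the generator can directly descend a Monte Carlo estimate of any f-divergence E_{x~p_gen}[f(p_data(x)/p_gen(x))], which is the sense in which the objectives target f-divergences without going through a variational lower bound.

```python
import math

# Illustrative sketch only: f-divergence generator losses built from a
# density-ratio estimate. Names and structure are assumptions, not the
# paper's implementation.

def ratio_from_logit(logit):
    # With a logistic discriminator, sigmoid(logit) ~= p_data / (p_data + p_gen),
    # so the density ratio p_data / p_gen is approximately exp(logit).
    return math.exp(logit)

# Convex functions f defining D_f(p_data || p_gen) = E_{x ~ p_gen}[ f(r(x)) ].
def f_kl(r):
    # KL(p_data || p_gen): mode-covering, tends to favor sample diversity.
    return r * math.log(r)

def f_reverse_kl(r):
    # KL(p_gen || p_data): mode-seeking, tends to favor sample quality.
    return -math.log(r)

def generator_loss(discriminator_logits_on_samples, f):
    # Monte Carlo estimate of E_{x ~ p_gen}[ f(r(x)) ] over generated samples.
    ratios = [ratio_from_logit(l) for l in discriminator_logits_on_samples]
    return sum(f(r) for r in ratios) / len(ratios)

if __name__ == "__main__":
    # Toy discriminator logits on a batch of generated samples.
    logits = [-0.3, 0.1, -1.2, 0.4]
    print("KL-targeting generator loss:        ", generator_loss(logits, f_kl))
    print("Reverse-KL-targeting generator loss:", generator_loss(logits, f_reverse_kl))
```

In a real training loop the gradient would flow through the generated samples rather than the discriminator, and the choice of f sets the trade-off the abstract describes: a mode-covering f pushes toward diversity, a mode-seeking f toward per-sample quality.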
