f-GANs in an information geometric nutshell

Richard Nock, Zac Cranko, Aditya Krishna Menon, Lizhen Qu, Robert C. Williamson

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

12 Citations (Scopus)

Abstract

Nowozin et al. recently showed how to extend the GAN principle to all f-divergences. The approach is elegant but falls short of a full description of the supervised game, and says little about the key player, the generator: for example, what does the generator actually converge to if solving the GAN game means convergence in some space of parameters? How does that provide hints on the generator's design, and how does it compare to the flourishing but almost exclusively experimental literature on the subject? In this paper, we unveil a broad class of distributions for which such convergence happens, namely deformed exponential families, a wide superset of exponential families. We show that current deep architectures are able to factorize a very large number of such densities using an especially compact design, hence displaying the power of deep architectures and their concinnity in the f-GAN game. This result holds given a sufficient condition on activation functions, which turns out to be satisfied by popular choices. The key to our results is a variational generalization of an old theorem that relates the KL divergence between regular exponential families and divergences between their natural parameters. We complete this picture with additional results and experimental insights on how these results may be used to ground further improvements of GAN architectures, via (i) a principled design of the activation functions in the generator and (ii) an explicit integration of proper composite losses' link function in the discriminator.
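The f-GAN principle the abstract builds on rests on the variational lower bound D_f(P||Q) >= E_P[T(x)] - E_Q[f*(T(x))], where f* is the Fenchel conjugate of f and T is the discriminator (critic). A minimal numpy sketch, assuming the KL divergence (f(u) = u log u, so f*(t) = exp(t - 1)) and two unit-variance Gaussians for which the optimal critic is known in closed form; the helper names `fstar_kl` and `kl_lower_bound` are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fstar_kl(t):
    # Fenchel conjugate of f(u) = u log u, i.e. f*(t) = exp(t - 1)
    return np.exp(t - 1.0)

def kl_lower_bound(xs_p, xs_q, T):
    # Monte-Carlo estimate of the variational bound
    #   E_P[T(x)] - E_Q[f*(T(x))]  <=  KL(P || Q)
    return T(xs_p).mean() - fstar_kl(T(xs_q)).mean()

# Example: P = N(1, 1), Q = N(0, 1), for which the true KL is 0.5.
xs_p = rng.normal(1.0, 1.0, 200_000)
xs_q = rng.normal(0.0, 1.0, 200_000)

# The bound is tight at T*(x) = 1 + log(p(x)/q(x)) = 1 + (x - 0.5) here.
T_opt = lambda x: 1.0 + (x - 0.5)
est = kl_lower_bound(xs_p, xs_q, T_opt)
print(est)  # close to the true KL of 0.5
```

In the f-GAN game the critic T is a neural network trained to push this bound up while the generator reshapes Q to push it down; the paper's contribution is characterizing what the generator converges to (a deformed exponential family) when this game is solved.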

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 30 (NIPS 2017)
Editors: I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, R. Garnett
Place of publication: San Diego CA USA
Publisher: Neural Information Processing Systems (NIPS)
Number of pages: 9
Volume: 30
Publication status: Published - 2017
Externally published: Yes
Event: Advances in Neural Information Processing Systems 2017 - Long Beach, United States of America
Duration: 4 Dec 2017 - 9 Dec 2017
Conference number: 30th
https://dl.acm.org/doi/proceedings/10.5555/3295222 (Proceedings)

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: Advances in Neural Information Processing Systems 2017
Abbreviated title: NIPS 2017
Country/Territory: United States of America
City: Long Beach
Period: 4/12/17 - 9/12/17
