Separable space
Maxima and minima
Code (set theory)
Structural basin
Regular polygon
Computer science
Class (philosophy)
Pairing
Mathematics
Algorithm
Geology
Artificial intelligence
Physics
Geometry
Mathematical analysis
Quantum mechanics
Paleontology
Set (abstract data type)
Superconductivity
Programming language
Authors
Ruoyu Sun, Tiantian Fang, Alexander G. Schwing
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Cited by: 8
Identifiers
DOI: 10.48550/arxiv.2011.04926
Abstract
Understanding of GAN training is still very limited. One major challenge is its non-convex-non-concave min-max objective, which may lead to sub-optimal local minima. In this work, we perform a global landscape analysis of the empirical loss of GANs. We prove that a class of separable-GAN, including the original JS-GAN, has exponentially many bad basins which are perceived as mode-collapse. We also study the relativistic pairing GAN (RpGAN) loss which couples the generated samples and the true samples. We prove that RpGAN has no bad basins. Experiments on synthetic data show that the predicted bad basin can indeed appear in training. We also perform experiments to support our theory that RpGAN has a better landscape than separable-GAN. For instance, we empirically show that RpGAN performs better than separable-GAN with relatively narrow neural nets. The code is available at https://github.com/AilsaF/RS-GAN.
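To make the contrast between the two loss families in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation (their official code is at https://github.com/AilsaF/RS-GAN). It assumes `d_real` and `d_fake` are raw, pre-sigmoid discriminator scores on matched batches of true and generated samples; the function names are illustrative.

```python
import torch
import torch.nn.functional as F

def separable_gan_losses(d_real, d_fake):
    """Separable (JS-GAN style) losses: the real and fake terms are
    independent sums, so each sample is scored in isolation."""
    ones = torch.ones_like(d_real)
    zeros = torch.zeros_like(d_fake)
    d_loss = (F.binary_cross_entropy_with_logits(d_real, ones)
              + F.binary_cross_entropy_with_logits(d_fake, zeros))
    # Non-saturating generator loss on the fake scores alone.
    g_loss = F.binary_cross_entropy_with_logits(d_fake, ones)
    return d_loss, g_loss

def rpgan_losses(d_real, d_fake):
    """Relativistic pairing (RpGAN) losses: each true sample is paired with
    a generated sample, and only the score gap D(x) - D(G(z)) matters."""
    diff = d_real - d_fake
    d_loss = F.softplus(-diff).mean()  # -log sigmoid(D(x) - D(G(z)))
    g_loss = F.softplus(diff).mean()   # -log sigmoid(D(G(z)) - D(x))
    return d_loss, g_loss

# Example with random logits standing in for discriminator outputs.
d_real = torch.randn(64)   # D(x) on a batch of true samples
d_fake = torch.randn(64)   # D(G(z)) on a batch of generated samples
print(rpgan_losses(d_real, d_fake))
```

The separable loss adds independent per-sample real and fake terms, whereas the RpGAN loss depends only on the pairwise difference between real and fake scores; this coupling of generated and true samples is the property the abstract refers to.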