Lipschitz continuity
Image (mathematics)
Invertible matrix
Translation (biology)
Computer science
Algorithm
Artificial neural network
Image translation
Mathematics
Artificial intelligence
Theoretical computer science
Pattern recognition (psychology)
Pure mathematics
Gene
Messenger RNA
Biochemistry
Chemistry
Authors
Longquan Dai, Jinhui Tang
Identifier
DOI: 10.1109/TPAMI.2021.3062849
Abstract
We propose iFlowGAN, which learns an invertible flow (a sequence of invertible mappings) via adversarial learning and exploits it to transform a source distribution into a target distribution for unsupervised image-to-image translation. Existing GAN-based generative models such as CycleGAN [1], StarGAN [2], AGGAN [3], and CyCADA [4] need to learn a highly under-constrained forward mapping F: X → Y from a source domain X to a target domain Y. Researchers constrain this problem by assuming there is a backward mapping B: Y → X such that x and y are fixed points of the composite functions B∘F and F∘B. Inspired by zero-order reverse filtering [5], we (1) interpret F via contraction mappings on a metric space; (2) provide a simple yet effective algorithm that expresses B through the parameters of F in light of the Banach fixed-point theorem; and (3) present a Lipschitz-regularized network that suggests a general approach to composing the inverse of arbitrary Lipschitz-regularized networks via the Banach fixed-point theorem. This network is useful for image-to-image translation tasks because it saves the memory otherwise needed for the weights of B. Although memory can also be saved by directly coupling the weights of the forward and backward mappings, the performance of the image-to-image translation network then degrades significantly. This explains why current GAN-based generative models, including CycleGAN, must use different parameters for the forward and backward mappings rather than building both from the same weights. Taking advantage of the Lipschitz-regularized network, we not only build iFlowGAN to address the parameter redundancy of CycleGAN but also assemble the corresponding iFlowGAN versions of StarGAN, AGGAN, and CyCADA without breaking their network architectures. Extensive experiments show that the iFlowGAN versions produce results comparable to the original implementations while using half the parameters.
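The fixed-point construction described in the abstract can be made concrete with a small sketch. Assuming, purely for illustration, that each invertible mapping in the flow is a residual block F(x) = x + g(x) whose residual branch g is Lipschitz-regularized with constant below 1 (the abstract does not pin down the exact architecture), the zero-order reverse-filtering update x_{k+1} = x_k + (y − F(x_k)) reduces to x_{k+1} = y − g(x_k). That update is a contraction, so by the Banach fixed-point theorem it converges to F⁻¹(y), and the backward mapping B is obtained from the parameters of F without storing any weights of its own. The PyTorch snippet below is a minimal sketch under these assumptions; the class and function names, layer sizes, and the 0.9 Lipschitz bound are illustrative choices, not the authors' released code.

```python
# Minimal sketch (illustrative, not the paper's implementation): invert a
# Lipschitz-regularized residual block with Banach fixed-point iteration,
# so the backward mapping reuses the forward mapping's parameters.

import torch
import torch.nn as nn


def rescale_to_spectral_norm(linear: nn.Linear, target: float) -> nn.Linear:
    # Rescale the weight so its largest singular value equals `target`,
    # which bounds the layer's Lipschitz constant by `target`.
    with torch.no_grad():
        sigma = torch.linalg.matrix_norm(linear.weight, ord=2)
        linear.weight.mul_(target / sigma)
    return linear


class LipschitzResidualBlock(nn.Module):
    """F(x) = x + g(x) with Lip(g) < 1, so F is invertible and the inverse is
    reachable by fixed-point iteration using the very same weights."""

    def __init__(self, dim: int, hidden: int = 64, lip: float = 0.9):
        super().__init__()
        # ELU is 1-Lipschitz, so Lip(g) <= sqrt(lip) * 1 * sqrt(lip) = lip < 1.
        # (A trained model would maintain this bound, e.g. via spectral
        # normalization; here it is enforced once at construction.)
        self.g = nn.Sequential(
            rescale_to_spectral_norm(nn.Linear(dim, hidden), lip ** 0.5),
            nn.ELU(),
            rescale_to_spectral_norm(nn.Linear(hidden, dim), lip ** 0.5),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.g(x)

    @torch.no_grad()
    def inverse(self, y: torch.Tensor, n_iters: int = 100) -> torch.Tensor:
        # Zero-order reverse filtering x <- x + (y - F(x)) simplifies, for a
        # residual F, to the contraction x <- y - g(x); by the Banach
        # fixed-point theorem it converges to the unique x with F(x) = y.
        x = y.clone()
        for _ in range(n_iters):
            x = y - self.g(x)
        return x


if __name__ == "__main__":
    torch.manual_seed(0)
    block = LipschitzResidualBlock(dim=8)
    x = torch.randn(4, 8)
    y = block(x)              # forward mapping F
    x_rec = block.inverse(y)  # backward mapping B: same parameters as F
    print("max reconstruction error:", (x - x_rec).abs().max().item())
```

A smaller Lipschitz bound makes the inverse iteration converge in fewer steps but limits how far each block can move the distribution, which is one reason invertible flows of this kind stack many such blocks.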