Keywords
Computer science
Permutation
Image (mathematics)
Style (visual arts)
Artificial neural network
Artificial intelligence
Authors
Zhentan Zheng,Jianyi Liu,Nanning Zheng
Identifier
DOI: 10.1109/TMM.2022.3203220
Abstract
Style transfer is a useful image synthesis technique that can re-render a given image in another artistic style while preserving its content. The Generative Adversarial Network (GAN) is a widely adopted framework for this task because it represents local style patterns better than traditional Gram-matrix based methods. However, most previous methods rely on a sufficient amount of pre-collected style images to train the model. In this paper, we propose a novel Patch Permutation GAN (P$^{2}$-GAN) that can efficiently learn stroke style from a single style image. We use patch permutation to generate multiple training samples from the given style image, and we design a patch discriminator that can seamlessly process patch-wise images and natural images simultaneously. We also propose a criterion based on a local texture descriptor to quantitatively evaluate style transfer quality. Experimental results show that our method produces finer-quality re-renderings from a single style image with improved computational efficiency compared with many state-of-the-art methods.
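The patch-permutation idea in the abstract, slicing the single style image into patches and shuffling them to manufacture additional training samples, can be sketched as below. This is a minimal illustration under our own assumptions: the function name `patch_permute`, the NumPy array layout, and the non-overlapping grid of patches are hypothetical, not taken from the paper.

```python
import numpy as np

def patch_permute(style_img, patch_size, seed=None):
    """Shuffle non-overlapping patches of a single style image to
    synthesize a new training sample (hypothetical sketch, not the
    paper's exact sampling scheme)."""
    rng = np.random.default_rng(seed)
    h, w, c = style_img.shape
    ph, pw = h // patch_size, w // patch_size
    # Crop so the image divides evenly into the patch grid.
    img = style_img[:ph * patch_size, :pw * patch_size]
    # Reshape into a flat list of (patch_size, patch_size, c) patches.
    patches = (img.reshape(ph, patch_size, pw, patch_size, c)
                  .swapaxes(1, 2)
                  .reshape(-1, patch_size, patch_size, c))
    # Randomly permute patch order, then reassemble the image grid.
    shuffled = patches[rng.permutation(len(patches))]
    return (shuffled.reshape(ph, pw, patch_size, patch_size, c)
                    .swapaxes(1, 2)
                    .reshape(ph * patch_size, pw * patch_size, c))

# Example: each call yields a differently permuted sample from one image.
style = np.arange(8 * 8 * 3, dtype=float).reshape(8, 8, 3)
sample = patch_permute(style, patch_size=4, seed=0)
```

Because the permutation only reorders patches, local stroke statistics are preserved while global layout is destroyed, which is what lets a patch-level discriminator train on many distinct samples drawn from one style image.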