Computer science
Artificial intelligence
Image (mathematics)
Computer vision
Style (visual arts)
Pattern recognition (psychology)
Substrate (chemical analysis)
Materials science
Art
Literature
Composite material
Authors
Longqing Zhang,Zishang Wang,Jinwen He,Yixuan Li
Identifier
DOI:10.1109/icaica58456.2023.10405398
Abstract
Style transfer with convolutional neural networks (CNNs) is a powerful image processing technique: the style of one image is applied to another, producing a new image that retains the content of the original while adopting the style of the second. The process begins by defining preprocessing and postprocessing functions for the images. A VGG-19 model is then used to extract image features, and the synthesized image is treated as the set of trainable parameters, iteratively updated by minimizing a loss function. The loss used in the experiment is a weighted combination of content loss, style loss, and total variation loss, balancing the relative importance of preserving content, transferring style, and reducing noise in the synthesized image. Style is represented by the Gram matrices of the style-layer outputs. The results indicate that the synthesized image retains the scenery and objects of the content image while adopting the color palette of the style image.
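As a rough illustration of the pipeline the abstract describes, below is a minimal PyTorch sketch of Gram-matrix style transfer with a pretrained VGG-19. The specific layer indices (conv1_1–conv5_1 for style, conv4_2 for content), the loss weights, and the optimizer settings are assumptions chosen for illustration; the paper does not specify them.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

# Indices into vgg19().features; this layer choice is an assumption
# (the common Gatys et al. selection), not taken from the paper.
CONTENT_LAYERS = {21}               # conv4_2
STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 .. conv5_1

def extract(model, x):
    """Run x through the VGG-19 feature stack, collecting activations."""
    content, style = [], []
    for i, layer in enumerate(model):
        x = layer(x)
        if i in CONTENT_LAYERS:
            content.append(x)
        if i in STYLE_LAYERS:
            style.append(x)
    return content, style

def gram(x):
    """Size-normalized Gram matrix of a (1, C, H, W) feature map."""
    _, c, h, w = x.shape
    f = x.view(c, h * w)
    return f @ f.t() / (c * h * w)

def tv_loss(x):
    """Total variation: penalizes differences between neighboring pixels."""
    return ((x[..., 1:, :] - x[..., :-1, :]).abs().mean()
            + (x[..., :, 1:] - x[..., :, :-1]).abs().mean())

# Term weights are assumptions; the paper only says "weighted combination".
W_CONTENT, W_STYLE, W_TV = 1.0, 1e3, 10.0

def style_transfer(content_img, style_img, steps=500, lr=0.01):
    """content_img / style_img: preprocessed (1, 3, H, W) tensors."""
    model = vgg19(weights=VGG19_Weights.DEFAULT).features.eval()
    for p in model.parameters():
        p.requires_grad_(False)

    with torch.no_grad():
        target_content, _ = extract(model, content_img)
        _, style_feats = extract(model, style_img)
        target_grams = [gram(s) for s in style_feats]

    # The synthesized image itself is the set of trainable parameters.
    synth = content_img.clone().requires_grad_(True)
    opt = torch.optim.Adam([synth], lr=lr)

    for _ in range(steps):
        opt.zero_grad()
        content, style = extract(model, synth)
        l_content = sum(F.mse_loss(c, t) for c, t in zip(content, target_content))
        l_style = sum(F.mse_loss(gram(s), g) for s, g in zip(style, target_grams))
        loss = W_CONTENT * l_content + W_STYLE * l_style + W_TV * tv_loss(synth)
        loss.backward()
        opt.step()
    return synth.detach()
```

Initializing the synthesized image from the content image (rather than noise) is a common choice that speeds convergence and helps preserve content structure.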