Style transfer with a convolutional neural network (CNN) is a powerful image processing technique. The goal is to apply the style of one image to another, producing a synthesized image that retains the content of the original while adopting the style of the second. The process begins by defining preprocessing and postprocessing functions for the images. A pretrained VGG-19 network is then used to extract image features, and the synthesized image is updated iteratively: its pixels are treated as the model parameters, and gradient descent minimizes the loss function with respect to them. The loss function is a weighted combination of content loss, style loss, and total variation loss, which balances the relative importance of preserving content, transferring style, and reducing noise in the synthesized image. Style is expressed through the Gram matrix of each style layer's outputs. The results show that the synthesized image retains the scenery and objects of the content image while adopting the color palette of the style image.
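The pieces of the loss described above can be sketched as follows. This is a minimal NumPy illustration, not the experiment's actual implementation: it assumes feature maps shaped `(channels, height, width)`, uses a single feature map for both the content and style comparison (a real pipeline extracts features from several VGG-19 layers), and the function names (`gram_matrix`, `content_loss`, `style_loss`, `tv_loss`, `total_loss`) and the weight values are illustrative.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map.

    Entry (i, j) is the inner product of channels i and j, so the matrix
    captures which feature channels co-activate -- a proxy for texture and
    style that discards spatial layout.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return (f @ f.T) / (c * h * w)  # normalize so layers of different sizes are comparable

def content_loss(synth_feats, content_feats):
    # Squared error between feature maps of the synthesized and content images
    return np.mean((synth_feats - content_feats) ** 2)

def style_loss(synth_feats, style_gram):
    # Squared error between Gram matrices; style_gram is precomputed once
    # from the style image, since the style image never changes
    return np.mean((gram_matrix(synth_feats) - style_gram) ** 2)

def tv_loss(img):
    # Total variation: penalizes differences between neighboring pixels,
    # which suppresses high-frequency noise in the synthesized image
    return 0.5 * (np.abs(img[:, 1:, :] - img[:, :-1, :]).mean() +
                  np.abs(img[:, :, 1:] - img[:, :, :-1]).mean())

def total_loss(synth_img, synth_feats, content_feats, style_grams,
               content_weight=1.0, style_weight=1e3, tv_weight=10.0):
    # Weighted combination; the weights here are placeholders, not values
    # from the experiment
    c = content_weight * content_loss(synth_feats, content_feats)
    s = style_weight * sum(style_loss(synth_feats, g) for g in style_grams)
    t = tv_weight * tv_loss(synth_img)
    return c + s + t
```

In an actual run, `synth_img` would be the trainable tensor, the feature maps would come from forward passes through the frozen VGG-19, and an optimizer would take gradient steps on `total_loss` with respect to the image pixels alone.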