ColoristaNet for Photorealistic Video Style Transfer
- URL: http://arxiv.org/abs/2212.09247v2
- Date: Wed, 21 Dec 2022 08:58:14 GMT
- Title: ColoristaNet for Photorealistic Video Style Transfer
- Authors: Xiaowen Qiu, Ruize Xu, Boan He, Yingtao Zhang, Wenqiang Zhang, Weifeng Ge
- Abstract summary: Photorealistic style transfer aims to transfer the artistic style of an image onto an input image or video while keeping photorealism.
We propose a self-supervised style transfer framework, which contains a style removal part and a style restoration part.
Experiments demonstrate that ColoristaNet can achieve better stylization effects when compared with state-of-the-art algorithms.
- Score: 15.38024996795316
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Photorealistic style transfer aims to transfer the artistic style of an image
onto an input image or video while keeping photorealism. In this paper, we
argue that it is the summary-statistics matching scheme in existing algorithms
that leads to unrealistic stylization. To avoid employing the popular Gram loss, we
propose a self-supervised style transfer framework, which contains a style
removal part and a style restoration part. The style removal network removes
the original image styles, and the style restoration network recovers image
styles in a supervised manner. Meanwhile, to address the problems in current
feature transformation methods, we propose decoupled instance normalization to
decompose feature transformation into style whitening and restylization. It
works quite well in ColoristaNet and can transfer image styles efficiently
while keeping photorealism. To ensure temporal coherency, we also incorporate
optical flow methods and ConvLSTM to embed contextual information. Experiments
demonstrate that ColoristaNet can achieve better stylization effects when
compared with state-of-the-art algorithms.
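The decoupled instance normalization described in the abstract splits feature transformation into style whitening and restylization. A minimal sketch of that idea (an illustrative simplification, not the authors' implementation: per-channel whitening of the content feature map, then restylization with the style feature map's channel statistics; the function name and NumPy formulation are assumptions):

```python
import numpy as np

def decoupled_instance_norm(content_feat, style_feat, eps=1e-5):
    """Illustrative decomposition of feature transformation into
    (1) style whitening and (2) restylization.

    content_feat, style_feat: feature maps of shape (C, H, W).
    """
    # Style whitening: remove the content feature's per-channel statistics.
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    whitened = (content_feat - c_mean) / (c_std + eps)

    # Restylization: impose the style feature's per-channel statistics.
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    return whitened * s_std + s_mean
```

In this simplified form, each channel of the output carries the style feature's mean and (approximately) its standard deviation, while the whitened content signal preserves spatial structure.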
Related papers
- Bridging Text and Image for Artist Style Transfer via Contrastive Learning [21.962361974579036]
We propose Contrastive Learning for Artistic Style Transfer (CLAST) to control arbitrary style transfer.
We introduce a supervised contrastive training strategy to effectively extract style descriptions from the image-text model.
We also propose novel and efficient adaLN-based state space models that explore style-content fusion.
arXiv Detail & Related papers (2024-10-12T15:27:57Z)
- PixelShuffler: A Simple Image Translation Through Pixel Rearrangement [0.0]
Style transfer is a widely researched application of image-to-image translation, where the goal is to synthesize an image that combines the content of one image with the style of another.
Existing state-of-the-art methods often rely on complex neural networks, including diffusion models and language models, to achieve high-quality style transfer.
We propose a novel pixel shuffle method that addresses the image-to-image translation problem generally with a specific demonstrative application in style transfer.
arXiv Detail & Related papers (2024-10-03T22:08:41Z)
- DiffStyler: Diffusion-based Localized Image Style Transfer [0.0]
Image style transfer aims to imbue digital imagery with the distinctive attributes of style targets, such as colors, brushstrokes, and shapes.
Despite the advancements in arbitrary style transfer methods, a prevalent challenge remains the delicate equilibrium between content semantics and style attributes.
This paper introduces DiffStyler, a novel approach that facilitates efficient and precise arbitrary image style transfer.
arXiv Detail & Related papers (2024-03-27T11:19:34Z)
- Portrait Diffusion: Training-free Face Stylization with Chain-of-Painting [64.43760427752532]
Face stylization refers to the transformation of a face into a specific portrait style.
Current methods require the use of example-based adaptation approaches to fine-tune pre-trained generative models.
This paper proposes a training-free face stylization framework, named Portrait Diffusion.
arXiv Detail & Related papers (2023-12-03T06:48:35Z)
- CCPL: Contrastive Coherence Preserving Loss for Versatile Style Transfer [58.020470877242865]
We devise a universally versatile style transfer method capable of performing artistic, photo-realistic, and video style transfer jointly.
We make a mild and reasonable assumption that global inconsistency is dominated by local inconsistencies and devise a generic Contrastive Coherence Preserving Loss (CCPL) applied to local patches.
CCPL can preserve the coherence of the content source during style transfer without degrading stylization.
arXiv Detail & Related papers (2022-07-11T12:09:41Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush, dip style from anywhere, and then paint to any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z)
- UMFA: A photorealistic style transfer method based on U-Net and multi-layer feature aggregation [0.0]
We propose a photorealistic style transfer network to emphasize the natural effect of photorealistic image stylization.
In particular, an encoder based on dense blocks and a decoder forming a symmetrical U-Net structure are jointly stacked to realize effective feature extraction and image reconstruction.
arXiv Detail & Related papers (2021-08-13T08:06:29Z)
- CAMS: Color-Aware Multi-Style Transfer [46.550390398057985]
Style transfer aims to manipulate the appearance of a source image, or "content" image, to share similar texture and colors of a target "style" image.
A commonly used approach to assist in transferring styles is based on Gram matrix optimization.
We propose a color-aware multi-style transfer method that generates aesthetically pleasing results while preserving the style-color correlation between style and generated images.
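The Gram matrix optimization mentioned above matches channel-wise inner products of deep feature maps between style and generated images. A minimal sketch of the Gram matrix itself (illustrative only, assuming a (C, H, W) feature map and the usual normalization by spatial size):

```python
import numpy as np

def gram_matrix(feat):
    """Gram matrix of a feature map of shape (C, H, W):
    inner products between channel activations, normalized
    by the number of spatial positions."""
    C, H, W = feat.shape
    f = feat.reshape(C, H * W)
    return f @ f.T / (H * W)
```

A Gram-based style loss then typically minimizes the distance between the Gram matrices of the style image's and the generated image's features; as the ColoristaNet abstract notes, matching such summary statistics is what that paper seeks to avoid.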
arXiv Detail & Related papers (2021-06-26T01:15:09Z)
- Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer [115.13853805292679]
Artistic style transfer aims at migrating the style from an example image to a content image.
Inspired by the common painting process of drawing a draft and revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle).
Our method can synthesize high quality stylized images in real time, where holistic style patterns are properly transferred.
arXiv Detail & Related papers (2021-04-12T11:53:53Z)
- Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.