CAMS: Color-Aware Multi-Style Transfer
- URL: http://arxiv.org/abs/2106.13920v1
- Date: Sat, 26 Jun 2021 01:15:09 GMT
- Title: CAMS: Color-Aware Multi-Style Transfer
- Authors: Mahmoud Afifi, Abdullah Abuolaim, Mostafa Hussien, Marcus A. Brubaker,
Michael S. Brown
- Abstract summary: Style transfer aims to manipulate the appearance of a source image, or "content" image, so that it shares the texture and colors of a target "style" image.
A commonly used approach to assist in transferring styles is based on Gram matrix optimization.
We propose a color-aware multi-style transfer method that generates aesthetically pleasing results while preserving the style-color correlation between style and generated images.
- Score: 46.550390398057985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image style transfer aims to manipulate the appearance of a source
image, or "content" image, so that it shares the texture and colors of a target
"style" image.
Ideally, the style transfer manipulation should also preserve the semantic
content of the source image. A commonly used approach to assist in transferring
styles is based on Gram matrix optimization. One problem of Gram matrix-based
optimization is that it does not consider the correlation between colors and
their styles. Specifically, certain textures or structures should be associated
with specific colors. This is particularly challenging when the target style
image exhibits multiple style types. In this work, we propose a color-aware
multi-style transfer method that generates aesthetically pleasing results while
preserving the style-color correlation between style and generated images. We
achieve this desired outcome by introducing a simple but efficient modification
to classic Gram matrix-based style transfer optimization. A nice feature of our
method is that it enables users to manually select the color associations
between the target style and the content image for greater transfer flexibility. We
validated our method with several qualitative comparisons, including a user
study conducted with 30 participants. In comparison with prior work, our method
is simple, easy to implement, and achieves visually appealing results when
targeting images that have multiple styles. Source code is available at
https://github.com/mahmoudnafifi/color-aware-style-transfer.
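
To make the abstract's key idea concrete, below is a minimal sketch of the classic Gram-matrix style loss alongside a color-masked variant in the spirit of the proposed color-aware modification. It assumes PyTorch; the function names, the per-cluster mask handling, and the normalization are illustrative assumptions rather than the authors' implementation (the linked repository contains the actual code).

```python
import torch
import torch.nn.functional as F


def gram_matrix(feat):
    # feat: (B, C, H, W) activations from a pretrained encoder (e.g., VGG).
    # Returns (B, C, C) channel-correlation (Gram) matrices.
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)


def masked_gram_matrix(feat, mask):
    # Color-aware variant: restrict the Gram statistics to one color region.
    # mask: (B, 1, H, W) soft mask in [0, 1] marking pixels that belong to one
    # color cluster; it is resized to the feature resolution before weighting.
    b, c, h, w = feat.shape
    m = F.interpolate(mask, size=(h, w), mode="bilinear", align_corners=False)
    f = (feat * m).view(b, c, h * w)
    norm = (m.sum() * c).clamp(min=1.0)  # guard against empty masks
    return f @ f.transpose(1, 2) / norm


def color_aware_style_loss(gen_feats, style_feats, gen_masks, style_masks):
    # Sum Gram losses per color cluster instead of using one global Gram
    # matrix, so each texture stays tied to the color it co-occurs with
    # in the style image.
    loss = 0.0
    for gf, sf in zip(gen_feats, style_feats):        # over encoder layers
        for gm, sm in zip(gen_masks, style_masks):    # over color clusters
            loss = loss + F.mse_loss(masked_gram_matrix(gf, gm),
                                     masked_gram_matrix(sf, sm))
    return loss
```

Because the color masks are explicit inputs in this sketch, re-pairing a content-side mask with a different style-side cluster yields the kind of manual color association the abstract describes.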
Related papers
- NCST: Neural-based Color Style Transfer for Video Retouching [3.2050418539021774]
Video color style transfer aims to transform the color style of an original video using a reference style image.
Most existing methods employ neural networks, which come with challenges such as opaque transfer processes.
We introduce a method that predicts specific parameters for color style transfer using two images.
arXiv Detail & Related papers (2024-11-01T03:25:15Z)
- MRStyle: A Unified Framework for Color Style Transfer with Multi-Modality Reference [32.64957647390327]
We introduce MRStyle, a framework that enables color style transfer using multi-modality references, including image and text.
For text references, we align the text feature of stable diffusion priors with the style feature of our IRStyle to perform text-guided color style transfer (TRStyle).
Our TRStyle method is highly efficient in both training and inference, producing notable open-set text-guided transfer results.
arXiv Detail & Related papers (2024-09-09T00:01:48Z)
- Any-to-Any Style Transfer: Making Picasso and Da Vinci Collaborate [58.83278629019384]
Style transfer aims to render the style of a given reference image onto another given content image.
Existing approaches either apply the holistic style of the style image in a global manner, or migrate local colors and textures of the style image to the content counterparts in a pre-defined way.
We propose Any-to-Any Style Transfer, which enables users to interactively select styles of regions in the style image and apply them to the prescribed content regions.
arXiv Detail & Related papers (2023-04-19T15:15:36Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components: a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture that controls the balance between the content and style of diffused results.
We propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structural information of the content image.
arXiv Detail & Related papers (2022-11-19T12:30:44Z)
- Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results.
arXiv Detail & Related papers (2022-07-12T09:49:21Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components: a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- P$^2$-GAN: Efficient Style Transfer Using Single Style Image [2.703193151632043]
Style transfer is a useful image synthesis technique that can re-render a given image in another artistic style.
The Generative Adversarial Network (GAN) is a widely adopted framework for this task owing to its strong ability to represent local style patterns.
In this paper, we propose a novel Patch Permutation GAN (P$^2$-GAN) network that can efficiently learn stroke style from a single style image.
arXiv Detail & Related papers (2020-01-21T12:08:08Z)