Interactive Style Transfer: All is Your Palette
- URL: http://arxiv.org/abs/2203.13470v1
- Date: Fri, 25 Mar 2022 06:38:46 GMT
- Title: Interactive Style Transfer: All is Your Palette
- Authors: Zheng Lin, Zhao Zhang, Kang-Rui Zhang, Bo Ren, Ming-Ming Cheng
- Abstract summary: We propose a drawing-like interactive style transfer (IST) method with which users can interactively create harmoniously stylized images.
Our IST method serves as a brush: it can dip style from anywhere and paint it onto any region of the target content image.
- Score: 74.06681967115594
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural style transfer (NST) can create impressive artworks by transferring a
reference style to a content image. Current image-to-image NST methods lack the
fine-grained control that artistic editing often demands. To mitigate this
limitation, we propose a drawing-like interactive style transfer (IST) method
with which users can interactively create harmoniously stylized images. Our IST
method serves as a brush: it can dip style from anywhere and then paint it onto
any region of the target content image. To determine the action scope, we
formulate a fluid-simulation algorithm that treats styles as pigments around
the position of the brush interaction and diffuses them in the style or content
image according to similarity maps. Our IST method expands the creative
dimension of NST: by dipping and painting, even a single style image can
produce thousands of eye-catching works. The demo video is available in the
supplementary files and at http://mmcheng.net/ist.
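The action-scope mechanism above, style treated as pigment that spreads out from the brush point and is gated by similarity maps, can be illustrated with a toy diffusion sketch. This is a rough stand-in for the paper's fluid simulation, not the authors' algorithm; the function name, the 4-neighbour update, and the cosine-similarity permeability are all assumptions.

```python
import numpy as np

def brush_diffusion(feats, seed, steps=50, rate=0.25, tau=0.1):
    """Spread brush 'pigment' outward from a seed pixel.

    feats: (H, W, C) per-pixel features (e.g. from a pretrained CNN);
    seed:  (row, col) of the brush interaction.
    Each pixel's cosine similarity to the seed gates how easily pigment
    flows into it, so the pigment stays within visually similar regions.
    Returns a soft (H, W) mask in [0, 1] marking the brush's action scope.
    """
    f = feats / (np.linalg.norm(feats, axis=-1, keepdims=True) + 1e-8)
    sim = np.einsum('hwc,c->hw', f, f[seed])       # similarity map to the seed
    perm = np.exp((sim - 1.0) / tau)               # permeability in (0, 1]
    mask = np.zeros(feats.shape[:2])
    mask[seed] = 1.0                               # pigment starts at the seed
    for _ in range(steps):
        p = np.pad(mask, 1, mode='edge')           # 4-neighbour average
        nbr = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
        mask = np.clip(mask + rate * perm * (nbr - mask), 0.0, 1.0)
        mask[seed] = 1.0                           # keep the brush point wet
    return mask
```

The resulting soft mask could then composite a stylized layer into the content, e.g. out = mask[..., None] * stylized + (1 - mask[..., None]) * content.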
Related papers
- Bridging Text and Image for Artist Style Transfer via Contrastive Learning [21.962361974579036]
We propose Contrastive Learning for Artistic Style Transfer (CLAST) to control arbitrary style transfer.
We introduce a supervised contrastive training strategy to effectively extract style descriptions from the image-text model.
We also propose a novel and efficient adaLN-based state space model that explores style-content fusion; a contrastive-loss sketch follows this entry.
arXiv Detail & Related papers (2024-10-12T15:27:57Z)
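The summary above does not spell out CLAST's training objective; as a generic illustration of supervised contrastive alignment between style-image embeddings and text-style embeddings, here is a minimal InfoNCE-style loss in numpy (all names are placeholders, not the paper's API):

```python
import numpy as np

def info_nce(style_emb, text_emb, temperature=0.07):
    """Toy InfoNCE-style objective for aligning style-image embeddings
    with text-style embeddings. style_emb, text_emb: (N, D) arrays where
    row i of each is a matching (image, text) pair and all other rows in
    the batch serve as negatives.
    """
    s = style_emb / np.linalg.norm(style_emb, axis=1, keepdims=True)
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = s @ t.T / temperature                  # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))              # pull matched pairs together
```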
- Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss; a soft distance-transform sketch follows this entry.
arXiv Detail & Related papers (2023-09-07T06:27:39Z)
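The exact form of the paper's distance transform loss is not given in this summary; one common way to make a distance transform differentiable is a softmin relaxation over stroke points, sketched below under that assumption (function names are hypothetical):

```python
import numpy as np

def soft_distance_map(points, shape, alpha=10.0):
    """Softmin-relaxed distance transform: a smooth stand-in for each
    pixel's distance to the nearest stroke point, so gradients can flow
    back to the stroke geometry. points: (K, 2) array of (row, col).
    """
    ys, xs = np.mgrid[:shape[0], :shape[1]]
    grid = np.stack([ys, xs], axis=-1).astype(float)         # (H, W, 2)
    d = np.linalg.norm(grid[:, :, None, :] - points[None, None], axis=-1)
    return -np.log(np.exp(-alpha * d).sum(axis=2) + 1e-12) / alpha

def distance_transform_loss(pred_pts, target_pts, shape):
    """Penalise mismatch between predicted and target stroke layouts."""
    return np.mean((soft_distance_map(pred_pts, shape)
                    - soft_distance_map(target_pts, shape)) ** 2)
```

For large alpha the soft map approaches the true distance transform; the intermediate (H, W, K) distance tensor keeps this practical only for small canvases and point counts.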
- Any-to-Any Style Transfer: Making Picasso and Da Vinci Collaborate [58.83278629019384]
Style transfer aims to render the style of a reference image onto another image that supplies the content.
Existing approaches either apply the holistic style of the style image in a global manner, or migrate local colors and textures of the style image to the content counterparts in a pre-defined way.
We propose Any-to-Any Style Transfer, which enables users to interactively select regions of the style image and apply their styles to prescribed content regions; a masked moment-matching sketch follows this entry.
arXiv Detail & Related papers (2023-04-19T15:15:36Z)
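One simple way to realize region-to-region transfer of this kind is to match feature statistics between a user-selected style region and a content region, in the spirit of AdaIN; the masked moment-matching sketch below is an illustrative assumption, not the paper's actual mechanism:

```python
import numpy as np

def masked_adain(content, style, c_mask, s_mask, eps=1e-5):
    """Match the channel-wise mean and std of a chosen content region
    to those of a chosen style region (AdaIN-style moment transfer).
    content, style: (H, W, C) feature maps; c_mask, s_mask: boolean (H, W).
    """
    out = content.copy()
    c = content[c_mask]                      # (Nc, C) features in the region
    s = style[s_mask]                        # (Ns, C)
    c_mu, c_sd = c.mean(0), c.std(0) + eps
    s_mu, s_sd = s.mean(0), s.std(0) + eps
    out[c_mask] = (c - c_mu) / c_sd * s_sd + s_mu   # recolor only the region
    return out
```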
- Scaling Painting Style Transfer [10.059627473725508]
Neural style transfer (NST) is a technique that produces an unprecedentedly rich style transfer from a style image to a content image.
This paper presents a way to solve the original global optimization for ultra-high-resolution (UHR) images.
We show that our method produces style transfer of unmatched quality for such high-resolution painting styles.
arXiv Detail & Related papers (2022-12-27T12:03:38Z)
- QuantArt: Quantizing Image Style Transfer Towards High Visual Fidelity [94.5479418998225]
We propose a new style transfer framework called QuantArt for high visual-fidelity stylization.
Our framework achieves significantly higher visual fidelity than existing style transfer methods; a feature-quantization sketch follows this entry.
arXiv Detail & Related papers (2022-12-20T17:09:53Z)
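The summary gives no architectural detail; one plausible reading of "quantizing" stylization is snapping continuous features to a learned codebook, as in VQ-style models, the intuition being that features restricted to codes learned from real artworks tend to look more painting-like. The sketch below illustrates that operation only and is not QuantArt's actual pipeline:

```python
import numpy as np

def quantize_features(feats, codebook):
    """Vector-quantization sketch: snap each feature vector to its
    nearest codebook entry.
    feats: (H, W, C) feature map; codebook: (K, C) learned code vectors.
    """
    flat = feats.reshape(-1, feats.shape[-1])               # (HW, C)
    # squared L2 distance from every feature to every code
    d = ((flat[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    idx = d.argmin(axis=1)                                  # nearest code id
    return codebook[idx].reshape(feats.shape), idx
```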
- Inversion-Based Style Transfer with Diffusion Models [78.93863016223858]
Previous arbitrary example-guided artistic image generation methods often fail to control shape changes or convey the elements of the desired style.
We propose an inversion-based style transfer method (InST), which can efficiently and accurately learn the key information of an image.
arXiv Detail & Related papers (2022-11-23T18:44:25Z)
- Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
The joint-training pipeline comprises a style encoder, a conditional RetouchNet, and an image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results; an affine-coupling sketch follows this entry.
arXiv Detail & Related papers (2022-07-12T09:49:21Z)
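Normalizing flows like the TSFlow module are built from invertible blocks; the affine coupling layer below is a minimal generic example (the split scheme and names are illustrative, not the paper's architecture):

```python
import numpy as np

def coupling_forward(x, scale_net, shift_net):
    """One affine coupling step, the basic invertible block behind
    normalizing flows. x: (N, D) with even D; the first half of the
    features conditions an invertible affine map of the second half.
    """
    x1, x2 = np.split(x, 2, axis=1)
    s, t = scale_net(x1), shift_net(x1)      # any functions of x1 work
    y2 = x2 * np.exp(s) + t                  # invertible affine transform
    return np.concatenate([x1, y2], axis=1)

def coupling_inverse(y, scale_net, shift_net):
    """Exact inverse, which is what makes flows run in both directions."""
    y1, y2 = np.split(y, 2, axis=1)
    s, t = scale_net(y1), shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=1)
```

The round trip coupling_inverse(coupling_forward(x, f, g), f, g) == x holds for any choice of the two conditioning nets, which is what lets a flow model both evaluate likelihoods and sample diverse tone styles.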
- Name Your Style: An Arbitrary Artist-aware Image Style Transfer [38.41608300670523]
We propose a text-driven image style transfer (TxST) that leverages advanced image-text encoders to control arbitrary style transfer.
We introduce a contrastive training strategy to effectively extract style descriptions from the image-text model.
We also propose a novel and efficient attention module that explores cross-attentions to fuse style and content features; a cross-attention sketch follows this entry.
arXiv Detail & Related papers (2022-02-28T06:21:38Z)
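As a generic illustration of the cross-attention fusion described above (single head, numpy, with stand-in projection matrices rather than TxST's learned modules):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(content, style, Wq, Wk, Wv):
    """Minimal single-head cross-attention: content features query the
    style features, so each content position gathers the style
    information most relevant to it.
    content: (Nc, D); style: (Ns, D); Wq, Wk, Wv: (D, D) projections.
    """
    q, k, v = content @ Wq, style @ Wk, style @ Wv
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))   # (Nc, Ns) weights
    return attn @ v                                  # style-infused features
```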