ArtFlow: Unbiased Image Style Transfer via Reversible Neural Flows
- URL: http://arxiv.org/abs/2103.16877v1
- Date: Wed, 31 Mar 2021 07:59:02 GMT
- Title: ArtFlow: Unbiased Image Style Transfer via Reversible Neural Flows
- Authors: Jie An, Siyu Huang, Yibing Song, Dejing Dou, Wei Liu, Jiebo Luo
- Abstract summary: ArtFlow is proposed to prevent content leak during universal style transfer.
It supports both forward and backward inferences and operates in a projection-transfer-reversion scheme.
It achieves comparable performance to state-of-the-art style transfer methods while avoiding content leak.
- Score: 101.16791104543492
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Universal style transfer retains styles from reference images in content
images. While existing methods have achieved state-of-the-art style transfer
performance, they are unaware of the content leak phenomenon: image content
may degrade after several rounds of the stylization process. In this paper,
we propose ArtFlow to prevent content leak during universal style transfer.
ArtFlow consists of reversible neural flows and an unbiased feature transfer
module. It supports both forward and backward inferences and operates in a
projection-transfer-reversion scheme. The forward inference projects input
images into deep features, while the backward inference remaps deep features
back to input images in a lossless and unbiased way. Extensive experiments
demonstrate that ArtFlow achieves comparable performance to state-of-the-art
style transfer methods while avoiding content leak.
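The projection-transfer-reversion scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's architecture: an additive coupling layer (exactly invertible, as in normalizing flows) stands in for ArtFlow's reversible flow blocks, and mean-std alignment (AdaIN-style) stands in for the unbiased feature transfer module; the `shift` function, array shapes, and feature layout are all hypothetical.

```python
import numpy as np

def coupling_forward(x, shift_fn):
    # Additive coupling: split channels, transform one half conditioned on the other.
    a, b = np.split(x, 2, axis=0)
    return np.concatenate([a, b + shift_fn(a)], axis=0)

def coupling_inverse(y, shift_fn):
    # Exact inverse: subtract the same conditional shift.
    a, b = np.split(y, 2, axis=0)
    return np.concatenate([a, b - shift_fn(a)], axis=0)

def adain(content_feat, style_feat, eps=1e-5):
    # Align per-channel mean/std of content features to the style statistics.
    c_mean, c_std = content_feat.mean(1, keepdims=True), content_feat.std(1, keepdims=True)
    s_mean, s_std = style_feat.mean(1, keepdims=True), style_feat.std(1, keepdims=True)
    return (content_feat - c_mean) / (c_std + eps) * s_std + s_mean

shift = np.tanh  # stand-in for a learned network

rng = np.random.default_rng(0)
content = rng.normal(size=(4, 16))  # toy "content image"
style = rng.normal(size=(4, 16))    # toy "style image"

# Projection: the forward pass maps inputs to deep features.
zc = coupling_forward(content, shift)
zs = coupling_forward(style, shift)
# Transfer: statistic alignment in feature space.
zt = adain(zc, zs)
# Reversion: the backward pass maps features back to image space.
stylized = coupling_inverse(zt, shift)

# Invertibility check: without transfer, reversion reproduces the input exactly,
# which is why repeated stylization cannot accumulate content leak in this scheme.
roundtrip = coupling_inverse(zc, shift)
assert np.allclose(roundtrip, content)
```

Because the flow is bijective, the only change to the content comes from the transfer step itself; the projection and reversion are lossless, which is the core of the "unbiased" claim.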
Related papers
- D2Styler: Advancing Arbitrary Style Transfer with Discrete Diffusion Methods [2.468658581089448]
We propose a novel framework called D$^2$Styler (Discrete Diffusion Styler).
Our method uses Adaptive Instance Normalization (AdaIN) features as a context guide for the reverse diffusion process.
Experimental results demonstrate that D$2$Styler produces high-quality style-transferred images.
arXiv Detail & Related papers (2024-08-07T05:47:06Z)
- StyleStegan: Leak-free Style Transfer Based on Feature Steganography [19.153040728118285]
Existing style transfer methods suffer from a serious content leakage issue.
We propose a leak-free style transfer method based on feature steganography.
The results demonstrate that StyleStegan successfully mitigates the content leakage issue in serial and reversible style transfer tasks.
arXiv Detail & Related papers (2023-07-01T05:00:19Z)
- Edge Enhanced Image Style Transfer via Transformers [6.666912981102909]
Arbitrary image style transfer has attracted increasing attention.
It is difficult to maintain a good trade-off between content details and style features.
We present a new transformer-based method named STT for image style transfer and an edge loss.
arXiv Detail & Related papers (2023-01-02T10:39:31Z)
- QuantArt: Quantizing Image Style Transfer Towards High Visual Fidelity [94.5479418998225]
We propose a new style transfer framework called QuantArt for high visual-fidelity stylization.
Our framework achieves significantly higher visual fidelity compared with the existing style transfer methods.
arXiv Detail & Related papers (2022-12-20T17:09:53Z)
- StyleFlow: Disentangle Latent Representations via Normalizing Flow for Unsupervised Text Style Transfer [5.439842512864442]
Style transfer aims to alter the style of a sentence while preserving its content.
In this paper, we propose a novel disentanglement-based style transfer model StyleFlow to enhance content preservation.
arXiv Detail & Related papers (2022-12-19T17:59:18Z)
- DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture to control the balance between the content and style of diffused results.
We propose a content-image-based learnable noise for the reverse denoising process, enabling the stylization results to better preserve the structural information of the content image.
arXiv Detail & Related papers (2022-11-19T12:30:44Z)
- Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results.
arXiv Detail & Related papers (2022-07-12T09:49:21Z)
- Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush, dip style from anywhere, and then paint to any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z)
- StyTr^2: Unbiased Image Style Transfer with Transformers [59.34108877969477]
The goal of image style transfer is to render an image with artistic features guided by a style reference while maintaining the original content.
Traditional neural style transfer methods are usually biased, and content leak can be observed by running the style transfer process several times with the same reference image.
We propose a transformer-based approach, namely StyTr2, to address this critical issue.
arXiv Detail & Related papers (2021-05-30T15:57:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.