StyleStegan: Leak-free Style Transfer Based on Feature Steganography
- URL: http://arxiv.org/abs/2307.00225v1
- Date: Sat, 1 Jul 2023 05:00:19 GMT
- Title: StyleStegan: Leak-free Style Transfer Based on Feature Steganography
- Authors: Xiujian Liang, Bingshan Liu, Qichao Ying, Zhenxing Qian and Xinpeng
Zhang
- Abstract summary: Existing style transfer methods suffer from a serious content leakage issue.
We propose a leak-free style transfer method based on feature steganography.
The results demonstrate that StyleStegan successfully mitigates the content leakage issue in serial and reversible style transfer tasks.
- Score: 19.153040728118285
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In modern social networks, existing style transfer methods suffer from a
serious content leakage issue, which hampers the ability to achieve serial and
reversible stylization, thereby hindering the further propagation of stylized
images in social networks. To address this problem, we propose a leak-free
style transfer method based on feature steganography. Our method consists of
two main components: a style transfer method that accomplishes artistic
stylization on the original image and an image steganography method that embeds
the content features as a secret in the stylized image. The main contributions of our
work are as follows: 1) We identify and explain the phenomenon of content
leakage and its underlying causes, which arise from content inconsistencies
between the original image and its subsequent stylized image. 2) We design a
neural flow model for achieving loss-free and bias-free style transfer. 3) We
introduce steganography to hide content feature information in the stylized
image and to control subsequent usage rights. 4) We conduct comprehensive
experimental validation using publicly available datasets MS-COCO and Wikiart.
The results demonstrate that StyleStegan successfully mitigates the content
leakage issue in serial and reversible style transfer tasks. The SSIM scores
for these tasks are 14.98% and 7.28% higher, respectively, than those of a
suboptimal baseline model.
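To make the pipeline concrete, here is a minimal, self-contained sketch; it is not the authors' implementation. An orthogonal linear map stands in for the invertible neural flow of contribution 2, least-significant-bit embedding stands in for the feature-steganography network of contribution 3, and the toy "content features" and image sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_flow(dim):
    # A random orthogonal matrix: a trivially invertible stand-in for the
    # neural flow (its inverse is simply its transpose).
    q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
    return q

def embed_lsb(cover, payload):
    # Hide the payload bits in the least significant bits of a uint8 image.
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.reshape(-1).copy()
    assert bits.size <= flat.size, "cover image too small for payload"
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    # Read back n_bytes worth of bits from the least significant bits.
    bits = stego.reshape(-1)[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Toy content features, standing in for the encoder output that the real
# method hides inside the stylized image.
content = rng.standard_normal(16).astype(np.float32)

flow = make_flow(16)
stylized_feat = flow @ content           # forward pass: lossless stylization
restored_feat = flow.T @ stylized_feat   # reverse pass: exact de-stylization
assert np.allclose(restored_feat, content, atol=1e-5)

# Serialize the content features and hide them in a (toy) stylized image.
stylized_img = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
stego_img = embed_lsb(stylized_img, content.tobytes())

# A downstream user granted extraction rights recovers the features exactly,
# which is what enables serial and reversible stylization without leakage.
recovered = np.frombuffer(extract_lsb(stego_img, content.nbytes), dtype=np.float32)
assert np.array_equal(recovered, content)
```

To mirror the SSIM evaluation in spirit, one could compare the de-stylized image against the original with scikit-image's structural_similarity(original, restored, channel_axis=-1); the actual baselines and datasets are those reported in the paper.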
Related papers
- FAGStyle: Feature Augmentation on Geodesic Surface for Zero-shot Text-guided Diffusion Image Style Transfer [2.3293561091456283]
The goal of image style transfer is to render an image guided by a style reference while maintaining the original content.
We introduce FAGStyle, a zero-shot text-guided diffusion image style transfer method.
Our approach enhances inter-patch information interaction by incorporating the Sliding Window Crop technique.
arXiv Detail & Related papers (2024-08-20T04:20:11Z)
- ZePo: Zero-Shot Portrait Stylization with Faster Sampling [61.14140480095604]
This paper presents an inversion-free portrait stylization framework based on diffusion models that accomplishes content and style feature fusion in merely four sampling steps.
We propose a feature merging strategy to amalgamate redundant features in Consistency Features, thereby reducing the computational load of attention control.
arXiv Detail & Related papers (2024-08-10T08:53:41Z)
- D2Styler: Advancing Arbitrary Style Transfer with Discrete Diffusion Methods [2.468658581089448]
We propose a novel framework called D$2$Styler (Discrete Diffusion Styler).
Our method uses Adaptive Instance Normalization (AdaIN) features as a context guide for the reverse diffusion process; a minimal AdaIN sketch appears after this list.
Experimental results demonstrate that D$2$Styler produces high-quality style-transferred images.
arXiv Detail & Related papers (2024-08-07T05:47:06Z)
- InstantStyle-Plus: Style Transfer with Content-Preserving in Text-to-Image Generation [4.1177497612346]
Style transfer is an inventive process designed to create an image that maintains the essence of the original while embracing the visual style of another.
We introduce InstantStyle-Plus, an approach that prioritizes the integrity of the original content while seamlessly integrating the target style.
arXiv Detail & Related papers (2024-06-30T18:05:33Z)
- Portrait Diffusion: Training-free Face Stylization with Chain-of-Painting [64.43760427752532]
Face stylization refers to the transformation of a face into a specific portrait style.
Current methods require the use of example-based adaptation approaches to fine-tune pre-trained generative models.
This paper proposes a training-free face stylization framework, named Portrait Diffusion.
arXiv Detail & Related papers (2023-12-03T06:48:35Z)
- InfoStyler: Disentanglement Information Bottleneck for Artistic Style Transfer [22.29381866838179]
Artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content.
We propose a novel information disentanglement method, named InfoStyler, to capture the minimal sufficient information for both content and style representations.
arXiv Detail & Related papers (2023-07-30T13:38:56Z)
- DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture to control the balance between the content and style of diffused results.
We propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structure information of the content image.
arXiv Detail & Related papers (2022-11-19T12:30:44Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- ArtFlow: Unbiased Image Style Transfer via Reversible Neural Flows [101.16791104543492]
ArtFlow is proposed to prevent content leak during universal style transfer.
It supports both forward and backward inferences and operates in a projection-transfer-reversion scheme.
It achieves comparable performance to state-of-the-art style transfer methods while avoiding content leak.
arXiv Detail & Related papers (2021-03-31T07:59:02Z)
- Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
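As referenced in the D2Styler entry above, Adaptive Instance Normalization (AdaIN, Huang & Belongie, 2017) recurs throughout this literature. Below is a minimal numpy sketch of the standard AdaIN operation; the feature shapes and epsilon are illustrative assumptions, not taken from any of the papers listed.

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    # Features are (channels, height, width); statistics are per channel.
    # Content features are renormalized to match the per-channel mean and
    # standard deviation of the style features.
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    normalized = (content_feat - c_mean) / (c_std + eps)
    return normalized * s_std + s_mean

rng = np.random.default_rng(0)
out = adain(rng.standard_normal((64, 32, 32)), rng.standard_normal((64, 32, 32)))
print(out.shape)  # (64, 32, 32)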