InfoStyler: Disentanglement Information Bottleneck for Artistic Style
Transfer
- URL: http://arxiv.org/abs/2307.16227v1
- Date: Sun, 30 Jul 2023 13:38:56 GMT
- Title: InfoStyler: Disentanglement Information Bottleneck for Artistic Style
Transfer
- Authors: Yueming Lyu, Yue Jiang, Bo Peng, Jing Dong
- Abstract summary: Artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content.
We propose a novel information disentanglement method, named InfoStyler, to capture the minimal sufficient information for both content and style representations.
- Score: 22.29381866838179
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Artistic style transfer aims to transfer the style of an artwork to a
photograph while maintaining its original overall content. Many prior works
focus on designing various transfer modules to transfer the style statistics to
the content image. Although effective, these methods ignore an explicit
disentanglement of content features and style features from the outset, and so
they struggle to balance content preservation against style transfer. To
tackle this problem, we propose a novel information disentanglement method,
named InfoStyler, to capture the minimal sufficient information for both
content and style representations from the pre-trained encoding network.
InfoStyler formulates the disentanglement representation learning as an
information compression problem by eliminating style statistics from the
content image and removing the content structure from the style image. Besides,
to further facilitate disentanglement learning, a cross-domain Information
Bottleneck (IB) learning strategy is proposed by reconstructing the content and
style domains. Extensive experiments demonstrate that our InfoStyler can
synthesize high-quality stylized images while balancing content structure
preservation and style pattern richness.
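The abstract frames disentanglement as an information compression problem. As an illustration only (not the paper's actual objective), a variational information-bottleneck-style loss can be sketched as a reconstruction term plus a KL compression penalty on a Gaussian latent code; the function and parameter names below are hypothetical:

```python
import numpy as np

def ib_loss(z_mean, z_logvar, recon, target, beta=0.01):
    """Toy information-bottleneck objective (illustrative, not InfoStyler's
    actual loss): reconstruction fidelity plus a KL penalty that discourages
    the latent code z ~ N(z_mean, exp(z_logvar)) from carrying more
    information than needed."""
    # Reconstruction term: how well the compressed code reproduces the target.
    recon_term = np.mean((recon - target) ** 2)
    # Compression term: KL divergence from the latent posterior to N(0, I).
    kl_term = -0.5 * np.mean(1 + z_logvar - z_mean**2 - np.exp(z_logvar))
    return recon_term + beta * kl_term
```

With `z_mean = 0` and `z_logvar = 0` the KL term vanishes and the loss reduces to plain reconstruction error; increasing `beta` trades reconstruction quality for a more compressed (less informative) representation.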
Related papers
- InstantStyle-Plus: Style Transfer with Content-Preserving in Text-to-Image Generation [4.1177497612346]
Style transfer is an inventive process designed to create an image that maintains the essence of the original while embracing the visual style of another.
We introduce InstantStyle-Plus, an approach that prioritizes the integrity of the original content while seamlessly integrating the target style.
arXiv Detail & Related papers (2024-06-30T18:05:33Z) - Few-shot Image Generation via Style Adaptation and Content Preservation [60.08988307934977]
We introduce an image translation module to GAN transferring, where the module teaches the generator to separate style and content.
Our method consistently surpasses the state-of-the-art methods in the few-shot setting.
arXiv Detail & Related papers (2023-11-30T01:16:53Z) - TSSAT: Two-Stage Statistics-Aware Transformation for Artistic Style
Transfer [22.16475032434281]
Artistic style transfer aims to create new artistic images by rendering a given photograph with the target artistic style.
Existing methods learn styles simply from global statistics or local patches, without carefully considering the practical drawing process.
We propose a Two-Stage Statistics-Aware Transformation (TSSAT) module, which first builds the global style foundation by aligning the global statistics of content and style features.
To further enhance both content and style representations, we introduce two novel losses: an attention-based content loss and a patch-based style loss.
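TSSAT's first stage aligns the global statistics of content and style features. A minimal sketch of this kind of statistics alignment, in the spirit of adaptive instance normalization (the function name and `(C, H, W)` layout are assumptions, not TSSAT's implementation):

```python
import numpy as np

def align_global_stats(content_feat, style_feat, eps=1e-5):
    """Shift the per-channel mean/std of content features to match the style
    features. Inputs are feature maps of shape (C, H, W)."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    # Normalize content channels, then re-scale with the style statistics.
    normalized = (content_feat - c_mean) / (c_std + eps)
    return normalized * s_std + s_mean
```

After this step the output carries the style's first- and second-order channel statistics while retaining the content's spatial structure, which is the "global style foundation" the entry describes.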
arXiv Detail & Related papers (2023-09-12T07:02:13Z) - StyleAdapter: A Unified Stylized Image Generation Model [97.24936247688824]
StyleAdapter is a unified stylized image generation model capable of producing a variety of stylized images.
It can be integrated with existing controllable synthesis methods, such as T2I-adapter and ControlNet.
arXiv Detail & Related papers (2023-09-04T19:16:46Z) - StyleStegan: Leak-free Style Transfer Based on Feature Steganography [19.153040728118285]
Existing style transfer methods suffer from a serious content leakage issue.
We propose a leak-free style transfer method based on feature steganography.
The results demonstrate that StyleStegan successfully mitigates the content leakage issue in serial and reversible style transfer tasks.
arXiv Detail & Related papers (2023-07-01T05:00:19Z) - Edge Enhanced Image Style Transfer via Transformers [6.666912981102909]
Arbitrary image style transfer has attracted increasing attention.
It remains difficult to maintain the trade-off between content details and style features.
We present a new transformer-based method named STT for image style transfer and an edge loss.
arXiv Detail & Related papers (2023-01-02T10:39:31Z) - DiffStyler: Controllable Dual Diffusion for Text-Driven Image
Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture to control the balance between the content and style of diffused results.
We propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structure information of the content image.
arXiv Detail & Related papers (2022-11-19T12:30:44Z) - Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z) - Arbitrary Style Transfer via Multi-Adaptation Network [109.6765099732799]
A desired style transfer, given a content image and referenced style painting, would render the content image with the color tone and vivid stroke patterns of the style painting.
A new disentanglement loss function enables our network to extract main style patterns and exact content structures to adapt to various input images.
arXiv Detail & Related papers (2020-05-27T08:00:22Z) - A Content Transformation Block For Image Style Transfer [16.25958537802466]
This paper explicitly focuses on a content- and style-aware stylization of a content image.
We utilize similar content appearing in photographs and style samples to learn how style alters content details.
The robustness and speed of our model enables a video stylization in real-time and high definition.
arXiv Detail & Related papers (2020-03-18T18:00:23Z) - Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.