TSSAT: Two-Stage Statistics-Aware Transformation for Artistic Style Transfer
- URL: http://arxiv.org/abs/2309.06004v1
- Date: Tue, 12 Sep 2023 07:02:13 GMT
- Title: TSSAT: Two-Stage Statistics-Aware Transformation for Artistic Style Transfer
- Authors: Haibo Chen, Lei Zhao, Jun Li, and Jian Yang
- Abstract summary: Artistic style transfer aims to create new artistic images by rendering a given photograph with the target artistic style.
Existing methods learn styles simply based on global statistics or local patches, lacking careful consideration of the drawing process in practice.
We propose a Two-Stage Statistics-Aware Transformation (TSSAT) module, which first builds the global style foundation by aligning the global statistics of content and style features.
To further enhance both content and style representations, we introduce two novel losses: an attention-based content loss and a patch-based style loss.
- Score: 22.16475032434281
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Artistic style transfer aims to create new artistic images by rendering a
given photograph with the target artistic style. Existing methods learn styles
simply based on global statistics or local patches, lacking careful
consideration of the drawing process in practice. Consequently, the stylization
results either fail to capture abundant and diversified local style patterns,
or contain undesired semantic information of the style image and deviate from
the global style distribution. To address this issue, we imitate the drawing
process of humans and propose a Two-Stage Statistics-Aware Transformation
(TSSAT) module, which first builds the global style foundation by aligning the
global statistics of content and style features and then further enriches local
style details by swapping the local statistics (instead of local features) in a
patch-wise manner, significantly improving the stylization effects. Moreover,
to further enhance both content and style representations, we introduce two
novel losses: an attention-based content loss and a patch-based style loss,
where the former enables better content preservation by enforcing the semantic
relation in the content image to be retained during stylization, and the latter
focuses on increasing the local style similarity between the style and stylized
images. Extensive qualitative and quantitative experiments verify the
effectiveness of our method.
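To make the two-stage idea above concrete, the transformation can be sketched in NumPy: stage one aligns the channel-wise mean and standard deviation of the content features to those of the style features (an AdaIN-style global alignment), and stage two re-normalizes each content patch with the statistics of its most similar style patch, swapping statistics rather than features. This is an illustrative sketch only; the function names, the fixed patch size, and the cosine-similarity matching are assumptions, and the actual TSSAT module operates on VGG feature maps inside a learned network.

```python
import numpy as np

def align_global_stats(content, style, eps=1e-5):
    # Stage 1: shift the content feature map so its channel-wise
    # mean/std match those of the style feature map (AdaIN-style).
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return (content - c_mean) / (c_std + eps) * s_std + s_mean

def swap_local_stats(content, style, patch=4, eps=1e-5):
    # Stage 2: for each content patch, re-normalize it with the mean/std
    # of the most similar style patch (a statistics swap, not a feature swap).
    _, H, W = content.shape
    out = content.copy()
    # Collect non-overlapping style patches.
    s_patches = [style[:, i:i + patch, j:j + patch]
                 for i in range(0, style.shape[1] - patch + 1, patch)
                 for j in range(0, style.shape[2] - patch + 1, patch)]
    for i in range(0, H - patch + 1, patch):
        for j in range(0, W - patch + 1, patch):
            cp = content[:, i:i + patch, j:j + patch]
            flat = cp.ravel()
            # Nearest style patch by cosine similarity (an assumed matching rule).
            sims = [float(flat @ sp.ravel() /
                          (np.linalg.norm(flat) * np.linalg.norm(sp.ravel()) + eps))
                    for sp in s_patches]
            sp = s_patches[int(np.argmax(sims))]
            cm, cs = cp.mean(), cp.std()
            out[:, i:i + patch, j:j + patch] = (cp - cm) / (cs + eps) * sp.std() + sp.mean()
    return out
```

Stage one fixes the global style distribution first, so the patch-wise swap in stage two only redistributes local statistics and cannot drag the result away from the overall style, which mirrors the paper's stated motivation.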
Related papers
- Multiscale style transfer based on a Laplacian pyramid for traditional Chinese painting [6.248530911794617]
We present a novel effective multiscale style transfer method based on Laplacian pyramid decomposition and reconstruction.
In the first stage, the holistic patterns are transferred at low resolution by adopting a Style Transfer Base Network.
The details of the content and style are gradually enhanced at higher resolutions by a Detail Enhancement Network.
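As a rough illustration of the pyramid idea in that paper, a Laplacian pyramid stores the detail lost at each downsampling step, so a low-resolution base (where the holistic patterns would be transferred) can be recombined with detail bands at higher resolutions. The sketch below uses naive average-pool downsampling and nearest-neighbour upsampling for brevity; the paper's Gaussian filtering and its Style Transfer Base / Detail Enhancement networks are not reproduced.

```python
import numpy as np

def downsample(img):
    # 2x downsample by average pooling (stand-in for Gaussian blur + subsample).
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # Nearest-neighbour 2x upsample.
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels=3):
    # Decompose: each level stores the detail lost by one down/up-sampling round.
    pyramid, cur = [], img
    for _ in range(levels - 1):
        down = downsample(cur)
        pyramid.append(cur - upsample(down))
        cur = down
    pyramid.append(cur)  # coarsest residual carries the holistic structure
    return pyramid

def reconstruct(pyramid):
    # Rebuild coarse-to-fine by upsampling and adding the detail bands back.
    cur = pyramid[-1]
    for detail in reversed(pyramid[:-1]):
        cur = upsample(cur) + detail
    return cur
```

Because each detail band is defined as exactly the residual of one down/up round, reconstruction is lossless by construction, which is what lets a multiscale method edit each band independently.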
arXiv Detail & Related papers (2025-02-07T01:04:49Z)
- Z-STAR+: A Zero-shot Style Transfer Method via Adjusting Style Distribution [24.88532732093652]
Style transfer presents a significant challenge, primarily centered on identifying an appropriate style representation.
In contrast to existing approaches, we have discovered that latent features in vanilla diffusion models inherently contain natural style and content distributions.
Our method adopts dual denoising paths to represent content and style references in latent space, subsequently guiding the content image denoising process with style latent codes.
arXiv Detail & Related papers (2024-11-28T15:56:17Z)
- InfoStyler: Disentanglement Information Bottleneck for Artistic Style Transfer [22.29381866838179]
Artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content.
We propose a novel information disentanglement method, named InfoStyler, to capture the minimal sufficient information for both content and style representations.
arXiv Detail & Related papers (2023-07-30T13:38:56Z)
- ALADIN-NST: Self-supervised disentangled representation learning of artistic style through Neural Style Transfer [60.6863849241972]
We learn a representation of visual artistic style more strongly disentangled from the semantic content depicted in an image.
We show that strongly addressing the disentanglement of style and content leads to large gains in style-specific metrics.
arXiv Detail & Related papers (2023-04-12T10:33:18Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Arbitrary Style Transfer with Structure Enhancement by Combining the Global and Local Loss [51.309905690367835]
We introduce a novel arbitrary style transfer method with structure enhancement by combining the global and local loss.
Experimental results demonstrate that our method can generate higher-quality images with impressive visual effects.
arXiv Detail & Related papers (2022-07-23T07:02:57Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Arbitrary Style Transfer via Multi-Adaptation Network [109.6765099732799]
Given a content image and a reference style painting, a desired style transfer would render the content image with the color tone and vivid stroke patterns of the style painting.
A new disentanglement loss function enables our network to extract main style patterns and exact content structures to adapt to various input images.
arXiv Detail & Related papers (2020-05-27T08:00:22Z)
- Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.