AesPA-Net: Aesthetic Pattern-Aware Style Transfer Networks
- URL: http://arxiv.org/abs/2307.09724v3
- Date: Tue, 8 Aug 2023 13:14:26 GMT
- Title: AesPA-Net: Aesthetic Pattern-Aware Style Transfer Networks
- Authors: Kibeom Hong, Seogkyu Jeon, Junsoo Lee, Namhyuk Ahn, Kunhee Kim,
Pilhyeon Lee, Daesik Kim, Youngjung Uh, Hyeran Byun
- Abstract summary: We focus on enhancing the attention mechanism and capturing the rhythm of patterns that organize the style.
Based on the pattern repeatability, we propose Aesthetic Pattern-Aware style transfer Networks (AesPA-Net).
In addition, we propose a novel self-supervisory task to encourage the attention mechanism to learn precise and meaningful semantic correspondence.
- Score: 28.136463099603564
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: To deliver the artistic expression of the target style, recent studies
exploit the attention mechanism owing to its ability to map the local patches
of the style image to the corresponding patches of the content image. However,
because of the low semantic correspondence between arbitrary content and
artworks, the attention module repeatedly abuses specific local patches from
the style image, resulting in disharmonious and evident repetitive artifacts.
To overcome this limitation and accomplish impeccable artistic style transfer,
we focus on enhancing the attention mechanism and capturing the rhythm of
patterns that organize the style. In this paper, we introduce a novel metric,
namely pattern repeatability, that quantifies the repetition of patterns in the
style image. Based on the pattern repeatability, we propose Aesthetic
Pattern-Aware style transfer Networks (AesPA-Net) that discover the sweet spot
of local and global style expressions. In addition, we propose a novel
self-supervisory task to encourage the attention mechanism to learn precise and
meaningful semantic correspondence. Lastly, we introduce the patch-wise style
loss to transfer the elaborate rhythm of local patterns. Through qualitative
and quantitative evaluations, we verify the reliability of the proposed pattern
repeatability that aligns with human perception, and demonstrate the
superiority of the proposed framework.
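The abstract does not spell out how pattern repeatability is computed, but the general idea of quantifying how often local patterns recur in a style image can be sketched as patch self-similarity. The following is a minimal illustrative sketch, not the paper's definition: it scores a grayscale image by the average best-match cosine similarity between its patches, so a perfectly tiled texture scores near 1 and unstructured noise scores lower. The function name and parameters are assumptions for illustration.

```python
import numpy as np

def pattern_repeatability(image, patch=8, stride=8):
    """Illustrative patch self-similarity score (a sketch, not the paper's
    exact metric): average best cosine similarity between each patch and
    every other patch of a 2-D grayscale image."""
    h, w = image.shape
    patches = []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            p = image[y:y + patch, x:x + patch].astype(np.float64).ravel()
            p = p - p.mean()                  # remove brightness offset
            n = np.linalg.norm(p)
            patches.append(p / n if n > 0 else p)
    P = np.stack(patches)                     # (num_patches, patch*patch)
    sim = P @ P.T                             # pairwise cosine similarities
    np.fill_diagonal(sim, -np.inf)            # ignore trivial self-matches
    return float(sim.max(axis=1).mean())      # mean best-match similarity

# A tiled texture repeats perfectly, so it should score near 1.0,
# while random noise should score strictly lower.
tile = np.tile(np.arange(64).reshape(8, 8), (4, 4))
noise = np.random.default_rng(0).random((32, 32))
```

Such a scalar could then gate how much a model leans on local patch matching versus global style statistics, which is the "sweet spot" the abstract alludes to.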
Related papers
- Locally Stylized Neural Radiance Fields [30.037649804991315]
We propose a stylization framework for neural radiance fields (NeRF) based on local style transfer.
In particular, we use a hash-grid encoding to learn the embedding of the appearance and geometry components.
We show that our method yields plausible stylization results with novel view synthesis.
arXiv Detail & Related papers (2023-09-19T15:08:10Z)
- TSSAT: Two-Stage Statistics-Aware Transformation for Artistic Style Transfer [22.16475032434281]
Artistic style transfer aims to create new artistic images by rendering a given photograph with the target artistic style.
Existing methods learn styles simply based on global statistics or local patches, lacking careful consideration of the drawing process in practice.
We propose a Two-Stage Statistics-Aware Transformation (TSSAT) module, which first builds the global style foundation by aligning the global statistics of content and style features.
To further enhance both content and style representations, we introduce two novel losses: an attention-based content loss and a patch-based style loss.
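The "global style foundation by aligning the global statistics" step described above is commonly realized as an AdaIN-style transformation; the sketch below assumes that formulation (TSSAT's exact first stage may differ): shift and scale content features channel-wise to match the style features' mean and standard deviation.

```python
import numpy as np

def align_global_statistics(content_feat, style_feat, eps=1e-5):
    """Illustrative AdaIN-style alignment (an assumption, not TSSAT's exact
    module): normalize content features (C, H, W) per channel, then rescale
    them with the style features' channel-wise mean and std."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    return s_std * (content_feat - c_mean) / (c_std + eps) + s_mean

rng = np.random.default_rng(0)
content = rng.standard_normal((3, 8, 8))
style = 2.0 * rng.standard_normal((3, 8, 8)) + 1.0
out = align_global_statistics(content, style)
```

After this transform, `out` carries the style's global first- and second-order statistics while retaining the content's spatial layout, which is why such alignment serves as a coarse first stage before finer local refinement.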
arXiv Detail & Related papers (2023-09-12T07:02:13Z)
- ALADIN-NST: Self-supervised disentangled representation learning of artistic style through Neural Style Transfer [60.6863849241972]
We learn a representation of visual artistic style more strongly disentangled from the semantic content depicted in an image.
We show that strongly addressing the disentanglement of style and content leads to large gains in style-specific metrics.
arXiv Detail & Related papers (2023-04-12T10:33:18Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
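The "input-dependent temperature" mentioned above can be illustrated with an InfoNCE-style contrastive loss whose sharpness varies per sample. This is a hedged sketch of the general mechanism, not UCAST's actual predictor or features; the function and its arguments are hypothetical.

```python
import numpy as np

def contrastive_loss(anchor, positive, negatives, temperature):
    """Illustrative InfoNCE-style loss with a per-sample temperature
    (a sketch of the adaptive-temperature idea, not UCAST's exact loss).
    Lower temperature sharpens the softmax over similarities."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    pos = cos(anchor, positive) / temperature
    negs = [cos(anchor, n) / temperature for n in negatives]
    logits = np.array([pos] + negs)
    m = logits.max()                              # stable log-sum-exp
    return float(-pos + m + np.log(np.exp(logits - m).sum()))

rng = np.random.default_rng(1)
anchor = np.ones(4)
positive = np.ones(4)                             # perfectly aligned pair
negatives = [rng.standard_normal(4) for _ in range(5)]
```

When the positive pair is the best match, shrinking the temperature drives the loss toward zero; an input-dependent temperature lets the model apply that sharpening selectively per style sample.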
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- All-to-key Attention for Arbitrary Style Transfer [98.83954812536521]
We propose a novel all-to-key attention mechanism -- each position of content features is matched to stable key positions of style features.
The resultant module, dubbed StyA2K, shows extraordinary performance in preserving the semantic structure and rendering consistent style patterns.
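The core idea of matching each content position to a small set of stable key positions, rather than to every style position, can be sketched as top-k attention. This is an illustrative approximation of the concept, not StyA2K's exact mechanism; all names and shapes below are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def all_to_key_attention(q, k, v, top_k=4):
    """Illustrative top-k attention (a sketch of all-to-key matching, not
    StyA2K's exact module): each query attends only to its top_k most
    similar key positions; all other keys are masked out."""
    scores = q @ k.T / np.sqrt(q.shape[1])        # (Nq, Nk) similarities
    drop = np.argsort(scores, axis=1)[:, :-top_k] # indices of weakest keys
    masked = scores.copy()
    np.put_along_axis(masked, drop, -np.inf, axis=1)
    attn = softmax(masked, axis=1)                # rows sum to 1 over top_k keys
    return attn @ v

rng = np.random.default_rng(2)
q = rng.standard_normal((5, 8))                   # content (query) features
k = rng.standard_normal((10, 8))                  # style key features
v = rng.standard_normal((10, 8))                  # style value features
out = all_to_key_attention(q, k, v, top_k=4)
```

Restricting each query to a few strong matches is one way to avoid the repetitive-patch abuse that the AesPA-Net abstract attributes to standard all-to-all attention.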
arXiv Detail & Related papers (2022-12-08T06:46:35Z)
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- SimAN: Exploring Self-Supervised Representation Learning of Scene Text via Similarity-Aware Normalization [66.35116147275568]
Self-supervised representation learning has drawn considerable attention from the scene text recognition community.
We tackle the issue by formulating the representation learning scheme in a generative manner.
We propose a Similarity-Aware Normalization (SimAN) module to identify the different patterns and align the corresponding styles from the guiding patch.
arXiv Detail & Related papers (2022-03-20T08:43:10Z)
- Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer [115.13853805292679]
Artistic style transfer aims at migrating the style from an example image to a content image.
Inspired by the common painting process of drawing a draft and revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle).
Our method can synthesize high quality stylized images in real time, where holistic style patterns are properly transferred.
arXiv Detail & Related papers (2021-04-12T11:53:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.