AesUST: Towards Aesthetic-Enhanced Universal Style Transfer
- URL: http://arxiv.org/abs/2208.13016v1
- Date: Sat, 27 Aug 2022 13:51:11 GMT
- Authors: Zhizhong Wang, Zhanjie Zhang, Lei Zhao, Zhiwen Zuo, Ailin Li, Wei
Xing, Dongming Lu
- Abstract summary: AesUST is a novel Aesthetic-enhanced Universal Style Transfer approach.
We introduce an aesthetic discriminator to learn the universal human-delightful aesthetic features from a large corpus of artist-created paintings.
We also develop a new two-stage transfer training strategy with two aesthetic regularizations to train our model more effectively.
- Score: 15.078430702469886
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent studies have shown remarkable success in universal style transfer
which transfers arbitrary visual styles to content images. However, existing
approaches suffer from the aesthetic-unrealistic problem that introduces
disharmonious patterns and evident artifacts, making the results easy to
distinguish from real paintings. To address this limitation, we propose AesUST, a novel
Aesthetic-enhanced Universal Style Transfer approach that can generate
aesthetically more realistic and pleasing results for arbitrary styles.
Specifically, our approach introduces an aesthetic discriminator to learn the
universal human-delightful aesthetic features from a large corpus of
artist-created paintings. Then, the aesthetic features are incorporated to
enhance the style transfer process via a novel Aesthetic-aware Style-Attention
(AesSA) module. Such an AesSA module enables our AesUST to efficiently and
flexibly integrate the style patterns according to the global aesthetic channel
distribution of the style image and the local semantic spatial distribution of
the content image. Moreover, we also develop a new two-stage transfer training
strategy with two aesthetic regularizations to train our model more
effectively, further improving stylization performance. Extensive experiments
and user studies demonstrate that our approach synthesizes aesthetically more
harmonious and realistic results than the state of the art, greatly narrowing the
disparity with real artist-created paintings. Our code is available at
https://github.com/EndyWon/AesUST.
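The AesSA mechanism described in the abstract can be sketched, very roughly, as cross-attention between content and style features in which the style channels are first reweighted by a global aesthetic distribution. The minimal NumPy illustration below is a hypothetical simplification for intuition only, not the paper's actual implementation; the function name, tensor shapes, and the `aesthetic_weights` input (standing in for the aesthetic discriminator's channel response) are all assumptions.

```python
import numpy as np

def style_attention(content_feat, style_feat, aesthetic_weights):
    """Rough sketch of aesthetic-aware style attention.

    content_feat: (C, Hc*Wc) content feature map, flattened spatially
    style_feat:   (C, Hs*Ws) style feature map, flattened spatially
    aesthetic_weights: (C,) per-channel weights standing in for a
        global aesthetic channel distribution (hypothetical input)
    """
    def normalize(x, axis):
        # Zero-mean, unit-variance normalization along the given axis.
        mu = x.mean(axis=axis, keepdims=True)
        sd = x.std(axis=axis, keepdims=True) + 1e-5
        return (x - mu) / sd

    # Reweight style channels by the global aesthetic distribution.
    s = style_feat * aesthetic_weights[:, None]

    q = normalize(content_feat, axis=1)  # queries from content
    k = normalize(s, axis=1)             # keys from reweighted style

    # Similarity of each content location to each style location.
    attn = q.T @ k                                      # (Hc*Wc, Hs*Ws)
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)             # softmax over style

    # Each content location gathers style features by its attention weights,
    # so transfer follows the content's local spatial distribution.
    return s @ attn.T                                   # (C, Hc*Wc)
```

The output has the content's spatial layout but carries style statistics, which in the actual model would be decoded back into an image.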
Related papers
- Towards Highly Realistic Artistic Style Transfer via Stable Diffusion with Step-aware and Layer-aware Prompt [12.27693060663517]
Artistic style transfer aims to transfer the learned artistic style onto an arbitrary content image, generating artistic stylized images.
We propose a novel pre-trained diffusion-based artistic style transfer method, called LSAST.
Our proposed method can generate more highly realistic artistic stylized images than the state-of-the-art artistic style transfer methods.
arXiv Detail & Related papers (2024-04-17T15:28:53Z)
- CreativeSynth: Creative Blending and Synthesis of Visual Arts based on Multimodal Diffusion [74.44273919041912]
Large-scale text-to-image generative models have made impressive strides, showcasing their ability to synthesize a vast array of high-quality images.
However, adapting these models for artistic image editing presents two significant challenges.
We build CreativeSynth, a unified framework based on a diffusion model with the ability to coordinate multimodal inputs.
arXiv Detail & Related papers (2024-01-25T10:42:09Z)
- HiCAST: Highly Customized Arbitrary Style Transfer with Adapter Enhanced Diffusion Models [84.12784265734238]
The goal of Arbitrary Style Transfer (AST) is to inject the artistic features of a style reference into a given image/video.
We propose HiCAST, which is capable of explicitly customizing the stylization results according to various sources of semantic clues.
A novel learning objective is leveraged for video diffusion model training, which significantly improves cross-frame temporal consistency.
arXiv Detail & Related papers (2024-01-11T12:26:23Z)
- ArtBank: Artistic Style Transfer with Pre-trained Diffusion Model and Implicit Style Prompt Bank [9.99530386586636]
Artistic style transfer aims to repaint the content image with the learned artistic style.
Existing artistic style transfer methods can be divided into two categories: small model-based approaches and pre-trained large-scale model-based approaches.
We propose ArtBank, a novel artistic style transfer framework, to generate highly realistic stylized images.
arXiv Detail & Related papers (2023-12-11T05:53:40Z)
- Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z)
- ALADIN-NST: Self-supervised disentangled representation learning of artistic style through Neural Style Transfer [60.6863849241972]
We learn a representation of visual artistic style more strongly disentangled from the semantic content depicted in an image.
We show that strongly addressing the disentanglement of style and content leads to large gains in style-specific metrics.
arXiv Detail & Related papers (2023-04-12T10:33:18Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Pastiche Master: Exemplar-Based High-Resolution Portrait Style Transfer [103.54337984566877]
Recent studies on StyleGAN show high performance on artistic portrait generation by transfer learning with limited data.
We introduce a novel DualStyleGAN with flexible control of dual styles of the original face domain and the extended artistic portrait domain.
Experiments demonstrate the superiority of DualStyleGAN over state-of-the-art methods in high-quality portrait style transfer and flexible style control.
arXiv Detail & Related papers (2022-03-24T17:57:11Z)
- SAFIN: Arbitrary Style Transfer With Self-Attentive Factorized Instance Normalization [71.85169368997738]
Artistic style transfer aims to transfer the style characteristics of one image onto another image while retaining its content.
Self-Attention-based approaches have tackled this issue with partial success but suffer from unwanted artifacts.
This paper aims to combine the best of both worlds: self-attention and normalization.
arXiv Detail & Related papers (2021-05-13T08:01:01Z)
- Art Style Classification with Self-Trained Ensemble of AutoEncoding Transformations [5.835728107167379]
Artistic style of a painting is a rich descriptor that reveals both visual and deep intrinsic knowledge about how an artist uniquely portrays and expresses their creative vision.
In this paper, we investigate the use of deep self-supervised learning methods to solve the problem of recognizing complex artistic styles with high intra-class and low inter-class variation.
arXiv Detail & Related papers (2020-12-06T21:05:23Z)
- Joint Bilateral Learning for Real-time Universal Photorealistic Style Transfer [18.455002563426262]
Photorealistic style transfer is the task of transferring the artistic style of an image onto a content target, producing a result that is plausibly taken with a camera.
Recent approaches, based on deep neural networks, produce impressive results but are either too slow to run at practical resolutions, or still contain objectionable artifacts.
We propose a new end-to-end model for photorealistic style transfer that is both fast and inherently generates photorealistic results.
arXiv Detail & Related papers (2020-04-23T03:31:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.