A Content Transformation Block For Image Style Transfer
- URL: http://arxiv.org/abs/2003.08407v1
- Date: Wed, 18 Mar 2020 18:00:23 GMT
- Title: A Content Transformation Block For Image Style Transfer
- Authors: Dmytro Kotovenko, Artsiom Sanakoyeu, Pingchuan Ma, Sabine Lang,
Björn Ommer
- Abstract summary: This paper explicitly focuses on a content- and style-aware stylization of a content image.
We utilize similar content appearing in photographs and style samples to learn how style alters content details.
The robustness and speed of our model enable real-time, high-definition video stylization.
- Score: 16.25958537802466
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Style transfer has recently received a lot of attention, since it
makes it possible to study fundamental challenges in image understanding and
synthesis. Recent work has significantly improved the representation of color
and texture as well as computational speed and image resolution. The explicit transformation of image
content has, however, been mostly neglected: while artistic style affects
formal characteristics of an image, such as color, shape or texture, it also
deforms, adds or removes content details. This paper explicitly focuses on a
content- and style-aware stylization of a content image. To this end, we introduce
a content transformation module between the encoder and decoder. Moreover, we
utilize similar content appearing in photographs and style samples to learn how
style alters content details and we generalize this to other class details.
Additionally, this work presents a novel normalization layer critical for
high-resolution image synthesis. The robustness and speed of our model enable
real-time, high-definition video stylization. We perform extensive
qualitative and quantitative evaluations to demonstrate the validity of our
approach.
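To illustrate where such a module sits in the pipeline, the following is a minimal sketch of an encoder / content-transformation-block / decoder data flow. The function names, shapes, and the fixed residual operation are illustrative assumptions; the paper's actual block is a learned, style-specific transformation, which this toy stand-in does not reproduce.

```python
import numpy as np

def encoder(image):
    """Toy 'encoder': downsample by 2 via average pooling."""
    h, w, c = image.shape
    return image.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def content_transformation_block(features, strength=0.5):
    """Toy stand-in for the content transformation block.

    In the paper this block learns how style alters content details; here a
    fixed residual update merely marks its position in the data flow.
    """
    return features + strength * np.tanh(features)

def decoder(features):
    """Toy 'decoder': upsample by 2 via nearest-neighbour repetition."""
    return features.repeat(2, axis=0).repeat(2, axis=1)

def stylize(image):
    z = encoder(image)
    z = content_transformation_block(z)  # inserted between encoder and decoder
    return decoder(z)

content = np.random.rand(32, 32, 3)
out = stylize(content)
assert out.shape == content.shape
```

The key design point the abstract describes is that the content transformation happens in feature space, after encoding and before decoding, rather than on the pixels directly.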
Related papers
- AEANet: Affinity Enhanced Attentional Networks for Arbitrary Style Transfer [4.639424509503966]
Style transfer is a research area that combines rational academic study with emotive artistic creation.
It aims to create a new image from a content image according to a target artistic style, while preserving the content image's texture and structural information.
Existing style transfer methods often significantly damage the texture lines of the content image during the style transformation.
We propose an affinity-enhanced attentional network, which includes the content affinity-enhanced attention (CAEA) module, the style affinity-enhanced attention (SAEA) module, and the hybrid attention (HA) module.
arXiv Detail & Related papers (2024-09-23T01:39:11Z) - Puff-Net: Efficient Style Transfer with Pure Content and Style Feature Fusion Network [32.12413686394824]
Style transfer aims to render an image with the artistic features of a style image, while maintaining the original structure.
It is difficult for CNN-based methods to handle global information and long-range dependencies between input images.
We propose a novel network termed Puff-Net, i.e., pure content and style feature fusion network.
arXiv Detail & Related papers (2024-05-30T07:41:07Z) - ControlStyle: Text-Driven Stylized Image Generation Using Diffusion
Priors [105.37795139586075]
We propose a new task of "stylizing" text-to-image models, namely text-driven stylized image generation.
We present a new diffusion model (ControlStyle) via upgrading a pre-trained text-to-image model with a trainable modulation network.
Experiments demonstrate the effectiveness of our ControlStyle in producing more visually pleasing and artistic results.
arXiv Detail & Related papers (2023-11-09T15:50:52Z) - Generative AI Model for Artistic Style Transfer Using Convolutional
Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z) - InfoStyler: Disentanglement Information Bottleneck for Artistic Style
Transfer [22.29381866838179]
Artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content.
We propose a novel information disentanglement method, named InfoStyler, to capture the minimal sufficient information for both content and style representations.
arXiv Detail & Related papers (2023-07-30T13:38:56Z) - ALADIN-NST: Self-supervised disentangled representation learning of
artistic style through Neural Style Transfer [60.6863849241972]
We learn a representation of visual artistic style more strongly disentangled from the semantic content depicted in an image.
We show that strongly addressing the disentanglement of style and content leads to large gains in style-specific metrics.
arXiv Detail & Related papers (2023-04-12T10:33:18Z) - Edge Enhanced Image Style Transfer via Transformers [6.666912981102909]
Arbitrary image style transfer has attracted more and more attention.
It is difficult to simultaneously maintain the trade-off between content details and style features.
We present a new transformer-based method named STT for image style transfer and an edge loss.
arXiv Detail & Related papers (2023-01-02T10:39:31Z) - DiffStyler: Controllable Dual Diffusion for Text-Driven Image
Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture to control the balance between the content and style of diffused results.
We propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structure information of the content image.
arXiv Detail & Related papers (2022-11-19T12:30:44Z) - Arbitrary Style Transfer with Structure Enhancement by Combining the
Global and Local Loss [51.309905690367835]
We introduce a novel arbitrary style transfer method with structure enhancement by combining the global and local loss.
Experimental results demonstrate that our method can generate higher-quality images with impressive visual effects.
arXiv Detail & Related papers (2022-07-23T07:02:57Z) - Arbitrary Style Transfer via Multi-Adaptation Network [109.6765099732799]
A desired style transfer, given a content image and referenced style painting, would render the content image with the color tone and vivid stroke patterns of the style painting.
A new disentanglement loss function enables our network to extract main style patterns and exact content structures to adapt to various input images.
arXiv Detail & Related papers (2020-05-27T08:00:22Z) - Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.