Non-Parametric Style Transfer
- URL: http://arxiv.org/abs/2206.12921v1
- Date: Sun, 26 Jun 2022 16:34:37 GMT
- Title: Non-Parametric Style Transfer
- Authors: Jeong-Sik Lee, Hyun-Chul Choi
- Abstract summary: Recent feed-forward neural methods of arbitrary image style transfer mainly utilized the encoded feature map up to its second-order statistics.
We extend the second-order statistical feature matching into a general distribution matching, based on the understanding that the style of an image is represented by the distribution of responses from receptive fields.
Our results show that the stylized images obtained with our method are more similar to the target style images under all existing style measures without losing content clarity.
- Score: 0.9137554315375919
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recent feed-forward neural methods of arbitrary image style transfer mainly
utilized the encoded feature map up to its second-order statistics, i.e., linearly
transformed the encoded feature map of a content image to have the same mean
and variance (or covariance) as a target style feature map. In this work, we
extend the second-order statistical feature matching into a general
distribution matching, based on the understanding that the style of an image is
represented by the distribution of responses from receptive fields. For this
generalization, first, we propose a new feature transform layer that exactly
matches the feature map distribution of the content image to that of the target
style image. Second, we analyze the recent style losses consistent with our new
feature transform layer to train a decoder network which generates a
style-transferred image from the transformed feature map. Our experimental
results show that the stylized images obtained with our method are
more similar to the target style images under all existing style measures
without losing content clarity.
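To make the distinction concrete, the sketch below contrasts second-order matching in the AdaIN style with an exact per-channel distribution match via rank matching. This is a minimal illustration, not the authors' feature transform layer: the tensor shapes, the equal-size assumption, and the per-channel marginal treatment are assumptions added here.

```python
# Minimal sketch (illustrative only): second-order matching vs. exact
# per-channel distribution matching on flattened encoder features of
# shape (C, H*W). The rank-matching step is a 1-D histogram/optimal-
# transport match, not the paper's exact transform layer.
import torch

def adain_transform(content_feat: torch.Tensor, style_feat: torch.Tensor,
                    eps: float = 1e-5) -> torch.Tensor:
    """Second-order matching: align per-channel mean and std only."""
    c_mean = content_feat.mean(dim=1, keepdim=True)
    c_std = content_feat.std(dim=1, keepdim=True)
    s_mean = style_feat.mean(dim=1, keepdim=True)
    s_std = style_feat.std(dim=1, keepdim=True)
    return (content_feat - c_mean) / (c_std + eps) * s_std + s_mean

def exact_distribution_transform(content_feat: torch.Tensor,
                                 style_feat: torch.Tensor) -> torch.Tensor:
    """Exact per-channel distribution matching by rank: the k-th smallest
    content response is replaced by the k-th smallest style response, so
    higher-order per-channel statistics are matched as well.
    Assumes content and style feature maps have the same number of
    spatial positions; otherwise the sorted style values would need to
    be resampled first."""
    out = torch.empty_like(content_feat)
    for c in range(content_feat.size(0)):
        _, content_order = content_feat[c].sort()   # positions of content responses by rank
        style_sorted, _ = style_feat[c].sort()      # style responses in ascending order
        out[c, content_order] = style_sorted        # assign style values by rank
    return out

# Usage: flattened VGG-like feature maps of shape (C, H*W).
content = torch.randn(64, 32 * 32)
style = torch.randn(64, 32 * 32) * 2.0 + 0.5
second_order = adain_transform(content, style)
full_match = exact_distribution_transform(content, style)
```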
Related papers
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive
Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature (a rough sketch of this idea appears after the related-papers list below).
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z) - Neural Style Transfer for Vector Graphics [3.8983556368110226]
Style transfer between vector images has not been considered.
Applying standard content and style losses changes the drawing style of a vector image only insignificantly.
A new method based on differentiable rasterization can change the color and shape parameters of the content image to correspond to the drawing of the style image.
arXiv Detail & Related papers (2023-03-06T16:57:45Z) - DSI2I: Dense Style for Unpaired Image-to-Image Translation [70.93865212275412]
Unpaired exemplar-based image-to-image (UEI2I) translation aims to translate a source image to a target image domain with the style of a target image exemplar.
We propose to represent style as a dense feature map, allowing for a finer-grained transfer to the source image without requiring any external semantic information.
Our results show that the translations produced by our approach are more diverse, preserve the source content better, and are closer to the exemplars when compared to the state-of-the-art methods.
arXiv Detail & Related papers (2022-12-26T18:45:25Z) - Diffusion-based Image Translation using Disentangled Style and Content
Representation [51.188396199083336]
Diffusion-based image translation guided by semantic texts or a single target image has enabled flexible style transfer.
It is often difficult to maintain the original content of the image during the reverse diffusion.
We present a novel diffusion-based unsupervised image translation method using disentangled style and content representation.
Our experimental results show that the proposed method outperforms state-of-the-art baseline models in both text-guided and image-guided translation tasks.
arXiv Detail & Related papers (2022-09-30T06:44:37Z) - Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z) - $\texttt{GradICON}$: Approximate Diffeomorphisms via Gradient Inverse
Consistency [16.72466200341455]
We use a neural network to predict a map between a source and a target image as well as the map when swapping the source and target images.
We achieve state-of-the-art registration performance on a variety of real-world medical image datasets.
arXiv Detail & Related papers (2022-06-13T04:03:49Z) - Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z) - Saliency Constrained Arbitrary Image Style Transfer using SIFT and DCNN [22.57205921266602]
When common neural style transfer methods are used, the textures and colors in the style image are usually transferred imperfectly to the content image.
This paper proposes a novel saliency constrained method to reduce or avoid such effects.
The experiments show that the saliency maps of source images can help find the correct matching and avoid artifacts.
arXiv Detail & Related papers (2022-01-14T09:00:55Z) - Arbitrary Style Transfer using Graph Instance Normalization [25.05195837557028]
Style transfer is the task of applying the style of one image to another while preserving its content.
In statistical methods, adaptive instance normalization (AdaIN) normalizes the source image features and applies the style of the target image by matching the mean and variance of the features.
We present a novel learnable normalization technique for style transfer using graph convolutional networks, termed Graph Instance Normalization (GrIN).
arXiv Detail & Related papers (2020-10-06T09:07:20Z) - Manifold Alignment for Semantically Aligned Style Transfer [61.1274057338588]
We make a new assumption that image features from the same semantic region form a manifold and an image with multiple semantic regions follows a multi-manifold distribution.
Based on this assumption, the style transfer problem is formulated as aligning two multi-manifold distributions.
The proposed framework allows semantically similar regions between the output and the style image to share similar style patterns.
arXiv Detail & Related papers (2020-05-21T16:52:37Z)
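As a rough illustration of the adaptive contrastive idea mentioned in the UCAST and CAST entries above, the sketch below shows an InfoNCE-style loss whose temperature is predicted from the input rather than fixed. The temperature head, feature dimensions, and pairing scheme are assumptions for illustration, not the papers' implementation.

```python
# Illustrative sketch: contrastive style loss with an input-dependent
# temperature (all module names and shapes are hypothetical).
import torch
import torch.nn.functional as F

def adaptive_contrastive_loss(anchor: torch.Tensor,      # (N, D) style codes of stylized images
                              positive: torch.Tensor,    # (N, D) style codes of matching styles
                              temperature: torch.Tensor  # (N,) per-sample temperatures
                              ) -> torch.Tensor:
    anchor = F.normalize(anchor, dim=1)
    positive = F.normalize(positive, dim=1)
    logits = anchor @ positive.t()                # (N, N) cosine similarities
    logits = logits / temperature.unsqueeze(1)    # scale each row by its own temperature
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)        # positives lie on the diagonal

# Hypothetical temperature head: map each style code to a positive scalar.
temp_head = torch.nn.Sequential(torch.nn.Linear(128, 1), torch.nn.Softplus())
codes_a, codes_b = torch.randn(8, 128), torch.randn(8, 128)
tau = temp_head(codes_a).squeeze(1) + 1e-2        # keep temperatures strictly positive
loss = adaptive_contrastive_loss(codes_a, codes_b, tau)
```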
This list is automatically generated from the titles and abstracts of the papers on this site.