Arbitrary Style Transfer using Graph Instance Normalization
- URL: http://arxiv.org/abs/2010.02560v1
- Date: Tue, 6 Oct 2020 09:07:20 GMT
- Title: Arbitrary Style Transfer using Graph Instance Normalization
- Authors: Dongki Jung, Seunghan Yang, Jaehoon Choi, Changick Kim
- Abstract summary: Style transfer is the task of applying the style of one image to another while preserving the content.
In statistical methods, adaptive instance normalization (AdaIN) whitens the source image and applies the style of the target image by normalizing the mean and variance of its features.
We present a novel learnable normalization technique for style transfer using graph convolutional networks, termed Graph Instance Normalization (GrIN).
- Score: 25.05195837557028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Style transfer is the image synthesis task of applying the style of one
image to another while preserving the content. Among statistical methods,
adaptive instance normalization (AdaIN) whitens the source image and applies
the style of the target image by normalizing the mean and variance of its
features. However, computing feature statistics for each instance in isolation
neglects the inherent relationships between features, making it hard to learn
global styles while fitting the individual training dataset. In this paper, we
present a novel learnable normalization technique for style transfer using
graph convolutional networks, termed Graph Instance Normalization (GrIN). This
algorithm makes the style transfer approach more robust by taking into account
similar information shared between instances. Moreover, this simple module is
also applicable to other tasks such as image-to-image translation or domain
adaptation.
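
For context, the AdaIN operation described in the abstract reduces to a few lines of code. The following is a minimal PyTorch sketch of the standard AdaIN baseline (Huang & Belongie, 2017) that the paper builds on, not of the proposed GrIN module, which additionally shares statistics between instances through a graph convolutional network:

```python
import torch

def adain(content: torch.Tensor, style: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Adaptive instance normalization over (N, C, H, W) feature maps.

    Whitens each (instance, channel) of the content features, then
    re-colors them with the style features' per-channel mean and std.
    """
    c_mean = content.mean(dim=(2, 3), keepdim=True)
    c_std = (content.var(dim=(2, 3), keepdim=True) + eps).sqrt()
    s_mean = style.mean(dim=(2, 3), keepdim=True)
    s_std = (style.var(dim=(2, 3), keepdim=True) + eps).sqrt()
    return (content - c_mean) / c_std * s_std + s_mean
```

GrIN, as the abstract describes it, would refine these purely per-instance statistics with information shared across similar instances via a graph convolutional network; consult the paper for that construction.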
Related papers
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- DSI2I: Dense Style for Unpaired Image-to-Image Translation [70.93865212275412]
Unpaired exemplar-based image-to-image (UEI2I) translation aims to translate a source image to a target image domain with the style of a target image exemplar.
We propose to represent style as a dense feature map, allowing for a finer-grained transfer to the source image without requiring any external semantic information.
Our results show that the translations produced by our approach are more diverse, preserve the source content better, and are closer to the exemplars when compared to the state-of-the-art methods.
arXiv Detail & Related papers (2022-12-26T18:45:25Z)
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiencies of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- Non-Parametric Style Transfer [0.9137554315375919]
Recent feed-forward neural methods of arbitrary image style transfer mainly utilize encoded feature maps up to their second-order statistics.
We extend this second-order statistical feature matching into a general distribution matching, based on the understanding that the style of an image is represented by the distribution of responses from receptive fields.
Our results show that the stylized images obtained with our method are more similar to the target style images under all existing style measures, without losing content clarity.
arXiv Detail & Related papers (2022-06-26T16:34:37Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- STALP: Style Transfer with Auxiliary Limited Pairing [36.23393954839379]
We present an approach to example-based stylization of images that uses a single pair of a source image and its stylized counterpart.
We demonstrate how to train an image translation network that can perform real-time semantically meaningful style transfer to a set of target images.
arXiv Detail & Related papers (2021-10-20T11:38:41Z)
- Domain Generalization with MixStyle [120.52367818581608]
Domain generalization aims to address performance degradation on unseen domains by learning, from a set of source domains, a model that generalizes to any unseen domain.
Our method, termed MixStyle, is motivated by the observation that the visual domain is closely related to image style.
MixStyle fits naturally into mini-batch training and is extremely easy to implement; a sketch is given after this list.
arXiv Detail & Related papers (2021-04-05T16:58:09Z)
- Permuted AdaIN: Reducing the Bias Towards Global Statistics in Image Classification [97.81205777897043]
Recent work has shown that convolutional neural network classifiers rely too heavily on texture at the expense of shape cues.
We draw a related but different distinction: between shape and local image cues on the one hand, and global image statistics on the other.
Our method, called Permuted Adaptive Instance Normalization (pAdaIN), reduces the representation of global statistics in the hidden layers of image classifiers; it is also sketched after this list.
arXiv Detail & Related papers (2020-10-09T16:38:38Z)
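
Two of the related papers above, MixStyle and pAdaIN, apply the same AdaIN-style instance statistics within a mini-batch rather than between a content/style image pair. Below is a minimal sketch of both, assuming PyTorch and (N, C, H, W) feature maps; the default alpha and p values follow the respective papers' reported settings, but treat the exact hyperparameters here as assumptions:

```python
import torch

def mixstyle(x: torch.Tensor, alpha: float = 0.1, eps: float = 1e-5) -> torch.Tensor:
    """MixStyle (Zhou et al., 2021): convexly mix each instance's (mean, std)
    with those of a randomly permuted batch element, then re-apply them."""
    mu = x.mean(dim=(2, 3), keepdim=True)
    sig = (x.var(dim=(2, 3), keepdim=True) + eps).sqrt()
    x_norm = (x - mu) / sig
    perm = torch.randperm(x.size(0), device=x.device)
    lam = torch.distributions.Beta(alpha, alpha).sample((x.size(0), 1, 1, 1)).to(x.device)
    mu_mix = lam * mu + (1 - lam) * mu[perm]
    sig_mix = lam * sig + (1 - lam) * sig[perm]
    return x_norm * sig_mix + mu_mix

def padain(x: torch.Tensor, p: float = 0.01, eps: float = 1e-5) -> torch.Tensor:
    """Permuted AdaIN (Nuriel et al., 2020): with probability p, swap each
    instance's statistics for those of a permuted batch element."""
    if torch.rand(()).item() > p:
        return x
    mu = x.mean(dim=(2, 3), keepdim=True)
    sig = (x.var(dim=(2, 3), keepdim=True) + eps).sqrt()
    perm = torch.randperm(x.size(0), device=x.device)
    return (x - mu) / sig * sig[perm] + mu[perm]
```

Both operations are intended for training only; at inference they would be bypassed (pAdaIN already returns its input with probability 1 - p).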
This list is automatically generated from the titles and abstracts of the papers in this site.