Style is a Distribution of Features
- URL: http://arxiv.org/abs/2007.13010v1
- Date: Sat, 25 Jul 2020 21:17:51 GMT
- Title: Style is a Distribution of Features
- Authors: Eddie Huang, Sahil Gupta
- Abstract summary: Neural style transfer is an image generation technique that uses a convolutional neural network (CNN) to merge the content of one image with the style of another.
We present a new algorithm for style transfer that fully extracts the style from the features by redefining the style loss as the Wasserstein distance between the distributions of features.
- Score: 2.398608007786179
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural style transfer (NST) is a powerful image generation technique that
uses a convolutional neural network (CNN) to merge the content of one image
with the style of another. Contemporary methods of NST use first or second
order statistics of the CNN's features to achieve transfers with relatively
little computational cost. However, these methods cannot fully extract the
style from the CNN's features. We present a new algorithm for style transfer
that fully extracts the style from the features by redefining the style loss as
the Wasserstein distance between the distributions of features. Thus, we set a
new standard in style transfer quality. In addition, we state two important
interpretations of NST. The first is a re-emphasis from Li et al., which states
that style is simply the distribution of features. The second states that NST
is a type of generative adversarial network (GAN) problem.
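The core idea above, treating style as a distribution of CNN features and measuring the style loss as a Wasserstein distance, can be sketched in a minimal form. The snippet below is illustrative only: it fits an independent per-channel Gaussian to each layer's features, for which the squared 2-Wasserstein distance has the closed form (mu_a - mu_b)^2 + (sigma_a - sigma_b)^2 per channel. The function name and the diagonal-Gaussian simplification are assumptions, not the paper's exact formulation.

```python
import numpy as np

def w2_style_loss(feats_a, feats_b):
    """Squared 2-Wasserstein distance between per-channel Gaussian fits.

    feats_*: arrays of shape (n_pixels, n_channels) holding the feature
    vectors of one CNN layer. Each channel is modeled as an independent
    1-D Gaussian, so W2^2 decomposes channel-wise into a mean term plus
    a standard-deviation term.
    """
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    sd_a, sd_b = feats_a.std(axis=0), feats_b.std(axis=0)
    return float(np.sum((mu_a - mu_b) ** 2 + (sd_a - sd_b) ** 2))
```

Because the distance depends on the full fitted distribution rather than a single statistic, matching it to zero forces both first and second moments of the stylized features to agree with the style image's features.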
Related papers
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive
Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
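The contrastive scheme with an input-dependent temperature described above is built on the standard InfoNCE objective. A minimal sketch of that objective follows; the function name is hypothetical and the temperature is passed in as a plain argument, standing in for whatever input-dependent mechanism UCAST actually uses.

```python
import numpy as np

def info_nce(anchor, positive, negatives, temperature):
    """InfoNCE loss: pull the positive toward the anchor, push negatives away.

    All inputs are 1-D embedding vectors; `temperature` scales the cosine
    similarities before the softmax (smaller values sharpen the contrast).
    """
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    sims = [cos(anchor, positive)] + [cos(anchor, n) for n in negatives]
    logits = np.array(sims) / temperature
    logits -= logits.max()  # numerical stability before exponentiation
    probs = np.exp(logits) / np.exp(logits).sum()
    return float(-np.log(probs[0]))  # cross-entropy with positive at index 0
```

Making the temperature a function of the input, rather than a fixed hyperparameter, lets the model contrast hard and easy style pairs at different sharpness.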
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Conditional Invertible Neural Networks for Diverse Image-to-Image Translation [33.262390365990896]
We introduce a conditional invertible neural network (cINN) to address the task of diverse image-to-image translation for natural images.
The cINN combines the purely generative INN model with an unconstrained feed-forward network, which efficiently preprocesses the conditioning image into maximally informative features.
arXiv Detail & Related papers (2021-05-05T15:10:37Z)
- In the light of feature distributions: moment matching for Neural Style Transfer [27.25600860698314]
Style transfer aims to render the content of a given image in the graphical/artistic style of another image.
We show that most current implementations of that concept have important theoretical and practical limitations.
We propose a novel approach that matches the desired style more precisely, while still being computationally efficient.
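Moment matching, the concept this paper refines, is typified by the Gram-matrix style loss: matching the second moments of CNN features between the stylized and style images. The sketch below shows that baseline; the function name is illustrative, and this is the kind of first/second-order statistic the main paper above argues cannot fully capture the feature distribution.

```python
import numpy as np

def gram_style_loss(feats_a, feats_b):
    """Mean squared difference between Gram matrices of two feature sets.

    feats_*: arrays of shape (n_pixels, n_channels). The Gram matrix
    G = F^T F / n collects the (uncentered) second moments across
    channels, which is what classic Gatys-style transfer matches.
    """
    n_a, n_b = feats_a.shape[0], feats_b.shape[0]
    gram_a = feats_a.T @ feats_a / n_a
    gram_b = feats_b.T @ feats_b / n_b
    return float(np.mean((gram_a - gram_b) ** 2))
```

Two feature sets can share identical Gram matrices while following very different distributions, which is exactly the gap that distribution-level losses such as the Wasserstein distance aim to close.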
arXiv Detail & Related papers (2021-03-12T11:00:44Z)
- The Mind's Eye: Visualizing Class-Agnostic Features of CNNs [92.39082696657874]
We propose an approach to visually interpret CNN features given a set of images by creating corresponding images that depict the most informative features of a specific layer.
Our method uses a dual-objective activation and distance loss, without requiring a generator network nor modifications to the original model.
arXiv Detail & Related papers (2021-01-29T07:46:39Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair as is common, or they can choose to input a content/texture-style/geometry-style triple.
arXiv Detail & Related papers (2020-07-10T16:33:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.