Geometric Style Transfer
- URL: http://arxiv.org/abs/2007.05471v1
- Date: Fri, 10 Jul 2020 16:33:23 GMT
- Title: Geometric Style Transfer
- Authors: Xiao-Chang Liu, Xuan-Yi Li, Ming-Ming Cheng, Peter Hall
- Abstract summary: We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs before a network that transfers texture style.
Users can input a content/style pair, as is common, or they can choose to input a content/texture-style/geometry-style triple.
- Score: 74.58782301514053
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural style transfer (NST), where an input image is rendered in the style of
another image, has been a topic of considerable progress in recent years.
Research over that time has been dominated by transferring aspects of color and
texture, yet these factors are only one component of style. Other factors of
style include composition, the projection system used, and the way in which
artists warp and bend objects. Our contribution is to introduce a neural
architecture that supports transfer of geometric style. Unlike recent work in
this area, our method is general: it is not restricted by semantic content.
This new architecture runs prior to a network that transfers
texture style, enabling us to transfer texture to a warped image. This form of
network supports a second novelty: we extend the NST input paradigm. Users can
input a content/style pair, as is common, or they can choose to input a
content/texture-style/geometry-style triple. This three-image input paradigm
divides style into two parts and so provides significantly greater versatility
to the output we can produce. We provide user studies that show the quality of
our output, and quantify the importance of geometric style transfer to style
recognition by humans.
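As a concrete illustration of the two-stage design described above, the sketch below wires a geometry network, which predicts a warp field that bends the content toward the geometry style, in front of a stand-in texture network. This is a minimal sketch under assumptions: the module names, layer choices, and grid-based warp parameterization are illustrative, not the paper's actual architecture.

```python
# Hypothetical sketch of "warp first, then stylize"; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeometryNet(nn.Module):
    """Predicts a dense sampling grid from the concatenated content and
    geometry-style images, then warps the content with it."""
    def __init__(self):
        super().__init__()
        self.head = nn.Conv2d(6, 2, kernel_size=3, padding=1)  # toy warp head

    def forward(self, content, geometry_style):
        x = torch.cat([content, geometry_style], dim=1)        # (B, 6, H, W)
        offsets = torch.tanh(self.head(x))                     # (B, 2, H, W)
        b, _, h, w = offsets.shape
        # Identity sampling grid in normalized [-1, 1] coordinates.
        ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                                torch.linspace(-1, 1, w), indexing="ij")
        identity = torch.stack([xs, ys], dim=-1).expand(b, h, w, 2)
        grid = identity + 0.1 * offsets.permute(0, 2, 3, 1)    # small learned warp
        return F.grid_sample(content, grid, align_corners=True)

class TextureNet(nn.Module):
    """Stand-in for any feed-forward texture style-transfer network."""
    def forward(self, warped_content, texture_style):
        return warped_content  # a real network would stylize here

def geometric_style_transfer(content, texture_style, geometry_style=None):
    # Three-image paradigm: if no separate geometry style is given, the
    # texture style doubles as the geometry style, recovering the usual
    # two-image content/style interface.
    if geometry_style is None:
        geometry_style = texture_style
    warped = GeometryNet()(content, geometry_style)   # stage 1: geometry
    return TextureNet()(warped, texture_style)        # stage 2: texture
```

Because geometry runs first, the texture network sees an already-warped image, which is what allows the two style components to be mixed and matched independently.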
Related papers
- DIFF-NST: Diffusion Interleaving For deFormable Neural Style Transfer [27.39248034592382]
We propose using a new class of models to perform style transfer while also enabling deformation of the content.
We show how leveraging the priors of these models can expose new artistic controls at inference time.
arXiv Detail & Related papers (2023-07-09T12:13:43Z)
- Any-to-Any Style Transfer: Making Picasso and Da Vinci Collaborate [58.83278629019384]
Style transfer aims to render the style of a given reference image onto another image that provides the content.
Existing approaches either apply the holistic style of the style image in a global manner, or migrate local colors and textures of the style image to the content counterparts in a pre-defined way.
We propose Any-to-Any Style Transfer, which enables users to interactively select styles of regions in the style image and apply them to the prescribed content regions.
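A minimal sketch of the per-region idea, assuming user-supplied binary masks: each selected content region is re-colored with the statistics of a chosen style region via masked mean/std matching, an AdaIN-like transform applied in pixel space for simplicity. The real method operates on deep features with learned attention; everything here is illustrative.

```python
# Hypothetical per-region statistic matching; not the paper's actual model.
import numpy as np

def masked_stats(img, mask):
    """Per-channel mean and std over the masked pixels of an HxWx3 image."""
    pixels = img[mask]                       # (N, 3)
    return pixels.mean(axis=0), pixels.std(axis=0) + 1e-6

def region_style_transfer(content, style, region_pairs):
    """region_pairs: list of (content_mask, style_mask) boolean HxW arrays."""
    out = content.astype(np.float64).copy()
    style = style.astype(np.float64)
    for c_mask, s_mask in region_pairs:
        c_mean, c_std = masked_stats(out, c_mask)
        s_mean, s_std = masked_stats(style, s_mask)
        # Whiten the content region, then re-color it with the style
        # region's statistics.
        out[c_mask] = (out[c_mask] - c_mean) / c_std * s_std + s_mean
    return np.clip(out, 0, 255).astype(np.uint8)
```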
arXiv Detail & Related papers (2023-04-19T15:15:36Z)
- 3D Face Arbitrary Style Transfer [18.09280257466941]
We propose a novel method, namely Face-guided Dual Style Transfer (FDST).
FDST employs a 3D decoupling module to separate facial geometry and texture.
We show that FDST can be applied in many downstream tasks, including region-controllable style transfer, high-fidelity face texture reconstruction, and artistic face reconstruction.
arXiv Detail & Related papers (2023-03-14T08:51:51Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
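The input-dependent temperature is the most concrete mechanism named here, so a minimal sketch of it follows: a tiny head maps each style code to its own temperature, which then scales an InfoNCE-style contrastive loss. The head, the temperature range, and the loss wiring are assumptions for illustration, not UCAST's actual design.

```python
# Hypothetical adaptive-temperature contrastive loss; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveTemperature(nn.Module):
    """Maps a style code to a per-sample temperature in (t_min, t_max)."""
    def __init__(self, dim, t_min=0.05, t_max=0.5):
        super().__init__()
        self.head = nn.Linear(dim, 1)
        self.t_min, self.t_max = t_min, t_max

    def forward(self, style_code):
        s = torch.sigmoid(self.head(style_code)).squeeze(-1)   # (B,)
        return self.t_min + (self.t_max - self.t_min) * s

def contrastive_style_loss(anchor, positive, negatives, temp):
    """InfoNCE over style codes; `temp` holds one temperature per anchor."""
    anchor = F.normalize(anchor, dim=-1)                        # (B, D)
    positive = F.normalize(positive, dim=-1)                    # (B, D)
    negatives = F.normalize(negatives, dim=-1)                  # (B, K, D)
    pos = (anchor * positive).sum(-1, keepdim=True)             # (B, 1)
    neg = torch.einsum("bd,bkd->bk", anchor, negatives)         # (B, K)
    logits = torch.cat([pos, neg], dim=1) / temp.unsqueeze(1)   # per-sample scale
    target = torch.zeros(anchor.size(0), dtype=torch.long)      # positive = idx 0
    return F.cross_entropy(logits, target)
```

One plausible motivation for an input-dependent temperature is to let easy and hard styles be contrasted at different sharpness rather than with a single global constant.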
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Artistic Arbitrary Style Transfer [1.1279808969568252]
Arbitrary Style Transfer is a technique used to produce a new image from two images: a content image and a style image.
Balancing the structure and style components has been the major challenge that other state-of-the-art algorithms have tried to solve.
In this work, we address this challenge with a deep learning approach based on Convolutional Neural Networks.
arXiv Detail & Related papers (2022-12-21T21:34:00Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- SAFIN: Arbitrary Style Transfer With Self-Attentive Factorized Instance Normalization [71.85169368997738]
Artistic style transfer aims to transfer the style characteristics of one image onto another while retaining the latter's content.
Self-attention-based approaches have tackled this task with partial success but suffer from unwanted artifacts.
This paper aims to combine the best of both worlds: self-attention and normalization.
arXiv Detail & Related papers (2021-05-13T08:01:01Z)
- Exemplar-Based 3D Portrait Stylization [23.585334925548064]
We present the first framework for one-shot 3D portrait style transfer.
It can generate 3D face models with both the geometry exaggerated and the texture stylized.
Our method achieves robust, high-quality results across different artistic styles and outperforms existing methods.
arXiv Detail & Related papers (2021-04-29T17:59:54Z)
- 3DSNet: Unsupervised Shape-to-Shape 3D Style Transfer [66.48720190245616]
We propose a learning-based approach for style transfer between 3D objects.
The proposed method can synthesize new 3D shapes in the form of both point clouds and meshes.
We extend our technique to implicitly learn the multimodal style distribution of the chosen domains.
arXiv Detail & Related papers (2020-11-26T16:59:12Z)