Deformable Style Transfer
- URL: http://arxiv.org/abs/2003.11038v2
- Date: Mon, 20 Jul 2020 02:50:28 GMT
- Title: Deformable Style Transfer
- Authors: Sunnie S. Y. Kim, Nicholas Kolkin, Jason Salavon, Gregory
Shakhnarovich
- Abstract summary: We propose deformable style transfer (DST), an optimization-based approach that jointly stylizes the texture and geometry of a content image to better match a style image.
We demonstrate our method on a diverse set of content and style images including portraits, animals, objects, scenes, and paintings.
- Score: 14.729482749508374
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Both geometry and texture are fundamental aspects of visual style. Existing
style transfer methods, however, primarily focus on texture, almost entirely
ignoring geometry. We propose deformable style transfer (DST), an
optimization-based approach that jointly stylizes the texture and geometry of a
content image to better match a style image. Unlike previous geometry-aware
stylization methods, our approach is neither restricted to a particular domain
(such as human faces), nor does it require training sets of matching
style/content pairs. We demonstrate our method on a diverse set of content and
style images including portraits, animals, objects, scenes, and paintings. Code
has been made publicly available at https://github.com/sunniesuhyoung/DST.
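As a rough illustration of the core idea, here is a minimal, hypothetical PyTorch sketch that optimizes the output pixels and a coarse deformation field together under content, style, and deformation-regularization terms. It is not the authors' implementation (that lives in the repository above); the single-layer VGG losses and the free-form flow warp are simplifying assumptions.

```python
# Hypothetical sketch only; see https://github.com/sunniesuhyoung/DST for the
# authors' actual implementation.
import torch
import torch.nn.functional as F
import torchvision.models as models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen VGG-19 slice as a feature extractor (the layer cut is arbitrary).
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features[:22]
vgg = vgg.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def gram(feat):
    # Channel-correlation (Gram) matrix used as a simple texture statistic.
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def warp(img, flow):
    # Geometry stylization: resample the image through a smooth displacement
    # field bilinearly upsampled from a coarse grid of offsets.
    _, _, h, w = img.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=img.device),
        torch.linspace(-1, 1, w, device=img.device),
        indexing="ij")
    base = torch.stack((xs, ys), dim=-1).unsqueeze(0)            # (1, H, W, 2)
    up = F.interpolate(flow, size=(h, w), mode="bilinear",
                       align_corners=True).permute(0, 2, 3, 1)   # (1, H, W, 2)
    return F.grid_sample(img, base + up, align_corners=True)

def stylize(content, style, steps=300, w_style=1e4, w_def=10.0):
    # content, style: normalized (1, 3, H, W) tensors already on `device`.
    img = content.clone().requires_grad_(True)
    flow = torch.zeros(1, 2, 16, 16, device=device, requires_grad=True)
    style_gram = gram(vgg(style)).detach()
    content_feat = vgg(content).detach()
    opt = torch.optim.Adam([img, flow], lr=2e-2)
    for _ in range(steps):
        opt.zero_grad()
        feat = vgg(warp(img, flow))
        loss = F.mse_loss(feat, content_feat)                        # content
        loss = loss + w_style * F.mse_loss(gram(feat), style_gram)   # texture
        loss = loss + w_def * flow.pow(2).mean()     # keep the warp gentle
        loss.backward()
        opt.step()
    return warp(img, flow).detach()
```

In DST itself the deformation is guided by matched points between the content and style images rather than by an unconstrained flow field; the regularizer above merely stands in for that guidance.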
Related papers
- PS-StyleGAN: Illustrative Portrait Sketching using Attention-Based Style Adaptation [0.0]
Portrait sketching involves capturing identity-specific attributes of a real face with abstract lines and shades.
This paper introduces Portrait Sketching StyleGAN (PS-StyleGAN), a style transfer approach tailored for portrait sketch synthesis.
We leverage the semantic $W+$ latent space of StyleGAN to generate portrait sketches, allowing us to make meaningful edits, like pose and expression alterations, without compromising identity.
arXiv Detail & Related papers (2024-08-31T04:22:45Z)
- Portrait Diffusion: Training-free Face Stylization with Chain-of-Painting [64.43760427752532]
Face stylization refers to the transformation of a face into a specific portrait style.
Current methods rely on example-based adaptation to fine-tune pre-trained generative models.
This paper proposes a training-free face stylization framework, named Portrait Diffusion.
arXiv Detail & Related papers (2023-12-03T06:48:35Z)
- Any-to-Any Style Transfer: Making Picasso and Da Vinci Collaborate [58.83278629019384]
Style transfer aims to render the style of a reference image onto another image that provides the content.
Existing approaches either apply the holistic style of the style image in a global manner, or migrate local colors and textures of the style image to the content counterparts in a pre-defined way.
We propose Any-to-Any Style Transfer, which enables users to interactively select styles of regions in the style image and apply them to the prescribed content regions.
arXiv Detail & Related papers (2023-04-19T15:15:36Z)
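As a rough sketch of the per-region idea in Any-to-Any Style Transfer above, the hypothetical snippet below matches channel-wise feature statistics only inside paired, user-selected masks. The masks, the feature maps, and the statistic-matching loss are illustrative assumptions, not the paper's actual mechanism.

```python
# Hypothetical sketch; masks and the statistic-matching loss are assumptions.
import torch

def masked_stats(feat, mask, eps=1e-5):
    # feat: (B, C, H, W) feature map; mask: (B, 1, H, W) binary region mask.
    area = mask.sum(dim=(2, 3), keepdim=True).clamp_min(1.0)
    mu = (feat * mask).sum(dim=(2, 3), keepdim=True) / area
    var = ((feat - mu) ** 2 * mask).sum(dim=(2, 3), keepdim=True) / area
    return mu, (var + eps).sqrt()

def region_style_loss(out_feat, style_feat, content_masks, style_masks):
    # Pair each user-selected content region with its chosen style region and
    # penalize mismatched feature means/stds inside the paired regions.
    loss = out_feat.new_zeros(())
    for cm, sm in zip(content_masks, style_masks):
        mu_o, sd_o = masked_stats(out_feat, cm)
        mu_s, sd_s = masked_stats(style_feat, sm)
        loss = loss + (mu_o - mu_s).pow(2).mean() + (sd_o - sd_s).pow(2).mean()
    return loss
```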
- 3D Face Arbitrary Style Transfer [18.09280257466941]
We propose a novel method, Face-guided Dual Style Transfer (FDST).
FDST employs a 3D decoupling module to separate facial geometry and texture.
We show that FDST can be applied in many downstream tasks, including region-controllable style transfer, high-fidelity face texture reconstruction, and artistic face reconstruction.
arXiv Detail & Related papers (2023-03-14T08:51:51Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
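The input-dependent temperature mentioned for UCAST above can be pictured with the following hypothetical sketch: an InfoNCE-style loss whose temperature is predicted from each anchor's style code by a small head. The head architecture and loss form are assumptions for illustration, not the paper's exact design.

```python
# Hypothetical sketch; the temperature head and loss form are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveContrastiveLoss(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        # Small head predicting a positive per-sample temperature
        # from the anchor's style code.
        self.temp_head = nn.Sequential(
            nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Softplus())

    def forward(self, anchor, positive, negatives):
        # anchor, positive: (B, dim) style codes of the stylized output and
        # its style image; negatives: (B, K, dim) codes of other styles.
        anchor = F.normalize(anchor, dim=-1)
        positive = F.normalize(positive, dim=-1)
        negatives = F.normalize(negatives, dim=-1)
        tau = self.temp_head(anchor) + 1e-2      # input-dependent temperature
        pos = (anchor * positive).sum(-1, keepdim=True)        # (B, 1)
        neg = torch.einsum("bd,bkd->bk", anchor, negatives)    # (B, K)
        logits = torch.cat([pos, neg], dim=1) / tau
        labels = torch.zeros(anchor.size(0), dtype=torch.long,
                             device=anchor.device)  # positive sits at index 0
        return F.cross_entropy(logits, labels)

# Example with random codes: a batch of 4 anchors with 8 negatives each.
loss_fn = AdaptiveContrastiveLoss(dim=128)
loss = loss_fn(torch.randn(4, 128), torch.randn(4, 128), torch.randn(4, 8, 128))
```

Intuitively, a lower predicted temperature sharpens the contrast for style codes the head deems distinctive, while a higher one softens it.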
- Realtime Fewshot Portrait Stylization Based On Geometric Alignment [32.224973317381426]
This paper presents a portrait stylization method designed for real-time mobile applications with limited style examples available.
Previous learning-based stylization methods suffer from the geometric and semantic gaps between the portrait domain and the style domain.
Based on the geometric prior of human facial attributes, we propose to utilize geometric alignment to tackle this issue.
arXiv Detail & Related papers (2022-11-28T16:53:19Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method serves as a brush: users can dip style from anywhere in the style image and paint it onto any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z)
- Exemplar-Based 3D Portrait Stylization [23.585334925548064]
We present the first framework for one-shot 3D portrait style transfer.
It can generate 3D face models with both the geometry exaggerated and the texture stylized.
Our method achieves robust results across different artistic styles and outperforms existing methods.
arXiv Detail & Related papers (2021-04-29T17:59:54Z)
- Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair as is common, or they can choose to input a content/texture-style/geometry-style triple.
arXiv Detail & Related papers (2020-07-10T16:33:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.