DIFF-NST: Diffusion Interleaving For deFormable Neural Style Transfer
- URL: http://arxiv.org/abs/2307.04157v2
- Date: Tue, 11 Jul 2023 09:28:36 GMT
- Title: DIFF-NST: Diffusion Interleaving For deFormable Neural Style Transfer
- Authors: Dan Ruta, Gemma Canet Tarrés, Andrew Gilbert, Eli Shechtman,
Nicholas Kolkin, John Collomosse
- Abstract summary: We propose using a new class of models to perform style transfer while enabling deformable style transfer.
We show how leveraging the priors of these models can expose new artistic controls at inference time.
- Score: 27.39248034592382
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Style Transfer (NST) is the field of study applying neural techniques
to modify the artistic appearance of a content image to match the style of a
reference style image. Traditionally, NST methods have focused on texture-based
image edits, affecting mostly low-level information and keeping most image
structures the same. However, style-based deformation of the content is
desirable for some styles, especially in cases where the style is abstract or
the primary concept of the style is in its deformed rendition of some content.
With the recent introduction of diffusion models, such as Stable Diffusion, we
can access far more powerful image generation techniques, enabling new
possibilities. In our work, we propose using this new class of models to
perform style transfer while enabling deformable style transfer, an elusive
capability in previous models. We show how leveraging the priors of these
models can expose new artistic controls at inference time, and we document our
findings in exploring this new direction for the field of style transfer.
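For context on the texture-based edits the abstract contrasts with, here is a minimal PyTorch sketch of the classic Gram-matrix NST objective in the style of Gatys et al.; the encoder `enc` (returning a list of feature maps, e.g. from a frozen VGG) is a hypothetical stand-in:
```python
# Minimal sketch of a texture-based NST objective: content loss on deep
# features plus a Gram-matrix style loss. `enc` is an assumed frozen encoder
# that returns a list of feature maps from several layers.
import torch
import torch.nn.functional as F

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise feature correlations: (B, C, H, W) -> (B, C, C)."""
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def nst_loss(enc, output, content, style, style_weight=1e4):
    # Content loss preserves structure: match deep features of the content.
    content_loss = F.mse_loss(enc(output)[-1], enc(content)[-1])
    # Style loss edits texture only: Gram matrices discard spatial layout,
    # which is why most image structures stay the same.
    style_loss = sum(
        F.mse_loss(gram_matrix(fo), gram_matrix(fs))
        for fo, fs in zip(enc(output), enc(style))
    )
    return content_loss + style_weight * style_loss
```
Because the Gram matrix discards spatial arrangement, optimizing this objective changes texture while largely preserving content geometry, which is precisely the limitation that deformable style transfer targets.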
Related papers
- MuseumMaker: Continual Style Customization without Catastrophic Forgetting [50.12727620780213]
We propose MuseumMaker, a method that enables the synthesis of images following a set of customized styles in a never-ending manner.
When faced with a new customization style, we develop a style distillation loss module to extract and learn the styles of the training data for new image generation.
It can minimize the learning biases caused by the content of new training images and address the catastrophic overfitting issue induced by few-shot images.
arXiv Detail & Related papers (2024-04-25T13:51:38Z)
- Towards Highly Realistic Artistic Style Transfer via Stable Diffusion with Step-aware and Layer-aware Prompt [12.27693060663517]
Artistic style transfer aims to transfer the learned artistic style onto an arbitrary content image, generating artistic stylized images.
We propose a novel pre-trained diffusion-based artistic style transfer method, called LSAST.
Our proposed method can generate more highly realistic artistic stylized images than the state-of-the-art artistic style transfer methods.
arXiv Detail & Related papers (2024-04-17T15:28:53Z)
- ControlStyle: Text-Driven Stylized Image Generation Using Diffusion Priors [105.37795139586075]
We propose a new task for "stylizing" text-to-image models, namely text-driven stylized image generation.
We present a new diffusion model (ControlStyle) built by upgrading a pre-trained text-to-image model with a trainable modulation network.
Experiments demonstrate the effectiveness of our ControlStyle in producing more visually pleasing and artistic results.
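As a loose illustration of the frozen-backbone-plus-trainable-modulation idea in this summary (the FiLM-style scale/shift below is an assumption, not ControlStyle's documented design):
```python
# Hedged sketch: the pre-trained text-to-image weights stay frozen and only a
# small modulation network is optimized. The scale/shift form is illustrative.
import torch

class ModulationNetwork(torch.nn.Module):
    def __init__(self, style_dim: int, feat_channels: int):
        super().__init__()
        self.to_scale = torch.nn.Linear(style_dim, feat_channels)
        self.to_shift = torch.nn.Linear(style_dim, feat_channels)

    def forward(self, feat: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W) intermediate features; style: (B, style_dim).
        scale = self.to_scale(style).unsqueeze(-1).unsqueeze(-1)
        shift = self.to_shift(style).unsqueeze(-1).unsqueeze(-1)
        return feat * (1 + scale) + shift

# Only the modulation parameters would receive gradients, e.g.:
# for p in pretrained_unet.parameters():
#     p.requires_grad_(False)
```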
arXiv Detail & Related papers (2023-11-09T15:50:52Z)
- Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z)
- NeAT: Neural Artistic Tracing for Beautiful Style Transfer [29.38791171225834]
Style transfer is the task of reproducing the semantic content of a source image in the artistic style of a second target image.
We present NeAT, a new state-of-the-art feed-forward style transfer method.
We use BBST-4M to improve and measure the generalization of NeAT across a huge variety of styles.
arXiv Detail & Related papers (2023-04-11T11:08:13Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
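This listing does not give UCAST's exact loss; as a rough illustration of an input-dependent temperature inside a contrastive objective, consider the following sketch, where the `AdaptiveTemperature` head and the InfoNCE form are assumptions rather than the paper's formulation:
```python
# Hedged sketch of a contrastive style loss whose temperature is predicted
# per input, rather than fixed as a global hyperparameter.
import torch
import torch.nn.functional as F

class AdaptiveTemperature(torch.nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.head = torch.nn.Linear(dim, 1)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Predict a strictly positive temperature per style embedding.
        return F.softplus(self.head(z)) + 1e-4  # (B, 1)

def contrastive_style_loss(anchor, positive, negatives, temp_head):
    # anchor/positive: (B, D) style codes; negatives: (B, N, D).
    tau = temp_head(anchor)                                   # (B, 1)
    pos = F.cosine_similarity(anchor, positive, dim=-1).unsqueeze(1)
    neg = F.cosine_similarity(anchor.unsqueeze(1), negatives, dim=-1)
    logits = torch.cat([pos, neg], dim=1) / tau               # (B, 1 + N)
    # The positive pair sits at index 0 of each row.
    target = torch.zeros(len(anchor), dtype=torch.long)
    return F.cross_entropy(logits, target)
```
Making the temperature a function of the input lets easy and hard style pairs be weighted differently, which is one plausible reading of "adaptive" here.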
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Neural Artistic Style Transfer with Conditional Adversaria [0.0]
A neural artistic style transformation model can modify the appearance of a simple image by adding the style of a famous image.
In this paper, we present two methods that step toward a style-image-independent neural style transfer model.
Our novel contribution is a unidirectional-GAN model that ensures cyclic consistency through its architecture.
arXiv Detail & Related papers (2023-02-08T04:34:20Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Pastiche Master: Exemplar-Based High-Resolution Portrait Style Transfer [103.54337984566877]
Recent studies on StyleGAN show high performance on artistic portrait generation by transfer learning with limited data.
We introduce a novel DualStyleGAN with flexible control of dual styles of the original face domain and the extended artistic portrait domain.
Experiments demonstrate the superiority of DualStyleGAN over state-of-the-art methods in high-quality portrait style transfer and flexible style control.
arXiv Detail & Related papers (2022-03-24T17:57:11Z)
- Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair as is common, or they can choose to input a content/texture-style/geometry-style triple.
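A minimal sketch of the two-stage geometry-then-texture pipeline this summary describes; `geo_net` and `tex_net` are hypothetical stand-ins for the paper's actual modules:
```python
# Hedged sketch: a geometry network deforms the content first, then a texture
# network stylizes the warped result. Module internals are assumptions.
import torch

def geometric_style_transfer(content, texture_style, geometry_style=None,
                             geo_net=None, tex_net=None):
    # With a single content/style pair, one image supplies both style roles.
    if geometry_style is None:
        geometry_style = texture_style
    # Stage 1: deform the content toward the geometry of the style image.
    warped = geo_net(content, geometry_style)
    # Stage 2: transfer texture onto the already-deformed content.
    return tex_net(warped, texture_style)
```
Separating the two stages is what lets users mix a geometry style from one image with a texture style from another.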
arXiv Detail & Related papers (2020-07-10T16:33:23Z)