P$^2$-GAN: Efficient Style Transfer Using Single Style Image
- URL: http://arxiv.org/abs/2001.07466v2
- Date: Thu, 30 Jan 2020 16:37:22 GMT
- Title: P$^2$-GAN: Efficient Style Transfer Using Single Style Image
- Authors: Zhentan Zheng, Jianyi Liu
- Abstract summary: Style transfer is a useful image synthesis technique that can re-render a given image into another artistic style.
The Generative Adversarial Network (GAN) is a widely adopted framework for this task because of its better ability to represent local style patterns.
In this paper, a novel Patch Permutation GAN (P$^2$-GAN) network that can efficiently learn the stroke style from a single style image is proposed.
- Score: 2.703193151632043
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Style transfer is a useful image synthesis technique that can re-render a given
image into another artistic style while preserving its content information.
The Generative Adversarial Network (GAN) is a widely adopted framework for this
task because it represents local style patterns better than traditional
Gram-matrix based methods. However, most previous methods rely on a sufficient
amount of pre-collected style images to train the model. In this paper, a novel
Patch Permutation GAN (P$^2$-GAN) network that can efficiently learn the stroke
style from a single style image is proposed. We use patch permutation to
generate multiple training samples from the given style image. A patch
discriminator that can seamlessly process both patch-wise images and natural
images is designed. We also propose a local texture descriptor based criterion
to quantitatively evaluate style transfer quality. Experimental results show
that our method produces finer-quality re-renderings from a single style image
with improved computational efficiency compared with many state-of-the-art
methods.
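The core idea in the abstract is to turn one style image into many training samples by cutting it into patches and shuffling them. The abstract does not specify the patch size, stride, or framework, so the sketch below is only an illustration of patch permutation under assumed settings (NumPy, non-overlapping 32x32 patches), not the authors' implementation.

```python
# Minimal sketch of patch permutation for single-image style training.
# Assumptions: patch size, non-overlapping tiling, and NumPy are illustrative
# choices; the paper's exact settings are not given in the abstract.
import numpy as np

def patch_permute(style_img, patch=32, seed=None):
    """Cut a style image (H, W, C) into non-overlapping patches, shuffle them,
    and reassemble an image-shaped training sample of the same size."""
    rng = np.random.default_rng(seed)
    h, w, c = style_img.shape
    h, w = h - h % patch, w - w % patch            # crop to a multiple of the patch size
    img = style_img[:h, :w]
    # arrange into a (H/p, W/p, p, p, C) grid of patches
    grid = img.reshape(h // patch, patch, w // patch, patch, c).swapaxes(1, 2)
    flat = grid.reshape(-1, patch, patch, c)
    flat = flat[rng.permutation(len(flat))]        # random patch order
    # reassemble the shuffled patches into an image
    grid = flat.reshape(h // patch, w // patch, patch, patch, c).swapaxes(1, 2)
    return grid.reshape(h, w, c)

# Each call yields a differently permuted sample from the same style image;
# in a GAN setup such samples could be fed to a patch-level discriminator,
# as the abstract describes.
samples = [patch_permute(np.random.rand(256, 256, 3), seed=i) for i in range(4)]
```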
Related papers
- Any-to-Any Style Transfer: Making Picasso and Da Vinci Collaborate [58.83278629019384]
Style transfer aims to render the style of a given reference image onto another given content image.
Existing approaches either apply the holistic style of the style image in a global manner, or migrate local colors and textures of the style image to the content counterparts in a pre-defined way.
We propose Any-to-Any Style Transfer, which enables users to interactively select styles of regions in the style image and apply them to the prescribed content regions.
arXiv Detail & Related papers (2023-04-19T15:15:36Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture to control the balance between the content and style of diffused results.
We propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structure information of the content image.
arXiv Detail & Related papers (2022-11-19T12:30:44Z)
- MultiStyleGAN: Multiple One-shot Image Stylizations using a Single GAN [14.373091259972666]
A common scenario is one-shot stylization, where only one example is available for each reference style.
Recent approaches for one-shot stylization such as JoJoGAN fine-tune a pre-trained StyleGAN2 generator on a single style reference image.
We present MultiStyleGAN, a method capable of producing multiple different stylizations at once by fine-tuning a single generator.
arXiv Detail & Related papers (2022-10-08T23:05:29Z)
- Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results.
arXiv Detail & Related papers (2022-07-12T09:49:21Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Saliency Constrained Arbitrary Image Style Transfer using SIFT and DCNN [22.57205921266602]
When common neural style transfer methods are used, the textures and colors in the style image are usually transferred imperfectly to the content image.
This paper proposes a novel saliency constrained method to reduce or avoid such effects.
The experiments show that the saliency maps of source images can help find the correct matching and avoid artifacts.
arXiv Detail & Related papers (2022-01-14T09:00:55Z)
- Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer [115.13853805292679]
Artistic style transfer aims at migrating the style from an example image to a content image.
Inspired by the common painting process of drawing a draft and revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle)
Our method can synthesize high quality stylized images in real time, where holistic style patterns are properly transferred.
arXiv Detail & Related papers (2021-04-12T11:53:53Z)