Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer
- URL: http://arxiv.org/abs/2104.05376v1
- Date: Mon, 12 Apr 2021 11:53:53 GMT
- Title: Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer
- Authors: Tianwei Lin, Zhuoqi Ma, Fu Li, Dongliang He, Xin Li, Errui Ding, Nannan Wang, Jie Li, Xinbo Gao
- Abstract summary: Artistic style transfer aims at migrating the style from an example image to a content image.
Inspired by the common painting process of drawing a draft and revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle).
Our method can synthesize high quality stylized images in real time, where holistic style patterns are properly transferred.
- Score: 115.13853805292679
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Artistic style transfer aims at migrating the style from an example image to
a content image. Currently, optimization-based methods have achieved great
stylization quality, but expensive time cost restricts their practical
applications. Meanwhile, feed-forward methods still fail to synthesize complex
style, especially when holistic global and local patterns exist. Inspired by
the common painting process of drawing a draft and revising the details, we
introduce a novel feed-forward method named Laplacian Pyramid Network
(LapStyle). LapStyle first transfers global style patterns in low-resolution
via a Drafting Network. It then revises the local details in high-resolution
via a Revision Network, which hallucinates a residual image according to the
draft and the image textures extracted by Laplacian filtering. Higher
resolution details can be easily generated by stacking Revision Networks with
multiple Laplacian pyramid levels. The final stylized image is obtained by
aggregating outputs of all pyramid levels. We also introduce a patch
discriminator to better learn local patterns adversarially. Experiments
demonstrate that our method can synthesize high quality stylized images in real
time, where holistic style patterns are properly transferred.
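To make the drafting-and-revision pipeline concrete, here is a minimal PyTorch sketch of the data flow over a Laplacian pyramid. It is a sketch under assumptions, not the paper's implementation: the `drafting_net`/`revision_nets` stand-ins, the bilinear up/downsampling, and the channel-wise concatenation of draft and texture are illustrative choices; the actual LapStyle networks and training losses are more elaborate.

```python
import torch
import torch.nn.functional as F

def lap_split(img):
    """One Laplacian pyramid level: low-frequency base + high-frequency residual."""
    low = F.avg_pool2d(img, kernel_size=2)
    up = F.interpolate(low, scale_factor=2, mode="bilinear", align_corners=False)
    return low, img - up

def lapstyle_forward(content, style, drafting_net, revision_nets):
    """Drafting-and-revision flow: stylize a low-res draft, then refine it
    level by level with hallucinated residuals (hypothetical modules)."""
    base, highs = content, []
    for _ in revision_nets:                       # build the content pyramid
        base, high = lap_split(base)
        highs.append(high)

    stylized = drafting_net(base, style)          # global style at low resolution

    # Revise from coarse to fine: upsample the current result and add a
    # residual predicted from the draft plus the content's Laplacian textures.
    for rev, high in zip(reversed(revision_nets), reversed(highs)):
        up = F.interpolate(stylized, scale_factor=2, mode="bilinear",
                           align_corners=False)
        stylized = up + rev(torch.cat([up, high], dim=1))
    return stylized

# Toy run with stand-ins for the learned networks:
draft = lambda low, style: low                    # identity "drafting" stub
revs = [torch.nn.Conv2d(6, 3, 3, padding=1) for _ in range(2)]
out = lapstyle_forward(torch.rand(1, 3, 256, 256), None, draft, revs)
print(out.shape)                                  # torch.Size([1, 3, 256, 256])
```

Each Revision Network only predicts a residual at its level, which matches the abstract's description of stacking Revision Networks across pyramid levels to reach higher resolutions.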
Related papers
- Puff-Net: Efficient Style Transfer with Pure Content and Style Feature Fusion Network [32.12413686394824]
Style transfer aims to render an image with the artistic features of a style image, while maintaining the original structure.
It is difficult for CNN-based methods to handle global information and long-range dependencies between input images.
We propose a novel network termed Puff-Net, i.e., a pure content and style feature fusion network.
arXiv Detail & Related papers (2024-05-30T07:41:07Z)
- Locally Stylized Neural Radiance Fields [30.037649804991315]
We propose a stylization framework for neural radiance fields (NeRF) based on local style transfer.
In particular, we use a hash-grid encoding to learn the embedding of the appearance and geometry components.
We show that our method yields plausible stylization results with novel view synthesis.
arXiv Detail & Related papers (2023-09-19T15:08:10Z)
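The hash-grid encoding mentioned in this entry is in the spirit of multiresolution hash encodings such as Instant-NGP. Below is a toy 2D sketch of that general idea; the 2D domain, nearest-vertex lookup, table size, and hash function are simplifying assumptions, and the paper's actual appearance/geometry embedding differs.

```python
import torch
import torch.nn as nn

class ToyHashGrid2D(nn.Module):
    """Toy multiresolution hash-grid encoding (Instant-NGP flavor),
    simplified to 2D and nearest-vertex lookup for brevity."""

    def __init__(self, levels=4, table_size=2**14, feat_dim=2, base_res=16):
        super().__init__()
        self.table_size = table_size
        self.res = [base_res * 2**i for i in range(levels)]
        self.tables = nn.Parameter(torch.randn(levels, table_size, feat_dim) * 1e-2)

    def forward(self, xy):                        # xy in [0, 1]^2, shape (N, 2)
        feats = []
        for lvl, res in enumerate(self.res):
            idx = (xy.clamp(0, 1) * (res - 1)).long()          # nearest vertex
            h = (idx[:, 0] ^ (idx[:, 1] * 2654435761)) % self.table_size
            feats.append(self.tables[lvl][h])                  # (N, feat_dim)
        return torch.cat(feats, dim=-1)           # (N, levels * feat_dim)

enc = ToyHashGrid2D()
print(enc(torch.rand(8, 2)).shape)                # torch.Size([8, 8])
```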
- Scaling Painting Style Transfer [10.059627473725508]
Neural style transfer (NST) is a technique that produces an unprecedentedly rich style transfer from a style image to a content image.
This paper presents a solution to the original global optimization for ultra-high resolution (UHR) images.
We show that our method produces style transfer of unmatched quality for such high-resolution painting styles.
arXiv Detail & Related papers (2022-12-27T12:03:38Z)
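For context, the "original global optimization" is optimization-based NST in the style of Gatys et al., sketched minimally below. The layer indices, the single content layer, the loss weight, and the omission of ImageNet normalization are simplifying assumptions. At UHR sizes this direct optimization becomes prohibitively memory- and compute-hungry, which is the scaling problem the paper targets.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"
vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.to(device).eval()
LAYERS = {1, 6, 11, 20, 29}                # relu1_1 .. relu5_1 (style layers)

def features(x):
    feats = []
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in LAYERS:
            feats.append(x)
    return feats

def gram(f):                               # global style statistics
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def stylize(content, style, steps=300, style_weight=1e4):
    """content, style: (1, 3, H, W) tensors on `device`."""
    target_c = features(content)[3].detach()        # relu4_1 as content layer
    target_s = [gram(f).detach() for f in features(style)]
    img = content.clone().requires_grad_(True)
    opt = torch.optim.LBFGS([img], max_iter=steps)

    def closure():
        opt.zero_grad()
        feats = features(img)
        loss = F.mse_loss(feats[3], target_c)
        loss = loss + style_weight * sum(
            F.mse_loss(gram(f), g) for f, g in zip(feats, target_s))
        loss.backward()
        return loss

    opt.step(closure)
    return img.detach()
```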
- Arbitrary Style Transfer with Structure Enhancement by Combining the Global and Local Loss [51.309905690367835]
We introduce a novel arbitrary style transfer method with structure enhancement by combining the global and local loss.
Experimental results demonstrate that our method can generate higher-quality images with impressive visual effects.
arXiv Detail & Related papers (2022-07-23T07:02:57Z)
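One plausible way to combine a global and a local style objective (an illustration only, not this paper's exact losses) is a Gram-matrix term over whole feature maps plus a nearest-patch matching term:

```python
import torch
import torch.nn.functional as F

def gram(f):                                       # global style statistics
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def local_patch_loss(f_out, f_style, k=3):
    """Match each output patch to its most similar style patch (cosine)."""
    unfold = lambda f: F.normalize(F.unfold(f, k, padding=k // 2), dim=1)
    po, ps = unfold(f_out), unfold(f_style)        # (B, C*k*k, L)
    sim = torch.einsum("bcl,bcm->blm", po, ps)     # patch-wise cosine similarity
    return (1 - sim.max(dim=-1).values).mean()     # distance to best match

def combined_style_loss(f_out, f_style, w_global=1.0, w_local=1.0):
    return (w_global * F.mse_loss(gram(f_out), gram(f_style))
            + w_local * local_patch_loss(f_out, f_style))

f1, f2 = torch.rand(1, 64, 32, 32), torch.rand(1, 64, 32, 32)
print(combined_style_loss(f1, f2).item())
```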
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
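The contrastive part can be illustrated with a generic InfoNCE loss over style codes. This is a sketch of the general technique, not CAST's exact formulation; the crop-based positives/negatives are assumptions.

```python
import torch
import torch.nn.functional as F

def style_contrastive_loss(anchor, positive, negatives, tau=0.07):
    """InfoNCE over style codes: pull two codes of the same style together,
    push codes of other styles apart.

    anchor, positive: (B, D) style codes, e.g., from two crops of one artwork
    negatives:        (B, K, D) style codes from other artworks
    """
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    n = F.normalize(negatives, dim=-1)
    pos = (a * p).sum(-1, keepdim=True) / tau          # (B, 1)
    neg = torch.einsum("bd,bkd->bk", a, n) / tau       # (B, K)
    logits = torch.cat([pos, neg], dim=1)
    labels = torch.zeros(len(a), dtype=torch.long)     # positive is class 0
    return F.cross_entropy(logits, labels)

loss = style_contrastive_loss(torch.rand(4, 128), torch.rand(4, 128),
                              torch.rand(4, 16, 128))
print(loss.item())
```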
- Controllable Person Image Synthesis with Spatially-Adaptive Warped Normalization [72.65828901909708]
Controllable person image generation aims to produce realistic human images with desirable attributes.
We introduce a novel Spatially-Adaptive Warped Normalization (SAWN), which integrates a learned flow-field to warp modulation parameters.
We propose a novel self-training part replacement strategy to refine the pretrained model for the texture-transfer task.
arXiv Detail & Related papers (2021-05-31T07:07:44Z)
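A rough sketch of the warped-normalization idea: normalize the features, warp spatial modulation maps with a learned flow field, then modulate. The shapes and the instance-norm choice are assumptions; the paper's SAWN layer is more involved.

```python
import torch
import torch.nn.functional as F

def warped_modulation(feat, gamma, beta, flow):
    """Normalize features, warp per-pixel modulation maps with a learned
    flow field (via grid_sample), then modulate.

    feat, gamma, beta: (B, C, H, W); flow: (B, H, W, 2) in [-1, 1]
    """
    normed = F.instance_norm(feat)
    gamma_w = F.grid_sample(gamma, flow, align_corners=False)
    beta_w = F.grid_sample(beta, flow, align_corners=False)
    return normed * (1 + gamma_w) + beta_w

# Identity flow as a smoke test:
b, c, h, w = 2, 8, 16, 16
flow = F.affine_grid(torch.eye(2, 3).repeat(b, 1, 1), (b, c, h, w),
                     align_corners=False)
out = warped_modulation(torch.rand(b, c, h, w), torch.rand(b, c, h, w),
                        torch.rand(b, c, h, w), flow)
print(out.shape)                                       # torch.Size([2, 8, 16, 16])
```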
- Real-time Universal Style Transfer on High-resolution Images via Zero-channel Pruning [74.09149955786367]
ArtNet can achieve universal, real-time, and high-quality style transfer on high-resolution images simultaneously.
By using ArtNet and S2, our method is 2.3 to 107.4 times faster than state-of-the-art approaches.
arXiv Detail & Related papers (2020-06-16T09:50:14Z)
- P$^2$-GAN: Efficient Style Transfer Using Single Style Image [2.703193151632043]
Style transfer is a useful image synthesis technique that can re-render a given image in another artistic style.
The Generative Adversarial Network (GAN) is a widely adopted framework for this task because of its strong ability to represent local style patterns.
In this paper, a novel Patch Permutation GAN (P$^2$-GAN) that can efficiently learn stroke style from a single style image is proposed.
arXiv Detail & Related papers (2020-01-21T12:08:08Z)
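The patch-permutation idea can be sketched as follows: shuffle k-by-k patches of the single style image to manufacture many pseudo-samples of local strokes for a discriminator. The patch size, batch construction, and divisibility requirement below are illustrative assumptions, not the paper's exact procedure.

```python
import torch

def patch_permutation(style, k=32, batch=8):
    """Build training crops by randomly permuting kxk patches of a single
    style image, so a discriminator sees many pseudo-samples of strokes.

    style: (3, H, W) tensor with H, W divisible by k
    """
    c, h, w = style.shape
    gh, gw = h // k, w // k
    # (gh*gw, 3, k, k) grid of patches
    patches = (style.view(c, gh, k, gw, k)
                    .permute(1, 3, 0, 2, 4)
                    .reshape(gh * gw, c, k, k))
    out = []
    for _ in range(batch):
        perm = torch.randperm(gh * gw)             # shuffle patch positions
        shuffled = patches[perm].reshape(gh, gw, c, k, k)
        out.append(shuffled.permute(2, 0, 3, 1, 4).reshape(c, h, w))
    return torch.stack(out)                        # (batch, 3, H, W)

crops = patch_permutation(torch.rand(3, 128, 128))
print(crops.shape)                                 # torch.Size([8, 3, 128, 128])
```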
This list is automatically generated from the titles and abstracts of the papers on this site.