Puff-Net: Efficient Style Transfer with Pure Content and Style Feature Fusion Network
- URL: http://arxiv.org/abs/2405.19775v1
- Date: Thu, 30 May 2024 07:41:07 GMT
- Title: Puff-Net: Efficient Style Transfer with Pure Content and Style Feature Fusion Network
- Authors: Sizhe Zheng, Pan Gao, Peng Zhou, Jie Qin
- Abstract summary: Style transfer aims to render an image with the artistic features of a style image, while maintaining the original structure.
It is difficult for CNN-based methods to handle global information and long-range dependencies between input images.
We propose a novel network termed Puff-Net, i.e., a pure content and style feature fusion network.
- Score: 32.12413686394824
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Style transfer aims to render an image with the artistic features of a style image, while maintaining the original structure. Various methods have been put forward for this task, but some challenges still exist. For instance, it is difficult for CNN-based methods to handle global information and long-range dependencies between input images, for which transformer-based methods have been proposed. Although transformers can better model the relationship between content and style images, they require high-cost hardware and time-consuming inference. To address these issues, we design a novel transformer model that includes only the encoder, thus significantly reducing the computational cost. In addition, we find that existing style transfer methods may lead to under-stylized images or missing content. In order to achieve better stylization, we design a content feature extractor and a style feature extractor, based on which pure content and style images can be fed to the transformer. Finally, we propose a novel network termed Puff-Net, i.e., a pure content and style feature fusion network. Through qualitative and quantitative experiments, we demonstrate the advantages of our model compared to state-of-the-art ones in the literature.
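The abstract describes an encoder-only pipeline: two extractors purify the inputs, and a transformer encoder fuses the purified features before a lightweight decoder renders the output. Below is a minimal PyTorch sketch of that pipeline; all module names, layer choices, and sizes are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class PuffNetSketch(nn.Module):
    """Minimal sketch of the pipeline in the abstract: extractors purify
    the inputs, an encoder-only transformer fuses them, and a small
    decoder renders the result. Names and sizes are hypothetical."""

    def __init__(self, dim=512, heads=8, layers=3):
        super().__init__()
        # Hypothetical extractors standing in for the content feature
        # extractor and style feature extractor described in the paper.
        self.content_extractor = nn.Conv2d(3, dim, kernel_size=8, stride=8)
        self.style_extractor = nn.Conv2d(3, dim, kernel_size=8, stride=8)
        # Encoder-only transformer: no decoder stack, which is the main
        # source of the claimed computational savings.
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                               batch_first=True)
        self.fusion = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.to_rgb = nn.ConvTranspose2d(dim, 3, kernel_size=8, stride=8)

    def forward(self, content, style):
        c = self.content_extractor(content)   # (B, dim, H/8, W/8)
        s = self.style_extractor(style)
        b, d, h, w = c.shape
        # Concatenate content and style tokens and fuse them jointly.
        tokens = torch.cat([c.flatten(2), s.flatten(2)], dim=2).transpose(1, 2)
        fused = self.fusion(tokens)[:, :h * w]  # keep the content positions
        return self.to_rgb(fused.transpose(1, 2).reshape(b, d, h, w))
```

In the actual model the extractors and decoder would be trained with content and style losses; the sketch only shows the structural point that fusion needs encoder layers alone.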
Related papers
- Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z)
- Master: Meta Style Transformer for Controllable Zero-Shot and Few-Shot Artistic Style Transfer [83.1333306079676]
In this paper, we devise a novel Transformer model, termed Master, specifically for style transfer.
In the proposed model, different Transformer layers share a common group of parameters, which (1) reduces the total number of parameters, (2) leads to more robust training convergence, and (3) makes it easy to control the degree of stylization.
Experiments demonstrate the superiority of Master under both zero-shot and few-shot style transfer settings.
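A minimal PyTorch sketch of this layer-sharing scheme (names and sizes are assumptions, not the paper's code):

```python
import torch.nn as nn

class SharedLayerEncoder(nn.Module):
    """One TransformerEncoderLayer's parameters are reused for every
    pass, so depth adds computation but no extra parameters, and the
    number of passes can act as a stylization-strength knob."""

    def __init__(self, dim=512, heads=8, passes=6):
        super().__init__()
        self.shared = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                                 batch_first=True)
        self.passes = passes

    def forward(self, x, passes=None):
        # Varying the pass count at inference time varies stylization.
        for _ in range(passes or self.passes):
            x = self.shared(x)
        return x
```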
arXiv Detail & Related papers (2023-04-24T04:46:39Z)
- Edge Enhanced Image Style Transfer via Transformers [6.666912981102909]
Arbitrary image style transfer has attracted increasing attention.
It is difficult to maintain a good trade-off between content details and style features.
We present STT, a new transformer-based method for image style transfer, together with an edge loss.
arXiv Detail & Related papers (2023-01-02T10:39:31Z)
- DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization [66.42741426640633]
DiffStyler is a dual diffusion processing architecture to control the balance between the content and style of diffused results.
We propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structure information of the content image.
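As a rough illustration of the idea, assuming the learnable noise is a blend of the content image and Gaussian noise (the paper's actual construction may differ), a sketch in PyTorch:

```python
import torch

def content_based_init(content, alpha):
    """Start reverse denoising from content-derived noise rather than
    pure Gaussian noise, so structure from the content image survives.
    content: (B, C, H, W); alpha: learnable scalar in [0, 1].
    This blend is an assumption for illustration, not DiffStyler's code."""
    noise = torch.randn_like(content)
    return alpha * content + (1.0 - alpha) * noise
```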
arXiv Detail & Related papers (2022-11-19T12:30:44Z)
- Line Search-Based Feature Transformation for Fast, Stable, and Tunable Content-Style Control in Photorealistic Style Transfer [26.657485176782934]
Photorealistic style transfer is the task of synthesizing a realistic-looking image when adapting the content from one image to appear in the style of another image.
Modern models embed a transformation that fuses features describing the content image and style image and then decodes the resulting feature into a stylized image.
We introduce a general-purpose transformation that enables controlling the balance between how much content is preserved and the strength of the infused style.
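The paper's line search-based transform is not reproduced here; the sketch below only illustrates the general fuse-then-control pattern it builds on, using classic parameter-free AdaIN statistic matching with a single strength knob (all names are illustrative):

```python
import torch

def adain(content_feat, style_feat, eps=1e-5):
    # Parameter-free statistic matching: give the content features the
    # channel-wise mean/std of the style features.
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean

def controllable_transform(content_feat, style_feat, strength=0.5):
    # One knob interpolates between preserving content features
    # (strength=0) and fully applying the style statistics (strength=1).
    stylized = adain(content_feat, style_feat)
    return (1.0 - strength) * content_feat + strength * stylized
```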
arXiv Detail & Related papers (2022-10-12T08:05:49Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
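A sketch of what a contrastive objective over style codes might look like, using a standard InfoNCE loss (an assumed form for illustration, not CAST's exact loss):

```python
import torch
import torch.nn.functional as F

def style_contrastive_loss(anchor, positive, negatives, tau=0.07):
    """Pull together style codes from the same style (anchor/positive)
    and push away codes from other styles (negatives).
    anchor, positive: (B, D); negatives: (B, K, D)."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)
    pos = (anchor * positive).sum(-1, keepdim=True) / tau       # (B, 1)
    neg = torch.einsum('bd,bkd->bk', anchor, negatives) / tau   # (B, K)
    logits = torch.cat([pos, neg], dim=1)
    # The positive sits at index 0 of each row of logits.
    labels = torch.zeros(anchor.size(0), dtype=torch.long,
                         device=anchor.device)
    return F.cross_entropy(logits, labels)
```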
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- StyTr^2: Unbiased Image Style Transfer with Transformers [59.34108877969477]
The goal of image style transfer is to render an image with artistic features guided by a style reference while maintaining the original content.
Traditional neural style transfer methods are usually biased, and content leakage can be observed when the style transfer process is run several times with the same reference image.
We propose a transformer-based approach, namely StyTr^2, to address this critical issue.
arXiv Detail & Related papers (2021-05-30T15:57:09Z)
- Arbitrary Style Transfer via Multi-Adaptation Network [109.6765099732799]
A desired style transfer, given a content image and referenced style painting, would render the content image with the color tone and vivid stroke patterns of the style painting.
A new disentanglement loss function enables our network to extract main style patterns and exact content structures to adapt to various input images.
arXiv Detail & Related papers (2020-05-27T08:00:22Z)
- A Content Transformation Block For Image Style Transfer [16.25958537802466]
This paper explicitly focuses on a content- and style-aware stylization of a content image.
We utilize similar content appearing in photographs and style samples to learn how style alters content details.
The robustness and speed of our model enable video stylization in real time and at high definition.
arXiv Detail & Related papers (2020-03-18T18:00:23Z)
- Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.