Artistic Arbitrary Style Transfer
- URL: http://arxiv.org/abs/2212.11376v1
- Date: Wed, 21 Dec 2022 21:34:00 GMT
- Title: Artistic Arbitrary Style Transfer
- Authors: Weiting Li, Rahul Vyas, Ramya Sree Penta
- Abstract summary: Arbitrary Style Transfer is a technique used to produce a new image from two images: a content image, and a style image.
Balancing the structure and style components has been the major challenge that other state-of-the-art algorithms have tried to solve.
In this work, we address these problems with a deep learning approach based on Convolutional Neural Networks.
- Score: 1.1279808969568252
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Arbitrary Style Transfer is a technique used to produce a new image from two
images: a content image, and a style image. The newly produced image is unseen
and is generated from the algorithm itself. Balancing the structure and style
components has been the major challenge that other state-of-the-art algorithms
have tried to solve. Despite all these efforts, it remains a major challenge to
apply the artistic style as originally created on top of the structure of the
content image while maintaining consistency. In this work, we address these
problems with a deep learning approach based on Convolutional Neural Networks.
Our implementation first separates the foreground of the content image from its
background using the pre-trained Detectron2 model, and then applies the
Arbitrary Style Transfer technique used in SANet. Once we have the two styled
images, we stitch the two chunks back together after style transfer to form the
complete end piece.
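A minimal sketch of this pipeline is below, using Detectron2's standard model-zoo API for the segmentation step. The `stylize` callable is a hypothetical stand-in for a SANet inference wrapper, and the use of separate style images for foreground and background is an illustrative assumption rather than something stated in the abstract.

```python
import numpy as np
import torch
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

def foreground_mask(image_bgr: np.ndarray) -> np.ndarray:
    """Union of all instance masks predicted by a pre-trained Mask R-CNN."""
    cfg = get_cfg()
    cfg.merge_from_file(model_zoo.get_config_file(
        "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
    cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
        "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
    cfg.MODEL.DEVICE = "cuda" if torch.cuda.is_available() else "cpu"
    predictor = DefaultPredictor(cfg)
    masks = predictor(image_bgr)["instances"].pred_masks        # (N, H, W) bool
    if len(masks) == 0:
        return np.zeros(image_bgr.shape[:2], dtype=np.float32)  # no foreground found
    return masks.any(dim=0).cpu().numpy().astype(np.float32)

def stylize_and_stitch(content_bgr, fg_style_bgr, bg_style_bgr, stylize):
    """Style foreground and background separately, then stitch along the mask.
    `stylize(content, style) -> stylized image` stands in for the SANet step."""
    mask = foreground_mask(content_bgr)[..., None]              # (H, W, 1)
    styled_fg = stylize(content_bgr, fg_style_bgr)              # hypothetical SANet call
    styled_bg = stylize(content_bgr, bg_style_bgr)              # hypothetical SANet call
    stitched = mask * styled_fg + (1.0 - mask) * styled_bg      # compose the end piece
    return stitched.astype(np.uint8)
```

The stitching here is a hard mask composite; feathering the mask edge before blending is a common refinement when the segmentation boundary is visible in the result.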
Related papers
- Scaling Painting Style Transfer [10.059627473725508]
Neural style transfer (NST) is a technique that produces an unprecedentedly rich style transfer from a style image to a content image.
This paper presents a way to solve the original global optimization for ultra-high resolution (UHR) images.
We show that our method produces style transfer of unmatched quality for such high-resolution painting styles.
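For context, the "original global optimization" is the classic formulation in which the output image itself is optimized against VGG-based content and Gram-matrix style losses. A minimal PyTorch sketch of that baseline (not the paper's UHR-specific solution) follows, assuming batch-size-1 tensors normalized for VGG:

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"
vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = [0, 5, 10, 19, 28]   # conv1_1 ... conv5_1, used for Gram-matrix style loss
CONTENT_LAYER = 21                  # conv4_2, used for content loss

def features(x):
    feats, out = {}, x
    for i, layer in enumerate(vgg):
        out = layer(out)
        if i in STYLE_LAYERS or i == CONTENT_LAYER:
            feats[i] = out
    return feats

def gram(f):                        # assumes batch size 1
    _, c, h, w = f.shape
    f = f.view(c, h * w)
    return f @ f.t() / (c * h * w)

def transfer(content, style, steps=100, style_weight=1e6):
    """Optimize the pixels of the output image against content + style losses."""
    target = content.clone().detach().requires_grad_(True)
    opt = torch.optim.LBFGS([target])
    c_feat = features(content)[CONTENT_LAYER].detach()
    s_grams = {i: gram(f).detach() for i, f in features(style).items() if i in STYLE_LAYERS}
    for _ in range(steps):
        def closure():
            opt.zero_grad()
            t = features(target)
            loss = F.mse_loss(t[CONTENT_LAYER], c_feat)
            loss = loss + style_weight * sum(F.mse_loss(gram(t[i]), s_grams[i]) for i in STYLE_LAYERS)
            loss.backward()
            return loss
        opt.step(closure)
    return target.detach()
```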
arXiv Detail & Related papers (2022-12-27T12:03:38Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
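As an illustration of the contrastive idea (not necessarily CAST's exact loss), style codes produced by a style projector for stylized outputs can be pulled toward the codes of their own style images and pushed away from the other styles in the batch with an InfoNCE-style objective:

```python
import torch
import torch.nn.functional as F

def contrastive_style_loss(anchor_codes, positive_codes, temperature=0.2):
    """InfoNCE-style loss over style codes: each anchor (e.g. the code of a
    stylized output) should match the code of its own style image and repel
    the codes of the other styles in the batch."""
    a = F.normalize(anchor_codes, dim=1)      # (B, D)
    p = F.normalize(positive_codes, dim=1)    # (B, D)
    logits = a @ p.t() / temperature          # (B, B) similarity matrix
    labels = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, labels)

# Example: anchor_codes = projector(features_of_outputs)
#          positive_codes = projector(features_of_style_images)
```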
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush, dip style from anywhere, and then paint it onto any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z)
- Saliency Constrained Arbitrary Image Style Transfer using SIFT and DCNN [22.57205921266602]
When common neural style transfer methods are used, the textures and colors in the style image are usually transferred imperfectly to the content image.
This paper proposes a novel saliency constrained method to reduce or avoid such effects.
The experiments show that the saliency maps of source images can help find the correct matching and avoid artifacts.
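One generic reading of the saliency constraint is to compute style statistics separately for salient and non-salient regions, so that salient parts of the style image only drive salient parts of the output; a sketch of that idea is below (the paper's actual SIFT-based matching is not reproduced, and the region split is an assumption):

```python
import torch
import torch.nn.functional as F

def masked_gram(feat, mask):
    """Gram matrix restricted to pixels selected by a saliency mask.
    feat: (1, C, H, W) feature map; mask: (1, 1, h, w) saliency map in [0, 1]."""
    mask = F.interpolate(mask, size=feat.shape[-2:], mode="nearest")
    f = (feat * mask).flatten(2).squeeze(0)          # (C, H*W), non-salient pixels zeroed
    n = mask.sum().clamp(min=1.0) * feat.shape[1]
    return f @ f.t() / n

def saliency_style_loss(out_feat, style_feat, out_sal, style_sal):
    """Match style statistics salient-to-salient and background-to-background."""
    loss = F.mse_loss(masked_gram(out_feat, out_sal),
                      masked_gram(style_feat, style_sal))
    loss = loss + F.mse_loss(masked_gram(out_feat, 1 - out_sal),
                             masked_gram(style_feat, 1 - style_sal))
    return loss
```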
arXiv Detail & Related papers (2022-01-14T09:00:55Z)
- Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer [115.13853805292679]
Artistic style transfer aims at migrating the style from an example image to a content image.
Inspired by the common painting process of drawing a draft and then revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle).
Our method can synthesize high quality stylized images in real time, where holistic style patterns are properly transferred.
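The draft-then-revise scheme can be read as operating on a Laplacian pyramid: a drafting network stylizes a low-resolution base, and revision networks predict the high-frequency residuals that are added back at each level. A minimal sketch of the pyramid decomposition and reconstruction (the LapStyle networks themselves are not reproduced):

```python
import torch.nn.functional as F

def laplacian_pyramid(img, levels=2):
    """Split a (B, C, H, W) image into high-frequency residuals plus a low-res base."""
    residuals, current = [], img
    for _ in range(levels):
        down = F.avg_pool2d(current, kernel_size=2)
        up = F.interpolate(down, size=current.shape[-2:],
                           mode="bilinear", align_corners=False)
        residuals.append(current - up)   # detail lost by downsampling
        current = down
    return residuals, current

def reconstruct(residuals, base):
    """Invert the pyramid: upsample the base and add the residuals back, level by level."""
    current = base
    for res in reversed(residuals):
        current = F.interpolate(current, size=res.shape[-2:],
                                mode="bilinear", align_corners=False) + res
    return current
```

In a LapStyle-like setup, `base` would be replaced by the drafting network's low-resolution stylization and each residual by a revision network's predicted detail.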
arXiv Detail & Related papers (2021-04-12T11:53:53Z)
- Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair as is common, or they can choose to input a content/texture-style/geometry-style triple.
arXiv Detail & Related papers (2020-07-10T16:33:23Z)
- Real-time Universal Style Transfer on High-resolution Images via Zero-channel Pruning [74.09149955786367]
ArtNet can achieve universal, real-time, and high-quality style transfer on high-resolution images simultaneously.
By using ArtNet and S2, our method is 2.3 to 107.4 times faster than state-of-the-art approaches.
arXiv Detail & Related papers (2020-06-16T09:50:14Z)
- Generating Embroidery Patterns Using Image-to-Image Translation [2.055949720959582]
We propose two machine learning techniques to solve the embroidery image-to-image translation task.
Our goal is to generate a preview image which looks similar to an embroidered image, from a user-uploaded image.
Empirical results show that these techniques successfully generate an approximate preview of an embroidered version of a user image.
arXiv Detail & Related papers (2020-03-05T20:32:40Z)
- Neural arbitrary style transfer for portrait images using the attention mechanism [0.0]
Arbitrary style transfer is the task of synthesizing an image that has never been seen before.
In this paper, we consider an approach to solving this problem using the combined architecture of deep neural networks.
arXiv Detail & Related papers (2020-02-17T13:59:58Z)
- P$^2$-GAN: Efficient Style Transfer Using Single Style Image [2.703193151632043]
Style transfer is a useful image synthesis technique that can re-render a given image into another artistic style.
Generative Adversarial Network (GAN) is a widely adopted framework toward this task for its better representation ability on local style patterns.
This paper proposes a novel Patch Permutation GAN (P$^2$-GAN) network that can efficiently learn the stroke style from a single style image.
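One generic reading of patch permutation (details will differ from the paper) is to build "real" samples for a patch-level discriminator by cutting random patches from the single style image and tiling them in a shuffled grid, so the discriminator sees stroke-level statistics rather than the global layout:

```python
import torch

def patch_permutation(style_img, patch_size=32, grid=4):
    """Tile randomly cropped patches from one style image into a shuffled grid.
    style_img: (1, C, H, W) tensor; returns (1, C, grid*patch_size, grid*patch_size)."""
    _, _, h, w = style_img.shape
    patches = []
    for _ in range(grid * grid):
        y = torch.randint(0, h - patch_size + 1, (1,)).item()
        x = torch.randint(0, w - patch_size + 1, (1,)).item()
        patches.append(style_img[:, :, y:y + patch_size, x:x + patch_size])
    rows = [torch.cat(patches[r * grid:(r + 1) * grid], dim=3) for r in range(grid)]
    return torch.cat(rows, dim=2)
```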
arXiv Detail & Related papers (2020-01-21T12:08:08Z)
- Very Long Natural Scenery Image Prediction by Outpainting [96.8509015981031]
Outpainting has received less attention due to two challenges.
The first challenge is how to keep spatial and content consistency between the generated images and the original input.
The second challenge is how to maintain high quality in the generated results.
arXiv Detail & Related papers (2019-12-29T16:29:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.