Filter Style Transfer between Photos
- URL: http://arxiv.org/abs/2007.07925v1
- Date: Wed, 15 Jul 2020 18:09:35 GMT
- Title: Filter Style Transfer between Photos
- Authors: Jonghwa Yim, Jisung Yoo, Won-joon Do, Beomsu Kim, Jihwan Choe
- Abstract summary: Filter Style Transfer (FST) is the first style transfer method that can transfer custom filter effects between FHD images in under 2 ms on a mobile device without any textual context loss.
- Score: 11.796188165729342
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Over the past few years, image-to-image style transfer has risen to the
frontiers of neural image processing. While conventional methods were
successful in various tasks such as color and texture transfer between images,
none could effectively work with the custom filter effects that are applied by
users through various platforms like Instagram. In this paper, we introduce a
new concept of style transfer, Filter Style Transfer (FST). Unlike conventional
style transfer, the new FST technique can extract and transfer a custom filter style
from a filtered style image to a content image. FST first infers the original
image from a filtered reference via image-to-image translation. Then it
estimates filter parameters from the difference between them. To resolve the
ill-posed nature of reconstructing the original image from the reference, we
represent each pixel color of an image as a class mean and deviation. In addition,
to handle intra-class color variation, we propose an uncertainty-based
weighted least-squares method for restoring the original image. To the best of
our knowledge, FST is the first style transfer method that can transfer custom
filter effects between FHD images in under 2 ms on a mobile device without any
textual context loss.
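The filter-estimation step described above can be illustrated with a small sketch. This is a hypothetical simplification, not the paper's implementation: it assumes the filter acts as a per-channel affine transform (`filtered ≈ a · original + b`) and that a per-pixel variance map encodes the intra-class color uncertainty, so that uncertain pixels get a smaller weight in the least-squares fit. The function name and the affine model are illustrative assumptions.

```python
import numpy as np

def estimate_affine_filter(original, filtered, variance):
    """Estimate per-channel affine filter parameters (a, b) so that
    filtered ≈ a * original + b, using uncertainty-based weighted least
    squares: each pixel is weighted by the inverse of its color-class
    variance, so uncertain pixels contribute less to the fit.

    original, filtered, variance: float arrays of shape (H, W, 3).
    Returns an array of shape (3, 2) holding (a, b) per channel.
    """
    params = []
    for c in range(3):
        x = original[..., c].ravel()
        y = filtered[..., c].ravel()
        # Inverse-variance weights; small epsilon avoids division by zero.
        w = 1.0 / (variance[..., c].ravel() + 1e-6)
        sqrt_w = np.sqrt(w)
        # Weighted least squares: minimize sum_i w_i * (a*x_i + b - y_i)^2,
        # solved by scaling the design matrix and target by sqrt(w).
        A = np.stack([x, np.ones_like(x)], axis=1)
        a_b, *_ = np.linalg.lstsq(A * sqrt_w[:, None], y * sqrt_w, rcond=None)
        params.append(a_b)
    return np.array(params)
```

With such per-channel parameters in hand, transferring the filter to a new content image would amount to applying the same affine transform to each channel.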
Related papers
- Neural Style Transfer for Vector Graphics [3.8983556368110226]
Style transfer between vector images has not previously been considered.
Applying standard content and style losses changes the drawing style of a vector image only insignificantly.
A new method based on differentiable rasterization can change the color and shape parameters of the content image to match the drawing style of the style image.
arXiv Detail & Related papers (2023-03-06T16:57:45Z) - Scaling Painting Style Transfer [10.059627473725508]
Neural style transfer (NST) is a technique that produces an unprecedentedly rich style transfer from a style image to a content image.
This paper presents a solution to the original global optimization problem for ultra-high-resolution (UHR) images.
We show that our method produces style transfer of unmatched quality for such high-resolution painting styles.
arXiv Detail & Related papers (2022-12-27T12:03:38Z) - Artistic Arbitrary Style Transfer [1.1279808969568252]
Arbitrary Style Transfer is a technique used to produce a new image from two images: a content image, and a style image.
Balancing the structure and style components has been the major challenge that other state-of-the-art algorithms have tried to solve.
In this work, we address these problems with a deep-learning approach based on convolutional neural networks.
arXiv Detail & Related papers (2022-12-21T21:34:00Z) - Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results.
arXiv Detail & Related papers (2022-07-12T09:49:21Z) - Patch-wise Contrastive Style Learning for Instagram Filter Removal [3.867363075280544]
Social media filters are one of the most common sources of corruption and perturbation for real-world visual analysis applications.
We introduce Contrastive Instagram Filter Removal Network (CIFR), which enhances this idea for Instagram filter removal by employing a novel multi-layer patch-wise contrastive style learning mechanism.
arXiv Detail & Related papers (2022-04-15T14:38:28Z) - Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush, dip style from anywhere, and then paint to any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z) - Saliency Constrained Arbitrary Image Style Transfer using SIFT and DCNN [22.57205921266602]
When common neural style transfer methods are used, the textures and colors in the style image are usually transferred imperfectly to the content image.
This paper proposes a novel saliency constrained method to reduce or avoid such effects.
The experiments show that the saliency maps of source images can help find the correct matching and avoid artifacts.
arXiv Detail & Related papers (2022-01-14T09:00:55Z) - StyTr^2: Unbiased Image Style Transfer with Transformers [59.34108877969477]
The goal of image style transfer is to render an image with artistic features guided by a style reference while maintaining the original content.
Traditional neural style transfer methods are usually biased, and content leak can be observed by running the style transfer process several times with the same reference image.
We propose a transformer-based approach, namely StyTr2, to address this critical issue.
arXiv Detail & Related papers (2021-05-30T15:57:09Z) - SAFIN: Arbitrary Style Transfer With Self-Attentive Factorized Instance Normalization [71.85169368997738]
Artistic style transfer aims to transfer the style characteristics of one image onto another image while retaining its content.
Self-Attention-based approaches have tackled this issue with partial success but suffer from unwanted artifacts.
This paper aims to combine the best of both worlds: self-attention and normalization.
arXiv Detail & Related papers (2021-05-13T08:01:01Z) - ArtFlow: Unbiased Image Style Transfer via Reversible Neural Flows [101.16791104543492]
ArtFlow is proposed to prevent content leak during universal style transfer.
It supports both forward and backward inferences and operates in a projection-transfer-reversion scheme.
It achieves comparable performance to state-of-the-art style transfer methods while avoiding content leak.
arXiv Detail & Related papers (2021-03-31T07:59:02Z) - Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair as is common, or they can choose to input a content/texture-style/geometry-style triple.
arXiv Detail & Related papers (2020-07-10T16:33:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.