Deep Preset: Blending and Retouching Photos with Color Style Transfer
- URL: http://arxiv.org/abs/2007.10701v2
- Date: Sat, 2 Jan 2021 10:53:45 GMT
- Title: Deep Preset: Blending and Retouching Photos with Color Style Transfer
- Authors: Man M. Ho, Jinjia Zhou
- Abstract summary: We focus on learning low-level image transformations, especially color-shifting methods, and present a novel scheme to train color style transfer with ground truth.
It is designed to 1) generalize the features representing the color transformation from natural-color content to the retouched reference, then blend them into the contextual features of the content.
We script Lightroom, a powerful photo-editing tool, to generate 600,000 training samples using 1,200 images from the Flickr2K dataset and 500 user-generated presets with 69 settings.
- Score: 15.95010869939508
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: End-users without photography expertise often wish to beautify their photos
to match the color style of a well-retouched reference. However, the definition of
style in recent image style transfer works is inappropriate for this task: they
usually synthesize undesirable results by transferring exact colors to the wrong
destinations, which becomes even worse in sensitive cases such as portraits.
In this work, we concentrate on learning low-level image transformations,
especially color-shifting methods, rather than mixing contextual features, and
present a novel scheme to train color style transfer with ground truth.
Furthermore, we propose a color style transfer network named Deep Preset. It is
designed to 1) generalize the features representing the color transformation
from natural-color content to the retouched reference, then blend them into the
contextual features of the content, 2) predict the hyper-parameters (settings, or
a preset) of the applied low-level color transformation methods, and 3) stylize
the content to share the reference's color style. We script Lightroom, a
powerful photo-editing tool, to generate 600,000 training samples using
1,200 images from the Flickr2K dataset and 500 user-generated presets with 69
settings. Experimental results show that our Deep Preset outperforms
previous works in color style transfer both quantitatively and qualitatively.
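The training scheme in the abstract lends itself to a compact illustration: applying a known preset to a natural photo yields an exact ground-truth output plus the preset's settings vector, so a network taking (content, reference) pairs can be supervised on both a stylized image and the 69 settings. Below is a minimal PyTorch sketch of such a dual-task setup; the module sizes, layer choices, and loss weights are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of the dual-task training scheme: regress the preset's
# settings vector and produce the stylized content, both with exact
# ground truth obtained by scripting the preset onto natural photos.
import torch
import torch.nn as nn

N_SETTINGS = 69  # the paper's presets expose 69 Lightroom settings

class DeepPresetSketch(nn.Module):
    def __init__(self, feat=32):
        super().__init__()
        # Contextual features of the content image.
        self.content_enc = nn.Sequential(
            nn.Conv2d(3, feat, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU())
        # Features representing the color transformation, extracted from
        # the content/reference pair (concatenated along channels).
        self.transform_enc = nn.Sequential(
            nn.Conv2d(6, feat, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        # Head (2): predict the preset's hyper-parameters.
        self.preset_head = nn.Linear(feat, N_SETTINGS)
        # Head (3): decode the blended features into the stylized content.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(feat * 2, feat, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(feat, 3, 4, stride=2, padding=1), nn.Sigmoid())

    def forward(self, content, reference):
        ctx = self.content_enc(content)
        t = self.transform_enc(torch.cat([content, reference], dim=1))
        preset = self.preset_head(t.flatten(1))
        # Blend the global transformation code into the contextual features.
        t_map = t.expand(-1, -1, ctx.shape[2], ctx.shape[3])
        out = self.decoder(torch.cat([ctx, t_map], dim=1))
        return out, preset

# Training triplet: a natural photo, a reference retouched with the same
# preset, and the ground truth (the photo with that preset applied).
model = DeepPresetSketch()
content = torch.rand(2, 3, 64, 64)     # natural-color photos
reference = torch.rand(2, 3, 64, 64)   # other photos, same preset applied
gt_image = torch.rand(2, 3, 64, 64)    # content with the preset applied
gt_preset = torch.rand(2, N_SETTINGS)  # the preset's normalized settings
out, preset = model(content, reference)
loss = nn.functional.l1_loss(out, gt_image) + \
       0.1 * nn.functional.mse_loss(preset, gt_preset)
loss.backward()
```

The two losses mirror the paper's idea that predicting the applied preset regularizes the stylization head toward reproducible, low-level color transformations rather than arbitrary contextual mixing.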
Related papers
- NCST: Neural-based Color Style Transfer for Video Retouching [3.2050418539021774]
Video color style transfer aims to transform the color style of an original video by using a reference style image.
Most existing methods employ neural networks, which come with challenges like opaque transfer processes.
We introduce a method that predicts specific parameters for color style transfer using two images.
arXiv Detail & Related papers (2024-11-01T03:25:15Z)
- NamedCurves: Learned Image Enhancement via Color Naming [35.01034487051896]
We propose NamedCurves, a learning-based image enhancement technique that separates the image into a small set of named colors.
Our method learns to globally adjust the image for each specific named color via tone curves and then combines the images using an attention-based fusion mechanism to mimic spatial editing (a rough sketch follows this entry).
arXiv Detail & Related papers (2024-07-13T13:51:25Z)
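The pipeline described in the NamedCurves entry (decompose into named colors, adjust each via a tone curve, fuse the results) can be mocked up in a few lines. In this NumPy sketch the named-color centers, the gamma-style curves, and the softmax masks are all hand-picked assumptions standing in for the paper's learned components.

```python
# Toy version of per-named-color tone-curve enhancement with soft fusion.
import numpy as np

# Hypothetical named-color centers in RGB (the paper learns its own naming).
NAMED_COLORS = {
    "red":   np.array([0.8, 0.2, 0.2]),
    "green": np.array([0.2, 0.7, 0.3]),
    "blue":  np.array([0.2, 0.3, 0.8]),
    "skin":  np.array([0.9, 0.7, 0.6]),
}
# One illustrative tone curve per named color (here: simple gamma curves).
GAMMAS = {"red": 0.8, "green": 1.0, "blue": 1.2, "skin": 0.9}

def soft_masks(img, temperature=10.0):
    """Soft assignment of each pixel to each named color
    (softmax over negative RGB distances)."""
    dists = np.stack([np.linalg.norm(img - c, axis=-1)
                      for c in NAMED_COLORS.values()], axis=-1)
    logits = -temperature * dists
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)  # (H, W, K)

def enhance(img):
    """Apply each named color's tone curve globally, then fuse with masks
    (a stand-in for the paper's attention-based fusion)."""
    masks = soft_masks(img)
    out = np.zeros_like(img)
    for k, name in enumerate(NAMED_COLORS):
        curved = np.clip(img, 1e-6, 1.0) ** GAMMAS[name]  # global tone curve
        out += masks[..., k:k + 1] * curved               # mask-weighted blend
    return out

img = np.random.rand(32, 32, 3).astype(np.float32)
print(enhance(img).shape)  # (32, 32, 3)
```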
- Palette-based Color Transfer between Images [9.471264982229508]
We propose a new palette-based color transfer method that can automatically generate a new color scheme.
With a redesigned palette-based clustering method, pixels can be classified into different segments according to color distribution.
Our method exhibits significant advantages over peer methods in terms of natural realism, color consistency, generality, and robustness (a toy sketch follows this entry).
arXiv Detail & Related papers (2024-05-14T01:41:19Z)
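Palette-based transfer can be approximated with off-the-shelf clustering: extract a small palette from each image, match palette entries, and move each pixel by its cluster's displacement. This NumPy sketch uses plain k-means and a luminance-rank matching heuristic, both assumptions in place of the paper's redesigned clustering and transfer.

```python
# Toy palette-based color transfer via k-means palettes.
import numpy as np

def kmeans_palette(pixels, k=5, iters=20, seed=0):
    """Plain k-means over RGB pixels; returns (centers, labels)."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None] - centers[None], axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return centers, labels

def luminance(c):
    return 0.299 * c[:, 0] + 0.587 * c[:, 1] + 0.114 * c[:, 2]

def palette_transfer(src_img, ref_img, k=5):
    src = src_img.reshape(-1, 3)
    ref = ref_img.reshape(-1, 3)
    src_pal, labels = kmeans_palette(src, k)
    ref_pal, _ = kmeans_palette(ref, k, seed=1)
    # Match palettes by luminance rank (an illustrative heuristic).
    src_order = np.argsort(luminance(src_pal))
    ref_order = np.argsort(luminance(ref_pal))
    mapping = dict(zip(src_order, ref_order))
    # Shift every pixel by its cluster's palette displacement.
    shift = np.zeros_like(src)
    for j in range(k):
        shift[labels == j] = ref_pal[mapping[j]] - src_pal[j]
    out = np.clip(src + shift, 0.0, 1.0)
    return out.reshape(src_img.shape)

src = np.random.rand(32, 32, 3)
ref = np.random.rand(32, 32, 3)
print(palette_transfer(src, ref).shape)  # (32, 32, 3)
```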
- Automatic Controllable Colorization via Imagination [55.489416987587305]
We propose a framework for automatic colorization that allows for iterative editing and modifications.
By understanding the content within a grayscale image, we utilize a pre-trained image generation model to generate multiple images that contain the same content.
These images serve as references for coloring, mimicking the process of human experts.
arXiv Detail & Related papers (2024-04-08T16:46:07Z)
- Dequantization and Color Transfer with Diffusion Models [5.228564799458042]
Quantized images offer an easy abstraction for patch-based edits and palette transfer.
We show that our model can generate natural images that respect the color palette the user asked for.
Our method can be usefully extended to another practical edit: recoloring patches of an image while respecting the source texture.
arXiv Detail & Related papers (2023-07-06T00:07:32Z)
- Any-to-Any Style Transfer: Making Picasso and Da Vinci Collaborate [58.83278629019384]
Style transfer aims to render the style of a given reference image onto another given content image.
Existing approaches either apply the holistic style of the style image in a global manner, or migrate local colors and textures of the style image to the content counterparts in a pre-defined way.
We propose Any-to-Any Style Transfer, which enables users to interactively select styles of regions in the style image and apply them to the prescribed content regions (a small sketch follows this entry).
arXiv Detail & Related papers (2023-04-19T15:15:36Z)
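The region-to-region idea in the Any-to-Any entry can be illustrated with per-region color-statistics matching: given a user-selected mask in each image, shift the masked content pixels toward the masked style region's mean and standard deviation. This Reinhard/AdaIN-style stand-in is an assumption; the paper's interactive method is more involved.

```python
# Toy region-to-region stylization via per-region mean/std matching.
import numpy as np

def match_stats(content, style, c_mask, s_mask, eps=1e-6):
    """Shift the masked content pixels to the masked style region's
    per-channel mean and standard deviation."""
    out = content.copy()
    c_px = content[c_mask]            # (Nc, 3) selected content pixels
    s_px = style[s_mask]              # (Ns, 3) selected style pixels
    c_mu, c_std = c_px.mean(0), c_px.std(0) + eps
    s_mu, s_std = s_px.mean(0), s_px.std(0) + eps
    out[c_mask] = np.clip((c_px - c_mu) / c_std * s_std + s_mu, 0.0, 1.0)
    return out

content = np.random.rand(64, 64, 3)
style = np.random.rand(64, 64, 3)
# Hypothetical user selections: the top half of the content takes the
# style of the left half of the style image.
c_mask = np.zeros((64, 64), dtype=bool); c_mask[:32, :] = True
s_mask = np.zeros((64, 64), dtype=bool); s_mask[:, :32] = True
result = match_stats(content, style, c_mask, s_mask)
print(result.shape)  # (64, 64, 3)
```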
- Neural Preset for Color Style Transfer [46.66925849502683]
We present a Neural Preset technique to address the limitations of existing color style transfer methods.
Our method is based on two core designs. First, we propose Deterministic Neural Color Mapping (DNCM) to consistently operate on each pixel.
Second, we develop a two-stage pipeline by dividing the task into color normalization and stylization (a per-pixel mapping sketch follows this entry).
arXiv Detail & Related papers (2023-03-23T17:59:10Z)
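The DNCM idea above, a deterministic mapping applied consistently to each pixel, can be sketched as an encoder that predicts one small color matrix and bias from a thumbnail and applies it identically to every full-resolution pixel, so the mapping cannot introduce spatial artifacts. The sizes and the thumbnail trick below are assumptions for illustration, not the actual DNCM design.

```python
# Minimal per-pixel deterministic color mapping in the spirit of DNCM.
import torch
import torch.nn as nn

class PerPixelColorMap(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 12))  # 9 matrix entries + 3 bias terms

    def forward(self, img):
        # Predict the transform from a small thumbnail (content-adaptive,
        # but the same mapping is used for every full-resolution pixel).
        thumb = nn.functional.interpolate(img, size=(64, 64),
                                          mode="bilinear",
                                          align_corners=False)
        params = self.encoder(thumb)
        mat = params[:, :9].reshape(-1, 3, 3)
        bias = params[:, 9:].reshape(-1, 1, 3)
        b, _, h, w = img.shape
        px = img.permute(0, 2, 3, 1).reshape(b, h * w, 3)  # pixels as rows
        out = torch.bmm(px, mat.transpose(1, 2)) + bias    # same map per pixel
        return out.reshape(b, h, w, 3).permute(0, 3, 1, 2)

model = PerPixelColorMap()
img = torch.rand(1, 3, 128, 128)
print(model(img).shape)  # torch.Size([1, 3, 128, 128])
```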
- Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results (a toy flow sketch follows this entry).
arXiv Detail & Related papers (2022-07-12T09:49:21Z)
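The ingredient enabling diverse results in the entry above is a normalizing flow over tone-style codes: sampling different Gaussian latents and inverting the flow yields distinct styles for the retouching network to condition on. Below is a toy PyTorch affine-coupling layer (RealNVP-style) demonstrating that ingredient; the dimensions and the single-layer flow are assumptions, not the TSFlow module itself.

```python
# One RealNVP-style affine coupling layer over a style code vector.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, 32), nn.ReLU(),
            nn.Linear(32, self.half * 2))  # predicts log-scale and shift

    def forward(self, x):
        """Style code -> latent (the normalizing direction)."""
        x1, x2 = x[:, :self.half], x[:, self.half:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        z2 = x2 * torch.exp(log_s) + t
        return torch.cat([x1, z2], dim=1), log_s.sum(dim=1)  # log|det J|

    def inverse(self, z):
        """Latent -> style code (used for sampling diverse styles)."""
        z1, z2 = z[:, :self.half], z[:, self.half:]
        log_s, t = self.net(z1).chunk(2, dim=1)
        x2 = (z2 - t) * torch.exp(-log_s)
        return torch.cat([z1, x2], dim=1)

flow = AffineCoupling(dim=8)
# Sampling: draw Gaussian latents and invert the flow to get distinct
# tone-style codes.
styles = flow.inverse(torch.randn(4, 8))
print(styles.shape)  # torch.Size([4, 8])
# Invertibility check: forward(inverse(z)) recovers z.
z = torch.randn(2, 8)
z_rec, _ = flow.forward(flow.inverse(z))
print(torch.allclose(z, z_rec, atol=1e-5))  # True
```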
- CAMS: Color-Aware Multi-Style Transfer [46.550390398057985]
Style transfer aims to manipulate the appearance of a source "content" image so that it shares texture and colors similar to those of a target "style" image.
A commonly used approach to assist in transferring styles is based on Gram matrix optimization.
We propose a color-aware multi-style transfer method that generates aesthetically pleasing results while preserving the style-color correlation between style and generated images (a Gram-matrix sketch follows this entry).
arXiv Detail & Related papers (2021-06-26T01:15:09Z)
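The Gram-matrix optimization mentioned in the CAMS entry is the classic style statistic from neural style transfer: correlations between feature channels. The sketch below computes Gram matrices and the resulting style loss; in practice the features come from a pretrained VGG layer, for which random tensors stand in here. A color-aware method like CAMS would additionally constrain which colors pair with which style patterns.

```python
# Gram-matrix style loss, the statistic underlying Gram-based transfer.
import torch

def gram_matrix(features):
    """Channel-by-channel correlations of a (B, C, H, W) feature map,
    normalized by the number of spatial positions."""
    b, c, h, w = features.shape
    f = features.reshape(b, c, h * w)
    return torch.bmm(f, f.transpose(1, 2)) / (h * w)

def style_loss(gen_feats, style_feats):
    """Mean squared difference between Gram matrices."""
    return torch.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2)

gen = torch.rand(1, 64, 32, 32, requires_grad=True)  # generated-image features
sty = torch.rand(1, 64, 32, 32)                      # style-image features
loss = style_loss(gen, sty)
loss.backward()  # gradients flow back to the generated image's features
print(loss.item() >= 0)  # True
```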
- Deep Line Art Video Colorization with a Few References [49.7139016311314]
We propose a deep architecture to automatically color line art videos with the same color style as the given reference images.
Our framework consists of a color transform network and a temporal constraint network.
Our model can achieve even better coloring results by fine-tuning the parameters with only a small amount of samples.
arXiv Detail & Related papers (2020-03-24T06:57:40Z)