Neural Artistic Style and Color Transfer Using Deep Learning
- URL: http://arxiv.org/abs/2508.08608v1
- Date: Tue, 12 Aug 2025 03:42:03 GMT
- Title: Neural Artistic Style and Color Transfer Using Deep Learning
- Authors: Justin London
- Abstract summary: We introduce a methodology that combines neural artistic style transfer with color transfer. The method uses the Kullback-Leibler divergence to quantitatively evaluate color and luminance histogram matching algorithms. Various experiments are performed to evaluate the KL divergence of these algorithms and their color histograms for style-to-content transfer.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural artistic style transfer blends the content representation of one image with the style of another. This enables artists to create unique, innovative visuals and enhances artistic expression in various fields including art, design, and film. Color transfer algorithms are an important tool in digital image processing, adjusting the color information of a target image based on the colors of a source image. Color transfer enhances images and videos in film and photography, and can aid in image correction. We introduce a methodology that combines neural artistic style transfer with color transfer. The method uses the Kullback-Leibler (KL) divergence to quantitatively evaluate color and luminance histogram matching algorithms, including Reinhard global color transfer, iterative distribution transfer (IDT), IDT with regrain, Cholesky, and PCA, between the original and the neural artistic style transferred image produced using deep learning. We estimate the color channel kernel densities. Various experiments are performed to evaluate the KL divergence of these algorithms and their color histograms for style-to-content transfer.
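The evaluation step described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: it estimates a Gaussian kernel density for each color channel and computes the discrete KL divergence between the densities of two images. The function names, the Silverman bandwidth rule, and the 256-point evaluation grid are assumptions.

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth=None):
    """Gaussian kernel density estimate of 1-D samples, evaluated on a grid
    and normalized to a discrete probability vector."""
    samples = np.asarray(samples, dtype=float)
    if bandwidth is None:
        # Silverman's rule of thumb (an assumed default)
        bandwidth = 1.06 * samples.std() * len(samples) ** (-1 / 5)
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    dens = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    return dens / dens.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions, clipped for numerical stability."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def channel_kl(source_img, target_img, channel):
    """KL divergence between the kernel-density estimates of one color
    channel of two images (arrays of shape H x W x 3, values in [0, 255])."""
    grid = np.linspace(0, 255, 256)
    p = gaussian_kde(source_img[..., channel].ravel(), grid)
    q = gaussian_kde(target_img[..., channel].ravel(), grid)
    return kl_divergence(p, q)
```

Summing `channel_kl` over the three channels gives one scalar per color-transfer algorithm; a lower value indicates a closer histogram match between the transferred result and the reference.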
Related papers
- Free-Lunch Color-Texture Disentanglement for Stylized Image Generation [58.406368812760256]
This paper introduces the first tuning-free approach to achieve free-lunch color-texture disentanglement in stylized T2I generation. We develop techniques for separating and extracting Color-Texture Embeddings (CTE) from individual color and texture reference images. To ensure that the color palette of the generated image aligns closely with the color reference, we apply a whitening and coloring transformation.
arXiv Detail & Related papers (2025-03-18T14:10:43Z) - U-StyDiT: Ultra-high Quality Artistic Style Transfer Using Diffusion Transformers [11.37321110116169]
We propose a novel artistic image style transfer method, U-StyDiT, built on transformer-based diffusion (DiT). We first design a Multi-view Style Modulator (MSM) to learn style information from a style image from local and global perspectives. Then, we introduce a StyDiT Block to learn content and style conditions simultaneously from a style image.
arXiv Detail & Related papers (2025-03-11T08:12:38Z) - Multiscale style transfer based on a Laplacian pyramid for traditional Chinese painting [6.248530911794617]
We present a novel effective multiscale style transfer method based on Laplacian pyramid decomposition and reconstruction. In the first stage, the holistic patterns are transferred at low resolution by adopting a Style Transfer Base Network. The details of the content and style are gradually enhanced at higher resolutions by a Detail Enhancement Network.
arXiv Detail & Related papers (2025-02-07T01:04:49Z) - NCST: Neural-based Color Style Transfer for Video Retouching [3.2050418539021774]
Video color style transfer aims to transform the color style of an original video by using a reference style image.
Most existing methods employ neural networks, which come with challenges like opaque transfer processes.
We introduce a method that predicts specific parameters for color style transfer using two images.
arXiv Detail & Related papers (2024-11-01T03:25:15Z) - Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z) - Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
arXiv Detail & Related papers (2023-09-07T06:27:39Z) - PalGAN: Image Colorization with Palette Generative Adversarial Networks [51.59276436217957]
We propose a new GAN-based colorization approach, PalGAN, integrated with palette estimation and chromatic attention.
PalGAN outperforms the state of the art in quantitative evaluation and visual comparison, delivering notably diverse, contrastive, and edge-preserving appearances.
arXiv Detail & Related papers (2022-10-20T12:28:31Z) - Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results.
arXiv Detail & Related papers (2022-07-12T09:49:21Z) - Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush, dip style from anywhere, and then paint to any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z) - Deep Line Art Video Colorization with a Few References [49.7139016311314]
We propose a deep architecture to automatically color line art videos with the same color style as the given reference images.
Our framework consists of a color transform network and a temporal constraint network.
Our model can achieve even better coloring results by fine-tuning the parameters with only a small amount of samples.
arXiv Detail & Related papers (2020-03-24T06:57:40Z)
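A recurring primitive in the color-alignment work above (e.g. the whitening and coloring transformation mentioned in the Free-Lunch paper) can be sketched as follows. This is an illustrative NumPy version operating on raw RGB pixel statistics; the papers apply the transform to learned feature embeddings, and the details of their formulations may differ.

```python
import numpy as np

def whitening_coloring(content, reference, eps=1e-5):
    """Align the color statistics of `content` to those of `reference`
    via a whitening-coloring transform.
    content, reference: (N, 3) float arrays of RGB pixels."""
    mu_c = content.mean(axis=0)
    mu_r = reference.mean(axis=0)
    Xc = content - mu_c
    Xr = reference - mu_r
    # Whitening: remove the content covariance structure
    cov_c = Xc.T @ Xc / len(Xc) + eps * np.eye(3)
    evals, evecs = np.linalg.eigh(cov_c)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T
    # Coloring: impose the reference covariance structure
    cov_r = Xr.T @ Xr / len(Xr) + eps * np.eye(3)
    evals, evecs = np.linalg.eigh(cov_r)
    C = evecs @ np.diag(evals ** 0.5) @ evecs.T
    # Recenter on the reference mean
    return (Xc @ W @ C) + mu_r
```

After the transform, the output pixels have (approximately) the mean and covariance of the reference palette, which is the sense in which the generated image's colors "align closely with the color reference".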
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.