PTGCF: Printing Texture Guided Color Fusion for Impressionism Oil
Painting Style Rendering
- URL: http://arxiv.org/abs/2207.12585v2
- Date: Wed, 27 Jul 2022 10:12:12 GMT
- Title: PTGCF: Printing Texture Guided Color Fusion for Impressionism Oil
Painting Style Rendering
- Authors: Jing Geng, Li'e Ma, Xiaoquan Li, Yijun Yan
- Abstract summary: The extraction of style information such as stroke texture and color from the target style image is the key to image stylization.
A new stroke rendering method is proposed that fully considers the tonal characteristics and representative colors of the original oil painting image.
Experiments validate the efficacy of the proposed model.
- Score: 0.3249853429482705
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As a major branch of Non-Photorealistic Rendering (NPR), image stylization
uses computer algorithms to render a photo into an artistic painting. Recent
work has shown that the extraction of style information such as stroke texture
and color from the target style image is the key to image stylization. Building
on these stroke-texture and color characteristics, a new stroke rendering
method is proposed that fully considers the tonal characteristics and the
representative colors of the original oil painting, fitting the tone of the
original painting into the stylized image so that the result approaches the
artist's creative effect. Experiments validate the efficacy of the proposed
model. The method is best suited to the work of pointillist painters, whose
strokes have a relatively uniform sense of direction, especially in natural
scenes; when the original brushstrokes have a strong directional flow, the
simulated brushwork texture can be less satisfactory.
Related papers
- Free-Lunch Color-Texture Disentanglement for Stylized Image Generation [58.406368812760256]
This paper introduces the first tuning-free approach to achieve free-lunch color-texture disentanglement in stylized T2I generation.
We develop techniques for separating and extracting Color-Texture Embeddings (CTE) from individual color and texture reference images.
To ensure that the color palette of the generated image aligns closely with the color reference, we apply a whitening and coloring transformation (a minimal WCT sketch follows this entry).
arXiv Detail & Related papers (2025-03-18T14:10:43Z)
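As a hedged aside, "whitening and coloring transformation" most likely refers to the standard WCT applied to feature maps; whether this paper uses exactly that formulation is an assumption. A minimal NumPy sketch on flattened (C, N) feature matrices, with the `eps` regularizer and function name being my additions:

```python
import numpy as np

def wct(content_feat, style_feat, eps=1e-5):
    """Whitening and coloring transform on (C, N) feature matrices:
    strip the content covariance, then impose the style covariance."""
    cf = content_feat - content_feat.mean(axis=1, keepdims=True)
    sf = style_feat - style_feat.mean(axis=1, keepdims=True)
    # Whitening: decorrelate the content features.
    cc = cf @ cf.T / (cf.shape[1] - 1) + eps * np.eye(cf.shape[0])
    ew, ev = np.linalg.eigh(cc)
    whitened = ev @ np.diag(ew ** -0.5) @ ev.T @ cf
    # Coloring: re-correlate with the style covariance.
    sc = sf @ sf.T / (sf.shape[1] - 1) + eps * np.eye(sf.shape[0])
    es, vs = np.linalg.eigh(sc)
    colored = vs @ np.diag(es ** 0.5) @ vs.T @ whitened
    return colored + style_feat.mean(axis=1, keepdims=True)
```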
- Emergence of Painting Ability via Recognition-Driven Evolution [49.666177849272856]
We present a model with a stroke branch and a palette branch that together simulate human-like painting.
We quantify the efficiency of visual communication by measuring the recognition accuracy achieved with machine vision.
Experimental results show that our model achieves superior performance in high-level recognition tasks.
arXiv Detail & Related papers (2025-01-09T04:37:31Z)
- Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss (a rough sketch of the idea follows this entry).
arXiv Detail & Related papers (2023-09-07T06:27:39Z)
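The entry only names the loss, so the following is a speculative sketch of one way to make a distance transform differentiable, not the paper's formulation: replace the hard minimum over stroke pixels with a temperature-controlled softmin. All names and the O(N^2) construction are mine, and it is only practical for small maps.

```python
import torch

def soft_distance_transform(mask, tau=1.0, big=1e4):
    """Differentiable relaxation of a distance transform.
    mask: (H, W) stroke probabilities in [0, 1]."""
    H, W = mask.shape
    ys, xs = torch.meshgrid(torch.arange(H, dtype=torch.float32),
                            torch.arange(W, dtype=torch.float32),
                            indexing="ij")
    coords = torch.stack([ys.reshape(-1), xs.reshape(-1)], dim=1)  # (N, 2)
    pair = torch.cdist(coords, coords)                             # (N, N)
    # Make non-stroke pixels effectively unreachable, then take a
    # softmin over candidates so gradients flow through `mask`.
    penalized = pair + big * (1.0 - mask.reshape(-1)).unsqueeze(0)
    soft_min = -tau * torch.logsumexp(-penalized / tau, dim=1)
    return soft_min.reshape(H, W)

def dt_loss(pred_mask, target_mask):
    """Compare stroke layouts through their (soft) distance fields."""
    return torch.abs(soft_distance_transform(pred_mask)
                     - soft_distance_transform(target_mask)).mean()
```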
- Controlling Geometric Abstraction and Texture for Artistic Images [0.22835610890984162]
We present a novel method for the interactive control of geometric abstraction and texture in artistic images.
Previous example-based stylization methods often entangle shape, texture, and color, while generative methods for image synthesis generally make assumptions about the input image.
By contrast, our holistic approach spatially decomposes the input into shapes and a parametric representation of high-frequency details comprising the image's texture, thus enabling independent control of color and texture.
arXiv Detail & Related papers (2023-07-31T20:37:43Z)
- Inversion-Based Style Transfer with Diffusion Models [78.93863016223858]
Previous arbitrary example-guided artistic image generation methods often fail to control shape changes or convey elements.
We propose an inversion-based style transfer method (InST), which can efficiently and accurately learn the key information of an image.
arXiv Detail & Related papers (2022-11-23T18:44:25Z)
- Perceptual Artifacts Localization for Inpainting [60.5659086595901]
We propose a new learning task of automatic segmentation of inpainting perceptual artifacts.
We train advanced segmentation networks on a dataset to reliably localize inpainting artifacts within inpainted images.
We also propose a new evaluation metric, the Perceptual Artifact Ratio (PAR): the ratio of objectionable inpainted regions to the entire inpainted area (a minimal sketch follows this entry).
arXiv Detail & Related papers (2022-08-05T18:50:51Z)
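The summary's definition of PAR translates directly into a few lines. The mask names below are hypothetical, and the paper's protocol for flagging "objectionable" pixels (a trained segmentation network) is not reproduced here:

```python
import numpy as np

def perceptual_artifact_ratio(artifact_mask, inpaint_mask):
    """PAR: fraction of the inpainted area flagged as artifacts.
    Both arguments are boolean (H, W) masks."""
    inpainted = inpaint_mask.astype(bool)
    artifacts = artifact_mask.astype(bool) & inpainted
    n = inpainted.sum()
    return float(artifacts.sum()) / float(n) if n else 0.0
```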
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
- Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush: users dip style from anywhere and paint it onto any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z)
- Texture for Colors: Natural Representations of Colors Using Variable Bit-Depth Textures [13.180922099929765]
We present an automated method to transform an image to a set of binary textures that represent not only the intensities, but also the colors of the original.
The system yields aesthetically pleasing binary images when tested on a variety of image sources.
arXiv Detail & Related papers (2021-05-04T21:22:02Z)
- A deep learning based interactive sketching system for fashion images design [47.09122395308728]
We propose an interactive system to design diverse high-quality garment images from fashion sketches and the texture information.
The major challenge behind this system is to generate high-quality and detailed texture according to the user-provided texture information.
In particular, we propose a novel bi-colored edge texture representation to synthesize textured garment images and a shading enhancer to render shading based on the grayscale edges.
arXiv Detail & Related papers (2020-10-09T07:50:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including the summaries above) and is not responsible for any consequences arising from its use.