Birth of a Painting: Differentiable Brushstroke Reconstruction
- URL: http://arxiv.org/abs/2511.13191v1
- Date: Mon, 17 Nov 2025 09:55:53 GMT
- Title: Birth of a Painting: Differentiable Brushstroke Reconstruction
- Authors: Ying Jiang, Jiayin Lu, Yunuo Chen, Yumeng He, Kui Wu, Yin Yang, Chenfanfu Jiang
- Abstract summary: Painting embodies a unique form of visual storytelling, where the creation process is as significant as the final artwork. Our approach produces realistic and stylized appearances, offering a unified model for digital painting.
- Score: 25.61763988336406
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Painting embodies a unique form of visual storytelling, where the creation process is as significant as the final artwork. Although recent advances in generative models have enabled visually compelling painting synthesis, most existing methods focus solely on final image generation or patch-based process simulation, lacking explicit stroke structure and failing to produce smooth, realistic shading. In this work, we present a differentiable stroke reconstruction framework that unifies painting, stylized texturing, and smudging to faithfully reproduce the human painting-smudging loop. Given an input image, our framework first optimizes single- and dual-color Bezier strokes through a parallel differentiable paint renderer, followed by a style generation module that synthesizes geometry-conditioned textures across diverse painting styles. We further introduce a differentiable smudge operator to enable natural color blending and shading. Coupled with a coarse-to-fine optimization strategy, our method jointly optimizes stroke geometry, color, and texture under geometric and semantic guidance. Extensive experiments on oil, watercolor, ink, and digital paintings demonstrate that our approach produces realistic and expressive stroke reconstructions, smooth tonal transitions, and richly stylized appearances, offering a unified model for expressive digital painting creation. See our project page for more demos: https://yingjiang96.github.io/DiffPaintWebsite/.
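The abstract's core idea, optimizing soft-edged Bezier strokes through a differentiable renderer, can be illustrated with a minimal sketch. The function names, the quadratic (rather than higher-order) curve, and the Gaussian soft-coverage model are illustrative assumptions, not the paper's actual renderer; the point is only that rasterizing a stroke with a smooth falloff keeps pixel values differentiable in the stroke parameters, so geometry and color can be fitted by gradient descent.

```python
import numpy as np

def quadratic_bezier(p0, p1, p2, t):
    """Evaluate a quadratic Bezier curve at parameter values t (vectorized)."""
    t = t[:, None]
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def render_stroke(p0, p1, p2, color, width, size=32, samples=64):
    """Rasterize one soft-edged Bezier stroke onto a size x size RGB canvas.

    A Gaussian falloff around the curve makes pixel coverage a smooth
    function of the control points, so gradients can flow through it.
    """
    t = np.linspace(0.0, 1.0, samples)
    curve = quadratic_bezier(np.asarray(p0, float), np.asarray(p1, float),
                             np.asarray(p2, float), t)          # (samples, 2)
    ys, xs = np.mgrid[0:size, 0:size]
    pix = np.stack([xs, ys], axis=-1).astype(float)             # (size, size, 2)
    # Distance from each pixel to the nearest sampled point on the curve.
    d = np.linalg.norm(pix[:, :, None, :] - curve[None, None, :, :],
                       axis=-1).min(axis=-1)
    alpha = np.exp(-(d / width) ** 2)                           # soft coverage in [0, 1]
    return alpha[..., None] * np.asarray(color)                 # (size, size, 3)

canvas = render_stroke(p0=(4, 4), p1=(16, 28), p2=(28, 4),
                       color=(0.8, 0.2, 0.1), width=2.0)
```

In an autodiff framework the same formulation would let the control points, width, and color be optimized jointly against a target image, which is the role the paper's parallel differentiable paint renderer plays at much larger scale.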
Related papers
- PaintFlow: A Unified Framework for Interactive Oil Paintings Editing and Generation [47.72342715926692]
Oil painting is a high-level medium that blends human abstract thinking with artistic expression. Existing generation and editing techniques are often constrained by the distribution of training data. We introduce a unified multimodal framework for oil painting generation and editing.
arXiv Detail & Related papers (2025-12-09T12:31:00Z) - Loomis Painter: Reconstructing the Painting Process [56.713812157283805]
Step-by-step painting tutorials are vital for learning artistic techniques, but existing video resources lack interactivity and personalization. We propose a unified framework for multi-media painting process generation with a semantics-driven style control mechanism. We also build a large-scale dataset of real painting processes and evaluate cross-media consistency, temporal coherence, and final-image fidelity.
arXiv Detail & Related papers (2025-11-21T16:06:32Z) - Vectorized Region Based Brush Strokes for Artistic Rendering [3.5297361401370044]
Recent stroke-based painting systems focus on capturing stroke details by predicting and iteratively refining stroke parameters. These methods often struggle to produce stroke compositions that align with artistic principles and intent. We propose an image-to-painting method that (i) facilitates semantic guidance for brush strokes in targeted regions, (ii) computes the brush stroke parameters, and (iii) establishes a sequence among segments and strokes to sequentially render the final painting.
arXiv Detail & Related papers (2025-06-11T17:45:36Z) - Emergence of Painting Ability via Recognition-Driven Evolution [49.666177849272856]
We present a model with a stroke branch and a palette branch that together simulate human-like painting. We quantify the efficiency of visual communication by measuring the recognition accuracy achieved with machine vision. Experimental results show that our model achieves superior performance in high-level recognition tasks.
arXiv Detail & Related papers (2025-01-09T04:37:31Z) - CreativeSynth: Cross-Art-Attention for Artistic Image Synthesis with Multimodal Diffusion [73.08710648258985]
Key painting attributes including layout, perspective, shape, and semantics often cannot be conveyed and expressed through style transfer. Large-scale pretrained text-to-image generation models have demonstrated their capability to synthesize a vast amount of high-quality images. Our main novel idea is to integrate multimodal semantic information as a synthesis guide into artworks, rather than transferring style to the real world.
arXiv Detail & Related papers (2024-01-25T10:42:09Z) - Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
arXiv Detail & Related papers (2023-09-07T06:27:39Z) - Inversion-Based Style Transfer with Diffusion Models [78.93863016223858]
Previous arbitrary example-guided artistic image generation methods often fail to control shape changes or convey stylistic elements.
We propose an inversion-based style transfer method (InST), which can efficiently and accurately learn the key information of an image.
arXiv Detail & Related papers (2022-11-23T18:44:25Z) - Perceptual Artifacts Localization for Inpainting [60.5659086595901]
We propose a new learning task of automatic segmentation of inpainting perceptual artifacts.
We train advanced segmentation networks on a dataset to reliably localize inpainting artifacts within inpainted images.
We also propose a new evaluation metric called Perceptual Artifact Ratio (PAR), which is the ratio of objectionable inpainted regions to the entire inpainted area.
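The PAR metric defined above is a straightforward ratio and can be sketched directly; the function name and the toy masks below are illustrative, but the computation follows the stated definition: objectionable inpainted pixels divided by all inpainted pixels.

```python
import numpy as np

def perceptual_artifact_ratio(artifact_mask, inpaint_mask):
    """PAR: fraction of the inpainted area flagged as perceptually objectionable."""
    inpainted = inpaint_mask.astype(bool)
    # Only count artifact pixels that lie inside the inpainted region.
    artifacts = artifact_mask.astype(bool) & inpainted
    return artifacts.sum() / max(inpainted.sum(), 1)  # guard against empty masks

# Toy example: a 4x4 inpainted patch with a 2x2 flagged artifact region.
inpaint = np.zeros((8, 8), dtype=bool); inpaint[2:6, 2:6] = True    # 16 pixels
artifact = np.zeros((8, 8), dtype=bool); artifact[2:4, 2:4] = True  # 4 pixels
par = perceptual_artifact_ratio(artifact, inpaint)  # 4 / 16 = 0.25
```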
arXiv Detail & Related papers (2022-08-05T18:50:51Z) - PTGCF: Printing Texture Guided Color Fusion for Impressionism Oil Painting Style Rendering [0.3249853429482705]
The extraction of style information such as stroke texture and color of the target style image is the key to image stylization.
A new stroke rendering method is proposed, which fully considers the tonal characteristics and the representative color of the original oil painting image.
The experiments have validated the efficacy of the proposed model.
arXiv Detail & Related papers (2022-07-26T00:31:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.