Stroke-based Neural Painting and Stylization with Dynamically Predicted
Painting Region
- URL: http://arxiv.org/abs/2309.03504v2
- Date: Tue, 10 Oct 2023 09:01:57 GMT
- Title: Stroke-based Neural Painting and Stylization with Dynamically Predicted
Painting Region
- Authors: Teng Hu, Ran Yi, Haokun Zhu, Liang Liu, Jinlong Peng, Yabiao Wang,
Chengjie Wang, Lizhuang Ma
- Abstract summary: Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
- Score: 66.75826549444909
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stroke-based rendering aims to recreate an image with a set of strokes. Most
existing methods render complex images using a uniform-block-dividing
strategy, which leads to boundary inconsistency artifacts. To solve the
problem, we propose Compositional Neural Painter, a novel stroke-based
rendering framework which dynamically predicts the next painting region based
on the current canvas, instead of dividing the image plane uniformly into
painting regions. We start from an empty canvas and divide the painting process
into several steps. At each step, a compositor network trained with a phasic RL
strategy first predicts the next painting region, then a painter network
trained with a WGAN discriminator predicts stroke parameters, and a stroke
renderer paints the strokes onto the painting region of the current canvas.
Moreover, we extend our method to stroke-based style transfer with a novel
differentiable distance transform loss, which helps preserve the structure of
the input image during stroke-based stylization. Extensive experiments show our
model outperforms the existing models in both stroke-based neural painting and
stroke-based stylization. Code is available at
https://github.com/sjtuplayer/Compositional_Neural_Painter
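The stepwise process described in the abstract (compositor predicts a region, painter predicts strokes, renderer paints them onto the canvas) can be sketched as follows. This is a minimal illustrative skeleton, not the authors' implementation: `predict_region`, `predict_strokes`, and `render_strokes` are hypothetical placeholder stand-ins for the trained compositor network, painter network, and stroke renderer, and only the control flow follows the paper.

```python
import numpy as np

def predict_region(canvas):
    """Stand-in for the compositor network (trained with a phasic RL
    strategy in the paper): returns (x, y, w, h) of the next painting
    region on the current canvas. Here: a fixed placeholder region."""
    h, w = canvas.shape[:2]
    return (0, 0, w // 2, h // 2)

def predict_strokes(target_patch, canvas_patch, n_strokes=8):
    """Stand-in for the painter network (trained with a WGAN
    discriminator in the paper): returns stroke parameter vectors
    for the region. Here: random 13-dim vectors as placeholders."""
    return [np.random.rand(13) for _ in range(n_strokes)]

def render_strokes(canvas_patch, strokes):
    """Stand-in for the stroke renderer: paints strokes onto the
    patch. Here: a toy blend using the first 3 params as RGB."""
    out = canvas_patch.copy()
    for s in strokes:
        out = 0.9 * out + 0.1 * s[:3]
    return out

def paint(target, n_steps=4):
    canvas = np.zeros_like(target)           # start from an empty canvas
    for _ in range(n_steps):                 # painting divided into steps
        x, y, w, h = predict_region(canvas)  # 1) pick next painting region
        strokes = predict_strokes(           # 2) predict stroke parameters
            target[y:y+h, x:x+w], canvas[y:y+h, x:x+w])
        canvas[y:y+h, x:x+w] = render_strokes(  # 3) paint onto the region
            canvas[y:y+h, x:x+w], strokes)
    return canvas

canvas = paint(np.ones((64, 64, 3)))
print(canvas.shape)
```

The key difference from uniform-block-dividing methods is step 1: the region is a function of the current canvas state rather than a fixed grid cell, which is what avoids boundary inconsistency artifacts.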
Related papers
- AttentionPainter: An Efficient and Adaptive Stroke Predictor for Scene Painting [82.54770866332456]
Stroke-based Rendering (SBR) aims to decompose an input image into a sequence of parameterized strokes, which can be rendered into a painting that resembles the input image.
We propose AttentionPainter, an efficient and adaptive model for single-step neural painting.
arXiv Detail & Related papers (2024-10-21T18:36:45Z) - Inverse Painting: Reconstructing The Painting Process [24.57538165449989]
We formulate this as an autoregressive image generation problem, in which an initially blank "canvas" is iteratively updated.
The model learns from real artists by training on many painting videos.
arXiv Detail & Related papers (2024-09-30T17:56:52Z) - Sketch-guided Image Inpainting with Partial Discrete Diffusion Process [5.005162730122933]
We introduce a novel partial discrete diffusion process (PDDP) for sketch-guided inpainting.
PDDP corrupts the masked regions of the image and reconstructs these masked regions conditioned on hand-drawn sketches.
A proposed transformer module accepts two inputs, the image containing the masked region to be inpainted and the query sketch, to model the reverse diffusion process.
arXiv Detail & Related papers (2024-04-18T07:07:38Z) - Segmentation-Based Parametric Painting [22.967620358813214]
We introduce a novel image-to-painting method that facilitates the creation of large-scale, high-fidelity paintings with human-like quality and stylistic variation.
We introduce a segmentation-based painting process and a dynamic attention map approach inspired by human painting strategies.
Our optimized batch processing and patch-based loss framework enable efficient handling of large canvases.
arXiv Detail & Related papers (2023-11-24T04:15:10Z) - Perceptual Artifacts Localization for Inpainting [60.5659086595901]
We propose a new learning task of automatic segmentation of inpainting perceptual artifacts.
We train advanced segmentation networks on a dataset to reliably localize inpainting artifacts within inpainted images.
We also propose a new evaluation metric called Perceptual Artifact Ratio (PAR), which is the ratio of objectionable inpainted regions to the entire inpainted area.
arXiv Detail & Related papers (2022-08-05T18:50:51Z) - Cylin-Painting: Seamless 360° Panoramic Image Outpainting
and Beyond [136.18504104345453]
We present a Cylin-Painting framework that involves meaningful collaborations between inpainting and outpainting.
The proposed algorithm can be effectively extended to other panoramic vision tasks, such as object detection, depth estimation, and image super-resolution.
arXiv Detail & Related papers (2022-04-18T21:18:49Z) - Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush, dip style from anywhere, and then paint to any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z) - Neural Re-Rendering of Humans from a Single Image [80.53438609047896]
We propose a new method for neural re-rendering of a human under a novel user-defined pose and viewpoint.
Our algorithm represents body pose and shape as a parametric mesh which can be reconstructed from a single image.
arXiv Detail & Related papers (2021-01-11T18:53:47Z) - Stylized Neural Painting [0.0]
This paper proposes an image-to-painting translation method that generates vivid and realistic painting artworks with controllable styles.
Experiments show that the paintings generated by our method have a high degree of fidelity in both global appearance and local textures.
arXiv Detail & Related papers (2020-11-16T17:24:21Z)
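The distance-transform idea behind the abstract's structure-preserving loss can be illustrated with a non-differentiable reference version: compare edge maps of the input and the stylized result via Euclidean distance transforms. The paper's loss is a differentiable variant of this; everything below (`edge_map`, `dt_loss`, the toy edge detector) is an illustrative stand-in, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def edge_map(img, thresh=0.25):
    """Toy edge detector: threshold on gradient magnitude."""
    gy, gx = np.gradient(img)
    return (np.hypot(gx, gy) > thresh).astype(np.uint8)

def dt_loss(edges_a, edges_b):
    """Symmetric chamfer-style distance between two binary edge maps.
    distance_transform_edt(1 - edges) gives, at each pixel, the
    Euclidean distance to the nearest edge pixel."""
    da = distance_transform_edt(1 - edges_a)
    db = distance_transform_edt(1 - edges_b)
    # penalize edges in one map that lie far from edges in the other
    return (da * edges_b).mean() + (db * edges_a).mean()

img = np.zeros((32, 32)); img[16, :] = 1.0           # horizontal line
shifted = np.zeros((32, 32)); shifted[18, :] = 1.0   # same line, shifted down
loss = dt_loss(edge_map(img), edge_map(shifted))
print(loss)
```

The loss is zero when the two edge maps coincide and grows as structure drifts, which is why a differentiable version of it can keep strokes aligned with the input image's structure during stylization.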