Inverse Painting: Reconstructing The Painting Process
- URL: http://arxiv.org/abs/2409.20556v2
- Date: Fri, 11 Oct 2024 18:57:36 GMT
- Title: Inverse Painting: Reconstructing The Painting Process
- Authors: Bowei Chen, Yifan Wang, Brian Curless, Ira Kemelmacher-Shlizerman, Steven M. Seitz
- Abstract summary: We formulate this as an autoregressive image generation problem, in which an initially blank "canvas" is iteratively updated.
The model learns from real artists by training on many painting videos.
- Score: 24.57538165449989
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Given an input painting, we reconstruct a time-lapse video of how it may have been painted. We formulate this as an autoregressive image generation problem, in which an initially blank "canvas" is iteratively updated. The model learns from real artists by training on many painting videos. Our approach incorporates text and region understanding to define a set of painting "instructions" and updates the canvas with a novel diffusion-based renderer. The method extrapolates beyond the limited, acrylic style paintings on which it has been trained, showing plausible results for a wide range of artistic styles and genres.
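The abstract's autoregressive formulation can be illustrated with a minimal sketch. This is not the authors' code: the planner and renderer below are hypothetical stand-ins (the paper uses text/region understanding to produce "instructions" and a diffusion-based renderer), and the canvas is reduced to a small integer grid painted one row-region per step.

```python
# Illustrative sketch of the autoregressive painting loop: start from a
# blank canvas and update one region per step until the target is reached.
# `next_instruction` and `render` are hypothetical placeholders for the
# paper's instruction-prediction and diffusion-based rendering modules.

def next_instruction(canvas, target):
    # Hypothetical planner: return the index of the first region (row)
    # of the canvas that still differs from the target painting.
    for i, (c_row, t_row) in enumerate(zip(canvas, target)):
        if c_row != t_row:
            return i
    return None  # canvas matches target; painting is finished

def render(canvas, target, region):
    # Hypothetical renderer: paint the chosen region from the target.
    canvas = [row[:] for row in canvas]
    canvas[region] = target[region][:]
    return canvas

def reconstruct(target, max_steps=100):
    # Initially blank canvas, iteratively updated (autoregressive loop).
    canvas = [[0] * len(target[0]) for _ in target]
    frames = [canvas]
    for _ in range(max_steps):
        region = next_instruction(canvas, target)
        if region is None:
            break
        canvas = render(canvas, target, region)
        frames.append(canvas)
    return frames  # the sequence of frames is the reconstructed time-lapse
```

The returned frame sequence corresponds to the time-lapse video the method produces; in the real system each update is a learned painting step rather than a verbatim copy.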
Related papers
- ProcessPainter: Learn Painting Process from Sequence Data [27.9875429986135]
The painting process of artists is inherently stepwise and varies significantly among different painters and styles.
Traditional stroke-based rendering methods break down images into sequences of brushstrokes, yet they fall short of replicating the authentic processes of artists.
We introduce ProcessPainter, a text-to-video model that is initially pre-trained on synthetic data and subsequently fine-tuned with a select set of artists' painting sequences.
arXiv Detail & Related papers (2024-06-10T07:18:41Z) - Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
arXiv Detail & Related papers (2023-09-07T06:27:39Z) - Reference-based Painterly Inpainting via Diffusion: Crossing the Wild Reference Domain Gap [80.19252970827552]
RefPaint is a novel task that crosses the wild reference domain gap and implants novel objects into artworks.
Our method enables creative painterly image inpainting with reference objects that would otherwise be difficult to achieve.
arXiv Detail & Related papers (2023-07-20T04:51:10Z) - Inversion-Based Style Transfer with Diffusion Models [78.93863016223858]
Previous arbitrary example-guided artistic image generation methods often fail to control shape changes or to convey the detailed elements of the example image.
We propose an inversion-based style transfer method (InST), which can efficiently and accurately learn the key information of an image.
arXiv Detail & Related papers (2022-11-23T18:44:25Z) - Perceptual Artifacts Localization for Inpainting [60.5659086595901]
We propose a new learning task of automatic segmentation of inpainting perceptual artifacts.
We train advanced segmentation networks on a dataset to reliably localize inpainting artifacts within inpainted images.
We also propose a new evaluation metric called Perceptual Artifact Ratio (PAR), which is the ratio of objectionable inpainted regions to the entire inpainted area.
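The PAR metric as summarized above is a simple area ratio; a minimal sketch (our own function and argument names, assuming binary masks for the artifact regions and the inpainted regions) could look like:

```python
def perceptual_artifact_ratio(artifact_mask, inpaint_mask):
    # PAR = (# pixels flagged as objectionable artifacts within the
    # inpainted region) / (# pixels in the entire inpainted region).
    # Both masks are 2D grids of 0/1 values of the same shape.
    inpainted = sum(sum(row) for row in inpaint_mask)
    artifacts = sum(
        sum(1 for a, m in zip(a_row, m_row) if a and m)
        for a_row, m_row in zip(artifact_mask, inpaint_mask)
    )
    return artifacts / inpainted if inpainted else 0.0
```

In the paper the artifact mask comes from the trained segmentation network; here it is passed in directly.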
arXiv Detail & Related papers (2022-08-05T18:50:51Z) - Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method acts as a brush: it can pick up ("dip") style from anywhere and paint it onto any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z) - Intelli-Paint: Towards Developing Human-like Painting Agents [19.261822105543175]
We propose a novel painting approach which learns to generate output canvases while exhibiting a more human-like painting style.
Intelli-Paint uses a progressive layering strategy that allows the agent to first paint a natural background scene representation before adding each foreground object in turn.
We also introduce a novel sequential brushstroke guidance strategy which helps the painting agent to shift its attention between different image regions in a semantic-aware manner.
arXiv Detail & Related papers (2021-12-16T14:56:32Z) - The Joy of Neural Painting [0.0]
We train a class of models that follows a GAN framework to generate brushstrokes, which are then composed to create paintings.
To overcome GANs' limitations and to speed up Neural Painter training, we applied transfer learning to the process, reducing training time from days to hours.
arXiv Detail & Related papers (2021-11-19T15:44:10Z) - In&Out : Diverse Image Outpainting via GAN Inversion [89.84841983778672]
Image outpainting seeks a semantically consistent extension of the input image beyond its available content.
In this work, we formulate the problem from the perspective of inverting generative adversarial networks.
Our generator renders micro-patches conditioned on their joint latent code as well as their individual positions in the image.
arXiv Detail & Related papers (2021-04-01T17:59:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.