Collaborative Neural Painting
- URL: http://arxiv.org/abs/2312.01800v1
- Date: Mon, 4 Dec 2023 10:45:12 GMT
- Title: Collaborative Neural Painting
- Authors: Nicola Dall'Asen, Willi Menapace, Elia Peruzzo, Enver Sangineto,
Yiming Wang, Elisa Ricci
- Abstract summary: We introduce a novel task, Collaborative Neural Painting (CNP), to facilitate collaborative art painting generation between humans and machines.
CNP should produce a sequence of strokes supporting the completion of a coherent painting.
We propose a painting representation based on a sequence of parametrized strokes, which facilitates both editing and composition operations.
- Score: 27.880814775833578
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The process of painting fosters creativity and rational planning. However,
existing generative AI mostly focuses on producing visually pleasant artworks,
without emphasizing the painting process. We introduce a novel task,
Collaborative Neural Painting (CNP), to facilitate collaborative art painting
generation between humans and machines. Given any number of user-input
brushstrokes as the context or just the desired object class, CNP should
produce a sequence of strokes supporting the completion of a coherent painting.
Importantly, the process can be gradual and iterative, allowing users to make
modifications at any phase until completion. Moreover, we propose to solve
this task using a painting representation based on a sequence of parametrized
strokes, which facilitates both editing and composition operations. These
parametrized strokes are processed by a Transformer-based architecture with a
novel attention mechanism to model the relationship between the input strokes
and the strokes to complete. We also propose a new masking scheme to reflect
the interactive nature of CNP and adopt diffusion models as the basic learning
process for its effectiveness and diversity in the generative field. Finally,
to develop and validate methods on the novel task, we introduce a new dataset
of painted objects and an evaluation protocol to benchmark CNP both
quantitatively and qualitatively. We demonstrate the effectiveness of our
approach and the potential of the CNP task as a promising avenue for future
research.
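The abstract describes a painting as a sequence of parametrized strokes, with a masking scheme that marks which strokes are user-provided context and which the model must complete. The paper's exact parametrization and masking are not given here; the following is only an illustrative sketch that assumes each stroke is a fixed-length vector (position, size, rotation, RGB color) and that any subset of strokes may serve as context:

```python
import numpy as np

# Hypothetical stroke parametrization: (x, y, width, height, rotation,
# r, g, b). The actual CNP parametrization may differ; this sketch only
# illustrates the sequence-of-strokes idea from the abstract.
STROKE_DIM = 8

def random_strokes(n, rng):
    """Sample n placeholder strokes with parameters in [0, 1]."""
    return rng.random((n, STROKE_DIM))

def context_mask(n_strokes, n_context, rng):
    """Binary mask over the stroke sequence: True = user-provided
    context stroke, False = stroke the model must generate. Mirrors
    the interactive setting where any subset can be given as context."""
    mask = np.zeros(n_strokes, dtype=bool)
    idx = rng.choice(n_strokes, size=n_context, replace=False)
    mask[idx] = True
    return mask

rng = np.random.default_rng(0)
seq = random_strokes(16, rng)      # one painting as a stroke sequence
ctx = context_mask(16, 4, rng)     # 4 strokes supplied by the user
print(seq.shape, int(ctx.sum()))   # (16, 8) 4
```

In the paper's setup, a Transformer-based diffusion model would then denoise the non-context strokes while attending to the context strokes, but that architecture is beyond this sketch.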
Related papers
- AttentionPainter: An Efficient and Adaptive Stroke Predictor for Scene Painting [82.54770866332456]
Stroke-based Rendering (SBR) aims to decompose an input image into a sequence of parameterized strokes, which can be rendered into a painting that resembles the input image.
We propose AttentionPainter, an efficient and adaptive model for single-step neural painting.
arXiv Detail & Related papers (2024-10-21T18:36:45Z)
- Artistic Intelligence: A Diffusion-Based Framework for High-Fidelity Landscape Painting Synthesis [2.205829309604458]
LPGen is a novel diffusion-based model specifically designed for landscape painting generation.
LPGen introduces a decoupled cross-attention mechanism that independently processes structural and stylistic features.
The model is pre-trained on a curated dataset of high-resolution landscape images, categorized by distinct artistic styles, and then fine-tuned to ensure detailed and consistent output.
arXiv Detail & Related papers (2024-07-24T12:32:24Z)
- An Inpainting-Infused Pipeline for Attire and Background Replacement [0.0]
We explore an integrated approach that leverages advanced generative AI and computer vision techniques for image manipulation.
The methodology unfolds through several stages, including depth estimation and the generation and replacement of backgrounds.
Experiments conducted in this study underscore the methodology's efficacy, highlighting its potential to produce visually captivating content.
arXiv Detail & Related papers (2024-02-05T20:34:32Z)
- Image Inpainting via Tractable Steering of Diffusion Models [54.13818673257381]
This paper proposes to exploit the ability of Tractable Probabilistic Models (TPMs) to exactly and efficiently compute the constrained posterior.
Specifically, this paper adopts a class of expressive TPMs termed Probabilistic Circuits (PCs).
We show that our approach can consistently improve the overall quality and semantic coherence of inpainted images with only 10% additional computational overhead.
arXiv Detail & Related papers (2023-11-28T21:14:02Z)
- Interactive Neural Painting [66.9376011879115]
This paper proposes the first approach for Interactive Neural Painting (INP).
We propose I-Paint, a novel method based on a conditional transformer Variational AutoEncoder (VAE) architecture with a two-stage decoder.
Our experiments show that our approach provides good stroke suggestions and compares favorably to the state of the art.
arXiv Detail & Related papers (2023-07-31T07:02:00Z)
- Modeling Image Composition for Complex Scene Generation [77.10533862854706]
We present a method that achieves state-of-the-art results on layout-to-image generation tasks.
After compressing RGB images into patch tokens, we propose the Transformer with Focal Attention (TwFA) for exploring dependencies of object-to-object, object-to-patch and patch-to-patch.
arXiv Detail & Related papers (2022-06-02T08:34:25Z)
- Toward Modeling Creative Processes for Algorithmic Painting [12.602935529346063]
The paper argues that creative processes often involve two important components: vague, high-level goals and exploratory processes for discovering new ideas.
This paper sketches out possible computational mechanisms for imitating those elements of the painting process, including underspecified loss functions and iterative painting procedures with explicit task decompositions.
arXiv Detail & Related papers (2022-05-03T16:33:45Z)
- Modeling Artistic Workflows for Image Generation and Editing [83.43047077223947]
We propose a generative model that follows a given artistic workflow.
It enables both multi-stage image generation as well as multi-stage image editing of an existing piece of art.
arXiv Detail & Related papers (2020-07-14T17:54:26Z)
- Head2Head: Video-based Neural Head Synthesis [50.32988828989691]
We propose a novel machine learning architecture for facial reenactment.
We show that the proposed method can transfer facial expressions, pose and gaze of a source actor to a target video in a photo-realistic fashion more accurately than state-of-the-art methods.
arXiv Detail & Related papers (2020-05-22T00:44:43Z)
- Interactive Neural Style Transfer with Artists [6.130486652666935]
We present interactive painting processes in which a painter and various neural style transfer algorithms interact on a real canvas.
We gather a set of paired painting-pictures images and present a new evaluation methodology based on the predictivity of neural style transfer algorithms.
arXiv Detail & Related papers (2020-03-14T15:27:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.