Artistic Style in Robotic Painting; a Machine Learning Approach to
Learning Brushstroke from Human Artists
- URL: http://arxiv.org/abs/2007.03647v2
- Date: Tue, 28 Jul 2020 04:05:51 GMT
- Title: Artistic Style in Robotic Painting; a Machine Learning Approach to
Learning Brushstroke from Human Artists
- Authors: Ardavan Bidgoli, Manuel Ladron De Guevara, Cinnie Hsiung, Jean Oh,
Eunsu Kang
- Abstract summary: We propose a method to integrate an artistic style into the brushstrokes and the painting process through collaboration with a human artist.
In a preliminary study, 71% of human evaluators found that our reconstructed brushstrokes reflect the characteristics of the artist's style.
- Score: 7.906207218788341
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robotic painting has been a subject of interest among both artists and
roboticists since the 1970s. Researchers and interdisciplinary artists have
employed various painting techniques and human-robot collaboration models to
create visual mediums on canvas. One of the challenges of robotic painting is
to apply a desired artistic style to the painting. Style transfer techniques
with machine learning models have helped us address this challenge with the
visual style of a specific painting. However, other manual elements of style,
i.e., painting techniques and brushstrokes of an artist, have not been fully
addressed. We propose a method to integrate an artistic style into the
brushstrokes and the painting process through collaboration with a human
artist. In this paper, we describe our approach to 1) collect brushstrokes and
hand-brush motion samples from an artist, 2) train a generative model to
generate brushstrokes that pertain to the artist's style, and 3) fine-tune a
stroke-based rendering model to work with our robotic painting setup. We will
report on the integration of these three steps in a separate publication. In a
preliminary study, 71% of human evaluators found that our reconstructed
brushstrokes reflect the characteristics of the artist's style. Moreover, 58% of
participants could not distinguish a painting made by our method from a
visually similar painting created by a human artist.
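The abstract does not say how the generative model in step 2 is implemented. As a purely illustrative sketch, the example below assumes that each captured brushstroke is stored as a fixed-length sequence of (x, y, pressure) samples and that a small variational autoencoder serves as the generative model; both the data format and the architecture are assumptions made for this example, not details confirmed by the paper.

```python
# Illustrative sketch only: a small variational autoencoder over brushstroke
# trajectories. The sequence encoding (x, y, pressure) and the VAE choice are
# assumptions for this example, not details taken from the paper.
import torch
import torch.nn as nn

SEQ_LEN = 64    # assumed number of samples per recorded brushstroke
FEATURES = 3    # assumed channels per sample: x, y, pressure
LATENT = 16

class StrokeVAE(nn.Module):
    def __init__(self):
        super().__init__()
        flat = SEQ_LEN * FEATURES
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(flat, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, LATENT)
        self.to_logvar = nn.Linear(256, LATENT)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT, 256), nn.ReLU(), nn.Linear(256, flat)
        )

    def forward(self, strokes):
        h = self.encoder(strokes)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample a latent code from N(mu, sigma^2).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        recon = self.decoder(z).view(-1, SEQ_LEN, FEATURES)
        return recon, mu, logvar

def vae_loss(recon, strokes, mu, logvar):
    recon_term = ((recon - strokes) ** 2).mean()
    kl_term = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_term + 1e-3 * kl_term

# Toy training step with random data standing in for captured artist strokes.
model = StrokeVAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
batch = torch.randn(32, SEQ_LEN, FEATURES)   # placeholder for real samples
optimizer.zero_grad()
recon, mu, logvar = model(batch)
loss = vae_loss(recon, batch, mu, logvar)
loss.backward()
optimizer.step()
```

After training on an artist's captured strokes, sampling latent codes from the prior and decoding them would yield new stroke trajectories in the style represented by the training data.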
Related papers
- Inverse Painting: Reconstructing The Painting Process [24.57538165449989]
We formulate this as an autoregressive image generation problem, in which an initially blank "canvas" is iteratively updated.
The model learns from real artists by training on many painting videos.
arXiv Detail & Related papers (2024-09-30T17:56:52Z)
- ProcessPainter: Learn Painting Process from Sequence Data [27.9875429986135]
The painting process of artists is inherently stepwise and varies significantly among different painters and styles.
Traditional stroke-based rendering methods break down images into sequences of brushstrokes, yet they fall short of replicating the authentic processes of artists.
We introduce ProcessPainter, a text-to-video model that is initially pre-trained on synthetic data and subsequently fine-tuned with a select set of artists' painting sequences.
arXiv Detail & Related papers (2024-06-10T07:18:41Z)
- Rethinking Artistic Copyright Infringements in the Era of Text-to-Image Generative Models [47.19481598385283]
ArtSavant is a tool to determine the unique style of an artist by comparing it to a reference dataset of works from WikiArt.
We then perform a large-scale empirical study to provide quantitative insight on the prevalence of artistic style copying across 3 popular text-to-image generative models.
arXiv Detail & Related papers (2024-04-11T17:59:43Z)
- Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes (a minimal illustrative sketch appears after this list).
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
arXiv Detail & Related papers (2023-09-07T06:27:39Z)
- Interactive Neural Painting [66.9376011879115]
This paper proposes the first approach for Interactive Neural Painting (NP).
We propose I-Paint, a novel method based on a conditional transformer Variational AutoEncoder (VAE) architecture with a two-stage decoder.
Our experiments show that our approach provides good stroke suggestions and compares favorably to the state of the art.
arXiv Detail & Related papers (2023-07-31T07:02:00Z)
- Inventing art styles with no artistic training data [0.65268245109828]
We propose two procedures to create painting styles using models trained only on natural images.
In the first procedure we use the inductive bias from the artistic medium to achieve creative expression.
The second procedure uses an additional natural image as inspiration to create a new style.
arXiv Detail & Related papers (2023-05-19T21:59:23Z)
- Learning to Evaluate the Artness of AI-generated Images [64.48229009396186]
ArtScore is a metric designed to evaluate the degree to which an image resembles authentic artworks by artists.
We employ pre-trained models for photo and artwork generation, resulting in a series of mixed models.
This dataset is then employed to train a neural network that learns to estimate quantized artness levels of arbitrary images.
arXiv Detail & Related papers (2023-05-08T17:58:27Z)
- Inversion-Based Style Transfer with Diffusion Models [78.93863016223858]
Previous arbitrary example-guided artistic image generation methods often fail to control shape changes or convey elements.
We propose an inversion-based style transfer method (InST), which can efficiently and accurately learn the key information of an image.
arXiv Detail & Related papers (2022-11-23T18:44:25Z)
- Art Creation with Multi-Conditional StyleGANs [81.72047414190482]
A human artist needs a combination of unique skills, understanding, and genuine intention to create artworks that evoke deep feelings and emotions.
We introduce a multi-conditional Generative Adversarial Network (GAN) approach trained on large amounts of human paintings to synthesize realistic-looking paintings that emulate human art.
arXiv Detail & Related papers (2022-02-23T20:45:41Z)
- Intelli-Paint: Towards Developing Human-like Painting Agents [19.261822105543175]
We propose a novel painting approach which learns to generate output canvases while exhibiting a more human-like painting style.
Intelli-Paint consists of 1) a progressive layering strategy which allows the agent to first paint a natural background scene representation before adding in each of the foreground objects in a progressive fashion.
We also introduce a novel sequential brushstroke guidance strategy which helps the painting agent to shift its attention between different image regions in a semantic-aware manner.
arXiv Detail & Related papers (2021-12-16T14:56:32Z)
- Identifying centres of interest in paintings using alignment and edge detection: Case studies on works by Luc Tuymans [1.8855270809505869]
We set the first preliminary steps to algorithmically deconstruct some of the transformations that an artist applies to an original image in order to establish centres of interest.
We introduce a comparative methodology that first cuts out the minimal segment from the original image on which the painting is based, then aligns the painting with this source, investigates micro-differences to identify centres of interest and attempts to understand their role.
arXiv Detail & Related papers (2021-01-04T10:04:19Z)
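Step 3 of the paper above, as well as several entries in this list (e.g., Compositional Neural Painter and ProcessPainter), builds on stroke-based rendering: recreating an image as a set of strokes. The toy sketch below illustrates only that general idea: it renders soft Gaussian "strokes" differentiably and fits their parameters to a target image by gradient descent. The blob parameterization, the additive compositing, and the plain L2 loss are simplifying assumptions for this illustration, not the method of any paper listed here.

```python
# Illustrative sketch only: a toy differentiable stroke-based renderer that
# recreates a target image with soft Gaussian "strokes" optimized by gradient
# descent. This is a generic illustration of stroke-based rendering, not the
# method of any specific paper above.
import torch

H = W = 64
N_STROKES = 32

# Target image placeholder (random noise stands in for a real painting/photo).
target = torch.rand(H, W)

# Learnable stroke parameters per stroke: center x, center y, size, intensity.
params = torch.rand(N_STROKES, 4, requires_grad=True)

ys, xs = torch.meshgrid(
    torch.linspace(0, 1, H), torch.linspace(0, 1, W), indexing="ij"
)

def render(params):
    cx, cy, size, intensity = params[:, 0], params[:, 1], params[:, 2], params[:, 3]
    # Squared distance of every pixel to every stroke center.
    d2 = (xs[None] - cx[:, None, None]) ** 2 + (ys[None] - cy[:, None, None]) ** 2
    blobs = torch.sigmoid(intensity)[:, None, None] * torch.exp(
        -d2 / (1e-3 + 0.05 * torch.sigmoid(size)[:, None, None])
    )
    # Composite strokes additively and clamp to a valid intensity range.
    return blobs.sum(dim=0).clamp(0, 1)

optimizer = torch.optim.Adam([params], lr=5e-2)
for step in range(200):
    optimizer.zero_grad()
    canvas = render(params)
    loss = ((canvas - target) ** 2).mean()   # simple pixel-wise L2 loss
    loss.backward()
    optimizer.step()
```

Real stroke-based renderers typically use richer stroke primitives (e.g., curves with width and color) and a differentiable rasterizer or learned neural renderer rather than Gaussian blobs; the sketch only shows why differentiability lets stroke parameters be optimized end to end.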