End-to-End Chinese Landscape Painting Creation Using Generative
Adversarial Networks
- URL: http://arxiv.org/abs/2011.05552v1
- Date: Wed, 11 Nov 2020 05:20:42 GMT
- Title: End-to-End Chinese Landscape Painting Creation Using Generative
Adversarial Networks
- Authors: Alice Xue
- Abstract summary: We propose Sketch-And-Paint GAN (SAPGAN), the first model which generates Chinese landscape paintings from end to end, without conditional input.
SAPGAN is composed of two GANs: SketchGAN for generation of edge maps, and PaintGAN for subsequent edge-to-painting translation.
A 242-person Visual Turing Test study reveals that SAPGAN paintings are mistaken for human artwork with 55% frequency, significantly outperforming paintings from baseline GANs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current GAN-based art generation methods produce unoriginal artwork due to
their dependence on conditional input. Here, we propose Sketch-And-Paint GAN
(SAPGAN), the first model which generates Chinese landscape paintings from end
to end, without conditional input. SAPGAN is composed of two GANs: SketchGAN
for generation of edge maps, and PaintGAN for subsequent edge-to-painting
translation. Our model is trained on a new dataset of traditional Chinese
landscape paintings never before used for generative research. A 242-person
Visual Turing Test study reveals that SAPGAN paintings are mistaken for human
artwork with 55% frequency, significantly outperforming paintings from baseline
GANs. Our work lays the groundwork for truly machine-original art generation.
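The two-stage pipeline described in the abstract (SketchGAN for edge-map generation, PaintGAN for edge-to-painting translation) amounts to composing two generators so that only a noise vector is needed as input. The sketch below illustrates that staging with toy numpy stand-ins; the linear "networks", names, and sizes are placeholder assumptions, not the paper's actual architecture.

```python
import numpy as np

# Hypothetical stand-ins for the two trained generators. In SAPGAN,
# SketchGAN maps a latent vector to an edge map, and PaintGAN translates
# that edge map into a finished painting. The linear layers below are
# placeholders; the real models are deep GANs.

rng = np.random.default_rng(0)
LATENT_DIM, H, W = 64, 32, 32  # toy sizes, not the paper's resolution

W_sketch = rng.standard_normal((LATENT_DIM, H * W)) * 0.1
W_paint = rng.standard_normal((H * W, H * W * 3)) * 0.1

def sketch_gan(z):
    """Stage 1: latent vector -> single-channel edge map in [0, 1]."""
    edges = 1 / (1 + np.exp(-(z @ W_sketch)))  # sigmoid squashes to [0, 1]
    return edges.reshape(H, W)

def paint_gan(edge_map):
    """Stage 2: edge map -> 3-channel painting in [0, 1]."""
    painting = 1 / (1 + np.exp(-(edge_map.reshape(-1) @ W_paint)))
    return painting.reshape(H, W, 3)

def sapgan(z):
    """End to end: noise in, painting out, with no conditional input."""
    return paint_gan(sketch_gan(z))

painting = sapgan(rng.standard_normal(LATENT_DIM))
print(painting.shape)  # (32, 32, 3)
```

The key design point this mirrors is that the intermediate edge map is itself generated, so the second stage never requires a user-supplied sketch.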
Related papers
- DLP-GAN: learning to draw modern Chinese landscape photos with generative adversarial network [20.74857981451259]
Chinese landscape painting has a unique and artistic style; its drawing technique is highly abstract in both its use of color and its representation of objects.
Previous methods focus on transferring modern photos to ancient ink paintings, but little attention has been paid to translating landscape paintings into modern photos.
arXiv Detail & Related papers (2024-03-06T04:46:03Z)
- Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
arXiv Detail & Related papers (2023-09-07T06:27:39Z)
- Interactive Neural Painting [66.9376011879115]
This paper proposes the first approach for Interactive Neural Painting (NP).
We propose I-Paint, a novel method based on a conditional transformer Variational AutoEncoder (VAE) architecture with a two-stage decoder.
Our experiments show that our approach provides good stroke suggestions and compares favorably to the state of the art.
arXiv Detail & Related papers (2023-07-31T07:02:00Z)
- CCLAP: Controllable Chinese Landscape Painting Generation via Latent Diffusion Model [54.74470985388726]
We propose a controllable Chinese landscape painting generation method named CCLAP, based on a latent diffusion model.
Our method achieves state-of-the-art performance, especially in artful composition and artistic conception.
arXiv Detail & Related papers (2023-04-09T04:16:28Z)
- Art Creation with Multi-Conditional StyleGANs [81.72047414190482]
A human artist needs a combination of unique skills, understanding, and genuine intention to create artworks that evoke deep feelings and emotions.
We introduce a multi-conditional Generative Adversarial Network (GAN) approach trained on large amounts of human paintings to synthesize realistic-looking paintings that emulate human art.
arXiv Detail & Related papers (2022-02-23T20:45:41Z)
- The Joy of Neural Painting [0.0]
We train a class of models that follows a GAN framework to generate brushstrokes, which are then composed to create paintings.
To overcome GANs' limitations and to speed up Neural Painter training, we applied Transfer Learning to the process, reducing training time from days to only hours.
arXiv Detail & Related papers (2021-11-19T15:44:10Z)
- Continuation of Famous Art with AI: A Conditional Adversarial Network Inpainting Approach [1.713291434132985]
This work explores the application of image inpainting to continue famous artworks and produce generative art with a Conditional GAN.
An inpainting GAN is then tasked with learning to reconstruct the original image from the centre crop by way of minimising both adversarial and absolute difference losses.
Images are resized rather than cropped and presented as input to the generator.
Following the learning process, the generator then creates new images by continuing from the edges of the original piece.
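The training objective described above combines an adversarial term with an absolute-difference (L1) term against the original image. A minimal sketch of that combined generator loss follows; the function names and the weighting factor `lam` are illustrative assumptions on my part, not values taken from the paper.

```python
import numpy as np

def l1_loss(original, reconstructed):
    """Absolute-difference term: mean |original - reconstructed|."""
    return np.mean(np.abs(original - reconstructed))

def adversarial_loss(disc_score_on_fake):
    """Non-saturating generator term: -log D(G(x)), with a small
    epsilon for numerical stability."""
    return -np.log(disc_score_on_fake + 1e-8)

def generator_loss(original, reconstructed, disc_score, lam=100.0):
    """Combined objective: adversarial + lam * L1. The weight lam is a
    common pix2pix-style choice, assumed here rather than sourced."""
    return adversarial_loss(disc_score) + lam * l1_loss(original, reconstructed)
```

The L1 term anchors the reconstruction to the original pixels, while the adversarial term pushes the continuation toward plausible-looking imagery.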
arXiv Detail & Related papers (2021-10-18T10:39:32Z)
- A Framework and Dataset for Abstract Art Generation via CalligraphyGAN [0.0]
We present a creative framework based on Conditional Generative Adversarial Networks and a Contextual Neural Language Model to generate abstract artworks.
Our work is inspired by Chinese calligraphy, which is a unique form of visual art where the character itself is an aesthetic painting.
arXiv Detail & Related papers (2020-12-02T16:24:20Z)
- XingGAN for Person Image Generation [149.54517767056382]
We propose a novel Generative Adversarial Network (XingGAN) for person image generation tasks.
XingGAN consists of two generation branches that model the person's appearance and shape information.
We show that the proposed XingGAN advances the state of the art in terms of objective quantitative scores and subjective visual realness.
arXiv Detail & Related papers (2020-07-17T23:40:22Z)
- Sketch-Guided Scenery Image Outpainting [83.6612152173028]
We propose an encoder-decoder based network to conduct sketch-guided outpainting.
First, we apply a holistic alignment module to make the synthesized part similar to the real one from a global view.
Second, we reversely produce sketches from the synthesized part and encourage them to be consistent with the ground-truth ones.
arXiv Detail & Related papers (2020-06-17T11:34:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.