Differentiable Drawing and Sketching
- URL: http://arxiv.org/abs/2103.16194v1
- Date: Tue, 30 Mar 2021 09:25:55 GMT
- Title: Differentiable Drawing and Sketching
- Authors: Daniela Mihai and Jonathon Hare
- Abstract summary: We present a differentiable relaxation of the process of drawing points, lines and curves into a pixel raster.
This relaxation allows end-to-end differentiable programs and deep networks to be learned and optimised.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a bottom-up differentiable relaxation of the process of drawing
points, lines and curves into a pixel raster. Our approach arises from the
observation that rasterising a pixel in an image given parameters of a
primitive can be reformulated in terms of the primitive's distance transform,
and then relaxed to allow the primitive's parameters to be learned. This
relaxation allows end-to-end differentiable programs and deep networks to be
learned and optimised and provides several building blocks that allow control
over how a compositional drawing process is modelled. We emphasise the
bottom-up nature of our proposed approach, which allows for drawing operations
to be composed in ways that can mimic the physical reality of drawing rather
than being tied to, for example, approaches in modern computer graphics. With
the proposed approach we demonstrate how sketches can be generated by directly
optimising against photographs and how auto-encoders can be built to transform
rasterised handwritten digits into vectors without supervision. Extensive
experimental results highlight the power of this approach under different
modelling assumptions for drawing tasks.
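As a rough illustration of the core idea in the abstract (not the authors' implementation), the sketch below rasterises a line segment by evaluating its distance transform at each pixel centre and mapping distance to intensity with a smooth Gaussian-style falloff exp(-d²/σ²). The function names and the σ² value are illustrative assumptions; the paper's actual relaxation functions and compositing operators may differ.

```python
import math

def point_segment_sq_dist(px, py, x0, y0, x1, y1):
    # Squared Euclidean distance from pixel centre (px, py) to the
    # segment (x0, y0)-(x1, y1): project onto the segment, clamp t to [0, 1].
    dx, dy = x1 - x0, y1 - y0
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        t = 0.0  # degenerate segment: treat as a point
    else:
        t = max(0.0, min(1.0, ((px - x0) * dx + (py - y0) * dy) / seg_len2))
    cx, cy = x0 + t * dx, y0 + t * dy
    return (px - cx) ** 2 + (py - cy) ** 2

def rasterise_segment(w, h, x0, y0, x1, y1, sigma2=0.5):
    # Relaxed rasterisation: intensity decays smoothly with the squared
    # distance transform, so every pixel value varies continuously with the
    # segment endpoints and gradients w.r.t. (x0, y0, x1, y1) exist everywhere.
    return [[math.exp(-point_segment_sq_dist(px + 0.5, py + 0.5,
                                             x0, y0, x1, y1) / sigma2)
             for px in range(w)] for py in range(h)]

img = rasterise_segment(8, 8, 1.0, 1.0, 7.0, 7.0)
```

Because the map from parameters to pixels is smooth, plugging such a rasteriser into an autodiff framework would let the endpoint coordinates be optimised directly against an image loss, which is the mechanism the abstract describes.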
Related papers
- Rasterized Edge Gradients: Handling Discontinuities Differentiably [25.85191317712521]
We present a novel method for computing gradients at discontinuities for rendering approximations.
Our method elegantly simplifies the traditionally complex problem through a carefully designed approximation strategy.
We showcase our method in human head scene reconstruction, demonstrating handling of camera images and segmentation masks.
arXiv Detail & Related papers (2024-05-03T22:42:00Z)
- Sketch-guided Image Inpainting with Partial Discrete Diffusion Process [5.005162730122933]
We introduce a novel partial discrete diffusion process (PDDP) for sketch-guided inpainting.
PDDP corrupts the masked regions of the image and reconstructs these masked regions conditioned on hand-drawn sketches.
A novel transformer module models the reverse diffusion process, accepting two inputs: the image containing the masked region to be inpainted and the query sketch.
arXiv Detail & Related papers (2024-04-18T07:07:38Z)
- CustomSketching: Sketch Concept Extraction for Sketch-based Image Synthesis and Editing [21.12815542848095]
Personalization techniques for large text-to-image (T2I) models allow users to incorporate new concepts from reference images.
Existing methods primarily rely on textual descriptions, leading to limited control over customized images.
We identify sketches as an intuitive and versatile representation that can facilitate such control.
arXiv Detail & Related papers (2024-02-27T15:52:59Z)
- Image Inpainting via Tractable Steering of Diffusion Models [54.13818673257381]
This paper proposes to exploit the ability of Tractable Probabilistic Models (TPMs) to exactly and efficiently compute the constrained posterior.
Specifically, this paper adopts a class of expressive TPMs termed Probabilistic Circuits (PCs).
We show that our approach can consistently improve the overall quality and semantic coherence of inpainted images with only 10% additional computational overhead.
arXiv Detail & Related papers (2023-11-28T21:14:02Z)
- RISP: Rendering-Invariant State Predictor with Differentiable Simulation and Rendering for Cross-Domain Parameter Estimation [110.4255414234771]
Existing solutions require massive training data or lack generalizability to unknown rendering configurations.
We propose a novel approach that marries domain randomization and differentiable rendering gradients to address this problem.
Our approach achieves significantly lower reconstruction errors and has better generalizability among unknown rendering configurations.
arXiv Detail & Related papers (2022-05-11T17:59:51Z)
- Unsupervised Discovery of Disentangled Manifolds in GANs [74.24771216154105]
An interpretable generation process is beneficial to various image editing applications.
We propose a framework to discover interpretable directions in the latent space given arbitrary pre-trained generative adversarial networks.
arXiv Detail & Related papers (2020-11-24T02:18:08Z)
- Stylized Neural Painting [0.0]
This paper proposes an image-to-painting translation method that generates vivid and realistic painting artworks with controllable styles.
Experiments show that the paintings generated by our method have a high degree of fidelity in both global appearance and local textures.
arXiv Detail & Related papers (2020-11-16T17:24:21Z)
- Learning to Caricature via Semantic Shape Transform [95.25116681761142]
We propose an algorithm based on a semantic shape transform to produce shape exaggerations.
We show that the proposed framework is able to render visually pleasing shape exaggerations while maintaining their facial structures.
arXiv Detail & Related papers (2020-08-12T03:41:49Z)
- CoSE: Compositional Stroke Embeddings [52.529172734044664]
We present a generative model for complex free-form structures such as stroke-based drawing tasks.
Our approach is suitable for interactive use cases such as auto-completing diagrams.
arXiv Detail & Related papers (2020-06-17T15:22:54Z)
- Deep Plastic Surgery: Robust and Controllable Image Editing with Human-Drawn Sketches [133.01690754567252]
Sketch-based image editing aims to synthesize and modify photos based on the structural information provided by the human-drawn sketches.
Deep Plastic Surgery is a novel, robust and controllable image editing framework that allows users to interactively edit images using hand-drawn sketch inputs.
arXiv Detail & Related papers (2020-01-09T08:57:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.