Conditional Human Sketch Synthesis with Explicit Abstraction Control
- URL: http://arxiv.org/abs/2306.09274v1
- Date: Thu, 15 Jun 2023 16:54:58 GMT
- Title: Conditional Human Sketch Synthesis with Explicit Abstraction Control
- Authors: Dar-Yen Chen
- Abstract summary: We present a novel free-hand sketch synthesis approach addressing explicit abstraction control in class-conditional and photo-to-sketch synthesis.
We propose two novel abstraction control mechanisms, state embeddings and the stroke token, integrated into a transformer-based latent diffusion model.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a novel free-hand sketch synthesis approach addressing
explicit abstraction control in class-conditional and photo-to-sketch
synthesis. Abstraction is a vital aspect of sketches, as it defines the
fundamental distinction between a sketch and an image. Previous works relied on
implicit control to achieve different levels of abstraction, leading to
inaccurate control and synthesized sketches deviating from human sketches. To
resolve this challenge, we propose two novel abstraction control mechanisms,
state embeddings and the stroke token, integrated into a transformer-based
latent diffusion model (LDM). These mechanisms explicitly provide the required
amount of points or strokes to the model, enabling accurate point-level and
stroke-level control in synthesized sketches while preserving recognizability.
Outperforming state-of-the-art approaches, our method effectively generates
diverse, non-rigid and human-like sketches. The proposed approach enables
coherent sketch synthesis and excels in representing human habits with desired
abstraction levels, highlighting the potential of sketch synthesis for
real-world applications.
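The abstract names two conditioning mechanisms but gives no implementation details. Below is a minimal, hypothetical PyTorch sketch of how a transformer denoiser might consume a state embedding (the requested point count, added to every point token) and a stroke token (a learned token carrying the requested stroke count). All module names, dimensions, and the stroke-5 point format are assumptions, and for simplicity the toy denoises point sequences directly rather than LDM latents as the paper does.

```python
import torch
import torch.nn as nn

class AbstractionConditionedDenoiser(nn.Module):
    """Hypothetical sketch: layer sizes, the point format, and the exact
    conditioning scheme are assumptions, not the paper's specification."""

    def __init__(self, d_model=256, n_classes=345, max_points=512, max_strokes=64):
        super().__init__()
        # Stroke-5 style points: (dx, dy, pen_down, pen_up, end).
        self.point_proj = nn.Linear(5, d_model)
        self.class_emb = nn.Embedding(n_classes, d_model)
        # State embedding: conditions every point on the requested point budget.
        self.point_count_emb = nn.Embedding(max_points + 1, d_model)
        # Stroke token: one learned token carrying the requested stroke count.
        self.stroke_count_emb = nn.Embedding(max_strokes + 1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=6)
        self.out = nn.Linear(d_model, 5)

    def forward(self, noisy_points, t_emb, class_id, n_points, n_strokes):
        # noisy_points: (B, L, 5); t_emb: (B, d_model) diffusion-step embedding.
        x = self.point_proj(noisy_points)                          # (B, L, d)
        x = x + self.point_count_emb(n_points).unsqueeze(1)        # broadcast over L
        stroke_tok = self.stroke_count_emb(n_strokes).unsqueeze(1)     # (B, 1, d)
        cond_tok = (self.class_emb(class_id) + t_emb).unsqueeze(1)     # (B, 1, d)
        h = self.backbone(torch.cat([cond_tok, stroke_tok, x], dim=1))
        return self.out(h[:, 2:])                 # predicted noise for points only
```

The design intent this illustrates: because the point and stroke budgets enter the model as explicit inputs rather than being inferred, the sampler can be asked for a sketch at an exact abstraction level.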
Related papers
- It's All About Your Sketch: Democratising Sketch Control in Diffusion Models [114.73766136068357]
This paper unravels the potential of sketches for diffusion models, addressing the deceptive promise of direct sketch control in generative AI.
Importantly, we democratise the process, enabling amateur sketches to generate precise images and living up to the commitment of "what you sketch is what you get".
arXiv Detail & Related papers (2024-03-12T01:05:25Z)
- How to Handle Sketch-Abstraction in Sketch-Based Image Retrieval? [120.49126407479717]
We propose a sketch-based image retrieval framework capable of handling sketch abstraction at varied levels.
For granularity-level abstraction understanding, we dictate that the retrieval model should not treat all abstraction levels equally.
Our Acc.@q loss uniquely allows a sketch to narrow or broaden its focus in terms of how stringent the evaluation should be (see the sketch after this entry).
arXiv Detail & Related papers (2024-03-11T23:08:29Z)
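The summary leaves the Acc.@q mechanism abstract. As a rough illustration only, here is one hypothetical way to relax retrieval accuracy-at-top-q into a differentiable loss in which the threshold q loosens or tightens the evaluation; the function name, the sigmoid relaxation, and the temperature are all assumptions, not the paper's formulation.

```python
import torch

def soft_acc_at_q(sketch_emb, photo_emb, q, temperature=0.1):
    """Hypothetical differentiable relaxation of accuracy-at-top-q.

    sketch_emb: (B, D) sketch embeddings; photo_emb: (B, D) paired photos.
    Photo i is treated as the true match for sketch i.
    """
    sim = sketch_emb @ photo_emb.t()                     # (B, B) similarities
    pos = sim.diag().unsqueeze(1)                        # (B, 1) true-match scores
    # Soft rank of the true match: a sigmoid-relaxed count of gallery
    # photos that score higher than the paired photo.
    soft_rank = torch.sigmoid((sim - pos) / temperature).sum(dim=1)
    # Soft indicator that the true match falls within the top q;
    # larger q makes the evaluation more lenient.
    acc_q = torch.sigmoid((q - soft_rank) / temperature)
    return 1.0 - acc_q.mean()                            # loss to minimize
```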
- CustomSketching: Sketch Concept Extraction for Sketch-based Image Synthesis and Editing [21.12815542848095]
Personalization techniques for large text-to-image (T2I) models allow users to incorporate new concepts from reference images.
Existing methods primarily rely on textual descriptions, leading to limited control over customized images.
We identify sketches as an intuitive and versatile representation that can facilitate such control.
arXiv Detail & Related papers (2024-02-27T15:52:59Z)
- DiffSketching: Sketch Control Image Synthesis with Diffusion Models [10.172753521953386]
Deep learning models for sketch-to-image synthesis must cope with distorted input sketches that lack visual detail.
Our model matches sketches through cross-domain constraints and uses a classifier to guide the image synthesis more accurately (a generic classifier-guidance sketch follows this entry).
Our model beats GAN-based methods in terms of generation quality and human evaluation, and does not rely on massive sketch-image datasets.
arXiv Detail & Related papers (2023-05-30T07:59:23Z)
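The mention of a classifier guiding synthesis suggests standard classifier guidance for diffusion models (Dhariwal & Nichol, 2021). The sketch below shows that generic technique, not DiffSketching's exact method; `classifier`, `scale`, and `sigma_t` are placeholders.

```python
import torch

def classifier_guided_eps(eps, x_t, y, classifier, scale, sigma_t):
    """Generic classifier guidance: shift the predicted noise by the
    gradient of the classifier's log-probability for the target class y.
    eps, x_t: (B, ...) noise prediction and noisy sample at step t;
    sigma_t stands in for sqrt(1 - alpha_bar_t) in DDPM notation."""
    with torch.enable_grad():
        x_in = x_t.detach().requires_grad_(True)
        log_probs = torch.log_softmax(classifier(x_in), dim=-1)
        selected = log_probs[torch.arange(len(y)), y].sum()
        grad = torch.autograd.grad(selected, x_in)[0]
    # Nudge the noise estimate so sampling drifts toward class y.
    return eps - scale * sigma_t * grad
```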
- Semantics-Preserving Sketch Embedding for Face Generation [26.15479367792076]
We introduce a novel W-W+ encoder architecture to take advantage of the high expressive power of the W+ space (a toy W/W+ encoder sketch follows this entry).
We also introduce an explicit intermediate representation for sketch semantic embedding.
A novel sketch semantic interpretation approach is designed to automatically extract semantics from vectorized sketches.
arXiv Detail & Related papers (2022-11-23T15:14:49Z)
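For readers unfamiliar with StyleGAN latent spaces: W is a single style vector shared by all generator layers, while W+ assigns one vector per layer, which is what makes it more expressive. The toy encoder below illustrates that distinction only; it is an assumption-laden sketch, not the paper's W-W+ architecture.

```python
import torch
import torch.nn as nn

class WPlusEncoder(nn.Module):
    """Hypothetical hybrid encoder: predicts a shared W code plus
    per-layer offsets, yielding a W+ code (one latent per layer)."""

    def __init__(self, feat_dim=512, n_layers=18, w_dim=512):
        super().__init__()
        self.to_w = nn.Linear(feat_dim, w_dim)                    # base W code
        self.to_offsets = nn.Linear(feat_dim, n_layers * w_dim)   # W+ deltas
        self.n_layers, self.w_dim = n_layers, w_dim

    def forward(self, feats):
        w = self.to_w(feats)                                      # (B, 512)
        deltas = self.to_offsets(feats).view(-1, self.n_layers, self.w_dim)
        # W+ code: the shared W vector specialized per generator layer.
        return w.unsqueeze(1) + deltas                            # (B, 18, 512)
```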
- I Know What You Draw: Learning Grasp Detection Conditioned on a Few Freehand Sketches [74.63313641583602]
We propose a method to generate a potential grasp configuration relevant to the sketch-depicted objects.
Our model is trained and tested in an end-to-end manner, making it easy to deploy in real-world applications.
arXiv Detail & Related papers (2022-05-09T04:23:36Z)
- Semantic View Synthesis [56.47999473206778]
We tackle a new problem of semantic view synthesis -- generating free-viewpoint rendering of a synthesized scene using a semantic label map as input.
First, we focus on synthesizing the color and depth of the visible surface of the 3D scene.
We then use the synthesized color and depth to impose explicit constraints on the multiple-plane image (MPI) representation prediction process (a generic MPI compositing sketch follows this entry).
arXiv Detail & Related papers (2020-08-24T17:59:46Z)
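An MPI represents a scene as a stack of fronto-parallel RGBA planes that are alpha-composited to render a view. The sketch below shows standard "over" compositing of planes ordered near-to-far; it is illustrative only and not the paper's constraint mechanism.

```python
import torch

def composite_mpi(colors, alphas):
    """Generic 'over' compositing of MPI planes ordered near-to-far.

    colors: (D, 3, H, W) per-plane RGB; alphas: (D, 1, H, W) opacity in [0, 1].
    """
    # Transmittance: fraction of light surviving all planes nearer than plane d.
    ones = torch.ones_like(alphas[:1])
    trans = torch.cumprod(torch.cat([ones, 1.0 - alphas[:-1]], dim=0), dim=0)
    weights = alphas * trans                  # each plane's contribution
    return (weights * colors).sum(dim=0)      # (3, H, W) rendered image
```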
- Example-Guided Image Synthesis across Arbitrary Scenes using Masked Spatial-Channel Attention and Self-Supervision [83.33283892171562]
Example-guided image synthesis has recently been proposed to synthesize an image from a semantic label map and an exemplary image.
In this paper, we tackle a more challenging and general task, where the exemplar is an arbitrary scene image that is semantically different from the given label map.
We propose an end-to-end network for joint global and local feature alignment and synthesis.
arXiv Detail & Related papers (2020-04-18T18:17:40Z)
- Deep Plastic Surgery: Robust and Controllable Image Editing with Human-Drawn Sketches [133.01690754567252]
Sketch-based image editing aims to synthesize and modify photos based on the structural information provided by the human-drawn sketches.
Deep Plastic Surgery is a novel, robust and controllable image editing framework that allows users to interactively edit images using hand-drawn sketch inputs.
arXiv Detail & Related papers (2020-01-09T08:57:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.