Sketch2Cloth: Sketch-based 3D Garment Generation with Unsigned Distance Fields
- URL: http://arxiv.org/abs/2303.00167v1
- Date: Wed, 1 Mar 2023 01:45:28 GMT
- Title: Sketch2Cloth: Sketch-based 3D Garment Generation with Unsigned Distance Fields
- Authors: Yi He, Haoran Xie and Kazunori Miyata
- Abstract summary: We propose Sketch2Cloth, a sketch-based 3D garment generation system using the unsigned distance fields from the user's sketch input.
Sketch2Cloth first estimates the unsigned distance function of the target 3D model from the sketch input, and extracts the mesh from the estimated field with Marching Cubes.
We also provide a model editing function to modify the generated mesh.
- Score: 12.013968508918634
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: 3D model reconstruction from a single image has made great progress with recent deep generative models. However, conventional reconstruction approaches based on template mesh deformation or implicit fields have difficulty reconstructing non-watertight 3D mesh models such as garments. In contrast to image-based modeling, the sketch-based approach helps users generate 3D models that meet their design intentions from hand-drawn sketches. In this study, we propose Sketch2Cloth, a sketch-based 3D garment generation system that uses unsigned distance fields estimated from the user's sketch input. Sketch2Cloth first estimates the unsigned distance function of the target 3D model from the sketch input, then extracts a mesh from the estimated field with Marching Cubes. We also provide a model editing function to modify the generated mesh. We verified Sketch2Cloth with quantitative evaluations of garment generation and editing against a state-of-the-art approach.
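To make the described pipeline concrete, here is a minimal, hypothetical sketch of the UDF-to-mesh step. The callable `udf_net` and its latent-code interface are placeholder assumptions, not the paper's actual API, and a plain Marching Cubes call at a small positive iso-level stands in for whatever extraction procedure the authors use.

```python
# Hypothetical sketch of the UDF-to-mesh step described in the abstract.
# `udf_net` is a placeholder for a trained network mapping (points, latent)
# to unsigned distances; it is not the paper's actual interface.
import numpy as np
from skimage.measure import marching_cubes

def extract_garment_mesh(udf_net, latent, resolution=128, iso=0.005, bound=1.0):
    # Sample the predicted unsigned distance field on a dense grid
    # covering the normalized bounding box [-bound, bound]^3.
    xs = np.linspace(-bound, bound, resolution)
    grid = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1)
    udf = udf_net(grid.reshape(-1, 3), latent).reshape(
        resolution, resolution, resolution)
    # A UDF never changes sign, so Marching Cubes is run at a small positive
    # level; this yields a thin shell around the (possibly open) garment
    # surface rather than a watertight solid.
    verts, faces, _normals, _values = marching_cubes(udf, level=iso)
    # Map voxel indices back to world coordinates.
    verts = verts / (resolution - 1) * 2.0 * bound - bound
    return verts, faces
```

Note that extracting an unsigned field at a small positive level produces a two-sided shell; some post-processing (or a dedicated open-surface extractor) would be needed to recover a single-layer garment mesh.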
Related papers
- Sketch3D: Style-Consistent Guidance for Sketch-to-3D Generation [55.73399465968594]
This paper proposes a novel generation paradigm Sketch3D to generate realistic 3D assets with shape aligned with the input sketch and color matching the textual description.
Three strategies are designed to optimize the 3D Gaussians: structural optimization via a distribution transfer mechanism, color optimization with a straightforward MSE loss, and sketch similarity optimization with a CLIP-based geometric similarity loss (a toy combination of the last two terms is sketched below).
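As a rough illustration only: the snippet below wires together the color MSE term and a CLIP-style similarity term under invented names; the embeddings and the structural distribution-transfer term are assumed to come from elsewhere and are not shown.

```python
# Toy combination of two of Sketch3D's named loss terms; all tensors and
# weights here are illustrative placeholders, not the paper's implementation.
import torch.nn.functional as F

def sketch3d_style_loss(rendered_rgb, target_rgb, render_clip_emb,
                        sketch_clip_emb, w_color=1.0, w_sketch=0.1):
    # Color optimization: straightforward MSE between rendered views of the
    # 3D Gaussians and the target images.
    color_loss = F.mse_loss(rendered_rgb, target_rgb)
    # Sketch similarity: cosine distance between CLIP embeddings of a
    # rendered edge map and the input sketch, standing in for the paper's
    # CLIP-based geometric similarity loss.
    sketch_loss = 1.0 - F.cosine_similarity(
        render_clip_emb, sketch_clip_emb, dim=-1).mean()
    return w_color * color_loss + w_sketch * sketch_loss
```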
arXiv Detail & Related papers (2024-04-02T11:03:24Z)
- Doodle Your 3D: From Abstract Freehand Sketches to Precise 3D Shapes [118.406721663244]
We introduce a novel part-level modelling and alignment framework that facilitates abstraction modelling and cross-modal correspondence.
Our approach seamlessly extends to sketch modelling by establishing correspondence between CLIPasso edgemaps and projected 3D part regions.
arXiv Detail & Related papers (2023-12-07T05:04:33Z)
- 3D VR Sketch Guided 3D Shape Prototyping and Exploration [108.6809158245037]
We propose a 3D shape generation network that takes a 3D VR sketch as a condition.
We assume that sketches are created by novices without art training.
Our method creates multiple 3D shapes that align with the original sketch's structure.
arXiv Detail & Related papers (2023-06-19T10:27:24Z)
- SENS: Part-Aware Sketch-based Implicit Neural Shape Modeling [124.3266213819203]
We present SENS, a novel method for generating and editing 3D models from hand-drawn sketches.
SENS analyzes the sketch and encodes its parts into ViT patch encodings (a minimal patch-encoding sketch follows this entry).
SENS supports refinement via part reconstruction, allowing for nuanced adjustments and artifact removal.
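For readers unfamiliar with ViT patch encoding, here is a minimal, generic sketch of the idea; the module and dimensions are illustrative and are not taken from SENS.

```python
# Generic ViT-style patch encoder; sizes are illustrative, not SENS's.
import torch.nn as nn

class PatchEncoder(nn.Module):
    def __init__(self, in_channels=1, patch_size=16, embed_dim=768):
        super().__init__()
        # A strided convolution splits the sketch into non-overlapping
        # patches and linearly projects each patch to a token embedding.
        self.proj = nn.Conv2d(in_channels, embed_dim,
                              kernel_size=patch_size, stride=patch_size)

    def forward(self, sketch):                    # (B, 1, 224, 224)
        tokens = self.proj(sketch)                # (B, 768, 14, 14)
        return tokens.flatten(2).transpose(1, 2)  # (B, 196, 768)
```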
arXiv Detail & Related papers (2023-06-09T17:50:53Z)
- Make Your Brief Stroke Real and Stereoscopic: 3D-Aware Simplified Sketch to Portrait Generation [51.64832538714455]
Existing studies only generate portraits in the 2D plane with fixed views, making the results less vivid.
In this paper, we present Stereoscopic Simplified Sketch-to-Portrait (SSSP), which explores the possibility of creating stereoscopic, 3D-aware portraits.
Our key insight is to design sketch-aware constraints that can fully exploit the prior knowledge of a tri-plane-based 3D-aware generative model.
arXiv Detail & Related papers (2023-02-14T06:28:42Z)
- TreeSketchNet: From Sketch To 3D Tree Parameters Generation [4.234843176066354]
3D modeling of non-linear objects from stylized sketches is a challenge even for experts in computer graphics.
We propose a broker system that mediates between the modeler and the 3D modelling software.
arXiv Detail & Related papers (2022-07-25T16:08:05Z)
- SingleSketch2Mesh: Generating 3D Mesh model from Sketch [1.6973426830397942]
Current methods to generate 3D models from sketches are either manual or tightly coupled with 3D modeling platforms.
We propose a novel AI-based ensemble approach, SingleSketch2Mesh, for generating 3D models from hand-drawn sketches.
arXiv Detail & Related papers (2022-03-07T06:30:36Z)
- Sketch2Model: View-Aware 3D Modeling from Single Free-Hand Sketches [4.781615891172263]
We investigate the problem of generating 3D meshes from single free-hand sketches, aiming at fast 3D modeling for novice users.
We address the importance of viewpoint specification for overcoming ambiguities, and propose a novel view-aware generation approach.
arXiv Detail & Related papers (2021-05-14T06:27:48Z)
- Sketch2Mesh: Reconstructing and Editing 3D Shapes from Sketches [65.96417928860039]
We use an encoder/decoder architecture for sketch-to-mesh translation.
We show that this approach is easy to deploy, robust to style changes, and effective.
arXiv Detail & Related papers (2021-04-01T14:10:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.