Reconstruction of Manipulated Garment with Guided Deformation Prior
- URL: http://arxiv.org/abs/2405.10934v2
- Date: Sun, 13 Oct 2024 12:43:10 GMT
- Title: Reconstruction of Manipulated Garment with Guided Deformation Prior
- Authors: Ren Li, Corentin Dumery, Zhantao Deng, Pascal Fua
- Abstract summary: We leverage the implicit sewing patterns (ISP) model for garment modeling and extend it by adding a diffusion-based deformation prior to represent shapes.
To recover 3D garment shapes from incomplete 3D point clouds acquired when the garment is folded, we map the points to UV space, in which our priors are learned, to produce partial UV maps, and then fit the priors to recover complete UV maps and 2D to 3D mappings.
- Abstract: Modeling the shape of garments has received much attention, but most existing approaches assume the garments to be worn by someone, which constrains the range of shapes they can assume. In this work, we address shape recovery when garments are being manipulated instead of worn, which gives rise to an even larger range of possible shapes. To this end, we leverage the implicit sewing patterns (ISP) model for garment modeling and extend it by adding a diffusion-based deformation prior to represent these shapes. To recover 3D garment shapes from incomplete 3D point clouds acquired when the garment is folded, we map the points to UV space, in which our priors are learned, to produce partial UV maps, and then fit the priors to recover complete UV maps and 2D to 3D mappings. Experimental results demonstrate the superior reconstruction accuracy of our method compared to previous ones, especially when dealing with large non-rigid deformations arising from the manipulations.
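To make the fitting step concrete, here is a minimal sketch of completing a partial UV map with a diffusion prior. The denoiser network, the linear noise schedule, and the inpainting-style "replacement" guidance are assumptions for illustration; the paper's exact guidance scheme may differ.

```python
import torch

@torch.no_grad()
def complete_uv_map(denoiser, partial_uv, mask, steps=1000):
    """partial_uv: (1, 3, H, W) 3D positions in UV space (zeros where unseen).
    mask: (1, 1, H, W), 1 where a projected point was observed."""
    betas = torch.linspace(1e-4, 0.02, steps)      # linear schedule (assumed)
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)

    x = torch.randn_like(partial_uv)               # start from pure noise
    for t in reversed(range(steps)):
        eps = denoiser(x, torch.tensor([t]))       # predict noise at step t (assumed signature)
        coef = betas[t] / torch.sqrt(1.0 - alpha_bar[t])
        mean = (x - coef * eps) / torch.sqrt(alphas[t])
        x = mean + torch.sqrt(betas[t]) * torch.randn_like(x) if t > 0 else mean
        # Guidance: re-inject the observed UV pixels at the matching noise
        # level, so the completed map stays consistent with the point cloud.
        if t > 0:
            noisy_obs = torch.sqrt(alpha_bar[t - 1]) * partial_uv \
                + torch.sqrt(1.0 - alpha_bar[t - 1]) * torch.randn_like(partial_uv)
            x = mask * noisy_obs + (1.0 - mask) * x
    return x                                       # completed UV map
```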
Related papers
- SPnet: Estimating Garment Sewing Patterns from a Single Image [10.604555099281173]
This paper presents a novel method for reconstructing 3D garment models from a single image of a posed user.
By inferring the fundamental shape of the garment through sewing patterns estimated from a single image, we can generate 3D garments that adaptively deform to arbitrary poses.
arXiv Detail & Related papers (2023-12-26T09:51:25Z) - Nuvo: Neural UV Mapping for Unruly 3D Representations [61.87715912587394]
Existing UV mapping algorithms are designed to operate on well-behaved meshes, rather than the geometry produced by state-of-the-art 3D reconstruction and generation techniques.
We present a UV mapping method designed to operate on geometry produced by 3D reconstruction and generation techniques.
arXiv Detail & Related papers (2023-12-11T18:58:38Z) - Garment Recovery with Shape and Deformation Priors [51.41962835642731]
We propose a method that delivers realistic garment models from real-world images, regardless of garment shape or deformation.
Not only does our approach recover the garment geometry accurately, it also yields models that can be directly used by downstream applications such as animation and simulation.
arXiv Detail & Related papers (2023-11-17T07:06:21Z) - ISP: Multi-Layered Garment Draping with Implicit Sewing Patterns [57.176642106425895]
We introduce a garment representation model that addresses limitations of current approaches.
It is faster and yields higher quality reconstructions than purely implicit surface representations.
It supports rapid editing of garment shapes and texture by modifying individual 2D panels.
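As a rough sketch of the implicit-sewing-pattern idea, the following hypothetical model pairs a 2D panel, represented as a signed distance field in UV space, with a network that lifts panel points to 3D. All names and sizes are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

def mlp(d_in, d_out, width=256, depth=4):
    layers, d = [], d_in
    for _ in range(depth - 1):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers.append(nn.Linear(d, d_out))
    return nn.Sequential(*layers)

class ImplicitPanel(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        self.sdf = mlp(2 + latent_dim, 1)   # (u, v, z) -> signed distance to panel boundary
        self.lift = mlp(2 + latent_dim, 3)  # (u, v, z) -> 3D position on the garment

    def forward(self, uv, z):
        x = torch.cat([uv, z.expand(uv.shape[0], -1)], dim=-1)
        return self.sdf(x), self.lift(x)

panel = ImplicitPanel()
uv = torch.rand(1024, 2)                    # sample the UV square
sdf, xyz = panel(uv, torch.randn(1, 32))
inside = uv[sdf.squeeze(-1) < 0]            # points that belong to the panel
```

Editing the garment then amounts to changing the 2D panel (e.g. a new latent code), while the same lifting network keeps the 2D-to-3D mapping consistent.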
arXiv Detail & Related papers (2023-05-23T14:23:48Z) - DrapeNet: Garment Generation and Self-Supervised Draping [95.0315186890655]
We rely on self-supervision to train a single network to drape multiple garments.
This is achieved by predicting a 3D deformation field conditioned on the latent codes of a generative network.
Our pipeline can generate and drape previously unseen garments of any topology.
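A minimal sketch of this idea, with assumed names and dimensions, could look as follows: a single network predicts a per-point deformation field conditioned on a garment latent code (plus body parameters in the actual system).

```python
import torch
import torch.nn as nn

class DeformationField(nn.Module):
    def __init__(self, latent_dim=32, body_dim=10, width=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + latent_dim + body_dim, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, 3),             # per-point displacement
        )

    def forward(self, pts, garment_z, body):
        cond = torch.cat([garment_z, body], dim=-1).expand(pts.shape[0], -1)
        return pts + self.net(torch.cat([pts, cond], dim=-1))  # draped points

field = DeformationField()
rest_pts = torch.rand(2048, 3)               # garment in its rest state
draped = field(rest_pts, torch.randn(1, 32), torch.zeros(1, 10))
```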
arXiv Detail & Related papers (2022-11-21T09:13:53Z) - Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments across diverse shapes and topologies, realistically reconstructing 3D garments from 2D images with their structure preserved, and accurately manipulating 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z) - xCloth: Extracting Template-free Textured 3D Clothes from a Monocular Image [4.056667956036515]
We present a novel framework for template-free textured 3D garment digitization.
More specifically, we propose to extend the PeeledHuman representation to predict pixel-aligned, layered depth and semantic maps.
We achieve high-fidelity 3D garment reconstruction results on three publicly available datasets and generalization on internet images.
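For illustration, a layered ("peeled") depth representation like the one described can be back-projected into a 3D point cloud with standard pinhole geometry; the intrinsics and map shapes below are assumptions.

```python
import numpy as np

def backproject_peeled_depth(depth_layers, fx, fy, cx, cy):
    """depth_layers: (L, H, W) depth per peel layer, 0 where empty."""
    L, H, W = depth_layers.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))   # pixel grid
    points = []
    for d in depth_layers:                           # one pass per layer
        valid = d > 0
        z = d[valid]
        x = (u[valid] - cx) * z / fx                 # pinhole back-projection
        y = (v[valid] - cy) * z / fy
        points.append(np.stack([x, y, z], axis=-1))
    return np.concatenate(points, axis=0)            # (N, 3) cloud

cloud = backproject_peeled_depth(np.random.rand(4, 64, 64), 60, 60, 32, 32)
```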
arXiv Detail & Related papers (2022-08-27T05:57:00Z) - NeuralTailor: Reconstructing Sewing Pattern Structures from 3D Point Clouds of Garments [7.331799534004012]
We propose to use garment sewing patterns to facilitate intrinsic garment shape estimation.
We introduce NeuralTailor, a novel architecture based on point-level attention for set regression with variable cardinality.
Our experiments show that NeuralTailor successfully reconstructs sewing patterns and generalizes to garment types with pattern topologies unseen during training.
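As an illustration of set regression with variable cardinality (not the authors' architecture), a DETR-style sketch uses learned queries that attend to per-point features; each query regresses one pattern panel plus an existence score, so the output set size can vary.

```python
import torch
import torch.nn as nn

class SetRegressor(nn.Module):
    def __init__(self, max_panels=12, d=128, panel_params=8):
        super().__init__()
        self.point_enc = nn.Linear(3, d)
        self.queries = nn.Parameter(torch.randn(max_panels, d))
        self.attn = nn.MultiheadAttention(d, num_heads=4, batch_first=True)
        self.panel_head = nn.Linear(d, panel_params)  # per-panel parameters
        self.exist_head = nn.Linear(d, 1)             # does this panel exist?

    def forward(self, pts):                           # pts: (B, N, 3)
        feats = self.point_enc(pts)
        q = self.queries.unsqueeze(0).expand(pts.shape[0], -1, -1)
        out, _ = self.attn(q, feats, feats)           # queries attend to points
        return self.panel_head(out), torch.sigmoid(self.exist_head(out))

model = SetRegressor()
panels, exist = model(torch.rand(2, 1024, 3))         # keep panels with exist > 0.5
```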
arXiv Detail & Related papers (2022-01-31T08:33:49Z) - Fully Convolutional Graph Neural Networks for Parametric Virtual Try-On [9.293488420613148]
We present a learning-based approach for virtual try-on applications based on a fully convolutional graph neural network.
In contrast to existing data-driven models, which are trained for a specific garment or mesh topology, our fully convolutional model can cope with a large family of garments.
Under the hood, our novel geometric deep learning approach learns to drape 3D garments by decoupling the three different sources of deformation that condition the fit of clothing.
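To illustrate why a fully convolutional graph network generalizes across mesh topologies, here is a bare-bones graph convolution over mesh edges predicting per-vertex updates with weights independent of vertex count; the layer and sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.self_lin = nn.Linear(d_in, d_out)
        self.neigh_lin = nn.Linear(d_in, d_out)

    def forward(self, x, edges):                # x: (V, d_in), edges: (E, 2)
        src, dst = edges[:, 0], edges[:, 1]
        agg = torch.zeros_like(x)
        agg.index_add_(0, dst, x[src])          # sum neighbor features
        deg = torch.zeros(x.shape[0], 1).index_add_(
            0, dst, torch.ones(edges.shape[0], 1))
        return torch.relu(self.self_lin(x) + self.neigh_lin(agg / deg.clamp(min=1)))

verts = torch.rand(500, 3)                      # garment vertices
edges = torch.randint(0, 500, (2000, 2))        # mesh edges (made up here)
layer = GraphConv(3, 3)
displaced = verts + layer(verts, edges)         # one per-vertex deformation step
```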
arXiv Detail & Related papers (2020-09-09T22:38:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.