Garment Recovery with Shape and Deformation Priors
- URL: http://arxiv.org/abs/2311.10356v2
- Date: Mon, 11 Mar 2024 19:27:35 GMT
- Title: Garment Recovery with Shape and Deformation Priors
- Authors: Ren Li, Corentin Dumery, Benoît Guillard, Pascal Fua
- Abstract summary: We propose a method that delivers realistic garment models from real-world images, regardless of garment shape or deformation.
Not only does our approach recover the garment geometry accurately, it also yields models that can be directly used by downstream applications such as animation and simulation.
- Score: 51.41962835642731
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While modeling people wearing tight-fitting clothing has made great strides
in recent years, loose-fitting clothing remains a challenge. We propose a
method that delivers realistic garment models from real-world images,
regardless of garment shape or deformation. To this end, we introduce a fitting
approach that utilizes shape and deformation priors learned from synthetic data
to accurately capture garment shapes and deformations, including large ones.
Not only does our approach recover the garment geometry accurately, it also
yields models that can be directly used by downstream applications such as
animation and simulation.
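The fitting described in the abstract can be pictured as test-time optimization of low-dimensional latent codes under learned priors: the codes are decoded into a garment shape and a deformation, and are adjusted until the reconstructed garment explains the image evidence while staying close to the prior. The following is a minimal, hypothetical sketch of that idea; the stand-in MLP decoders, the render_silhouette callable, and all parameter values are assumptions for illustration, not the authors' implementation.

    # Hedged sketch: stand-in decoders for priors learned from synthetic garments;
    # the real priors capture plausible garment shapes and (possibly large) deformations.
    import torch
    import torch.nn as nn

    class LatentDecoder(nn.Module):
        """Maps a latent code to per-vertex 3D offsets on a garment template."""
        def __init__(self, latent_dim=32, num_vertices=1000):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(latent_dim, 256), nn.ReLU(),
                nn.Linear(256, num_vertices * 3),
            )

        def forward(self, z):
            return self.net(z).view(-1, 3)

    def fit_garment(image_silhouette, template_vertices, shape_prior, deform_prior,
                    render_silhouette, steps=500, lr=1e-2):
        """Optimize latent shape/deformation codes so the rendered garment matches
        the observed silhouette; the priors keep the result on the garment manifold."""
        z_shape = torch.zeros(1, 32, requires_grad=True)
        z_deform = torch.zeros(1, 32, requires_grad=True)
        opt = torch.optim.Adam([z_shape, z_deform], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            verts = template_vertices + shape_prior(z_shape) + deform_prior(z_deform)
            data_term = (render_silhouette(verts) - image_silhouette).abs().mean()
            reg_term = 1e-3 * (z_shape.norm() + z_deform.norm())  # stay close to the prior
            (data_term + reg_term).backward()
            opt.step()
        return (template_vertices + shape_prior(z_shape) + deform_prior(z_deform)).detach()

In the paper the priors are learned from synthetic data; the single silhouette loss above merely stands in for whatever image evidence the full method uses.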
Related papers
- Reconstruction of Manipulated Garment with Guided Deformation Prior [46.225397746681736]
We leverage the implicit sewing patterns (ISP) model for garment modeling and extend it by adding a diffusion-based deformation prior to represent shapes.
To recover 3D garment shapes from incomplete 3D point clouds acquired while the garment is folded, we map the points to the UV space in which our priors are learned to produce partial UV maps, then fit the priors to recover complete UV maps and 2D-to-3D mappings (a minimal sketch of this UV-space fitting step appears after this list).
arXiv Detail & Related papers (2024-05-17T17:39:29Z) - ShapeBoost: Boosting Human Shape Estimation with Part-Based
Parameterization and Clothing-Preserving Augmentation [58.50613393500561]
We propose ShapeBoost, a new human shape recovery framework.
It achieves pixel-level alignment even for rare body shapes and high accuracy for people wearing different types of clothes.
arXiv Detail & Related papers (2024-03-02T23:40:23Z) - AniDress: Animatable Loose-Dressed Avatar from Sparse Views Using
Garment Rigging Model [58.035758145894846]
We introduce AniDress, a novel method for generating animatable human avatars in loose clothes using very sparse multi-view videos.
A pose-driven deformable neural radiance field conditioned on both body and garment motions is introduced, providing explicit control of both parts.
Our method renders natural garment dynamics that deviate strongly from the body and generalizes well to both unseen views and poses.
arXiv Detail & Related papers (2024-01-27T08:48:18Z) - SPnet: Estimating Garment Sewing Patterns from a Single Image [10.604555099281173]
This paper presents a novel method for reconstructing 3D garment models from a single image of a posed user.
By inferring the fundamental shape of the garment through sewing patterns from a single image, we can generate 3D garments that can adaptively deform to arbitrary poses.
arXiv Detail & Related papers (2023-12-26T09:51:25Z) - Towards Loose-Fitting Garment Animation via Generative Model of
Deformation Decomposition [4.627632792164547]
We develop a garment generative model based on deformation decomposition to efficiently simulate loose garment deformation without using linear skinning.
Through extensive experiments, we demonstrate that our method outperforms state-of-the-art data-driven alternatives, and we provide qualitative and quantitative analyses of the results.
arXiv Detail & Related papers (2023-12-22T11:26:51Z) - GAPS: Geometry-Aware, Physics-Based, Self-Supervised Neural Garment Draping [8.60320342646772]
Recent neural, physics-based modeling of garment deformations yields faster and more visually appealing results.
The formulation uses material-specific parameters to control garment inextensibility.
We propose a geometry-aware garment skinning method by defining a body-garment closeness measure.
arXiv Detail & Related papers (2023-12-03T19:21:53Z) - Towards Garment Sewing Pattern Reconstruction from a Single Image [76.97825595711444]
A garment sewing pattern represents the intrinsic rest shape of a garment and is central to many applications such as fashion design, virtual try-on, and digital avatars.
We first synthesize a versatile dataset, named SewFactory, which consists of around 1M images and ground-truth sewing patterns.
We then propose a two-level Transformer network called Sewformer, which significantly improves the sewing pattern prediction performance.
arXiv Detail & Related papers (2023-11-07T18:59:51Z) - DIG: Draping Implicit Garment over the Human Body [56.68349332089129]
We propose an end-to-end differentiable pipeline that represents garments using implicit surfaces and learns a skinning field conditioned on shape and pose parameters of an articulated body model.
We show that our method, thanks to its end-to-end differentiability, allows body and garment parameters to be recovered jointly from image observations.
arXiv Detail & Related papers (2022-09-22T08:13:59Z) - Self-Supervised Collision Handling via Generative 3D Garment Models for
Virtual Try-On [29.458328272854107]
We propose a new generative model for 3D garment deformations that enables us to learn, for the first time, a data-driven method for virtual try-on.
We show that our method is the first to successfully address garment-body contact in unseen body shapes and motions, without compromising realism and detail.
arXiv Detail & Related papers (2021-05-13T17:58:20Z)
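The "Reconstruction of Manipulated Garment with Guided Deformation Prior" entry above describes a concrete pipeline: project the observed, incomplete point cloud into the garment's UV space to obtain a partial UV position map, then fit a learned deformation prior to complete it. The sketch below illustrates that fitting step under stated assumptions; the stand-in MLP prior, tensor shapes, and loss are illustrative only (the actual prior in that work is diffusion-based and learned in UV space).

    # Hedged sketch of UV-map completion by fitting a learned prior to observed pixels.
    import torch
    import torch.nn as nn

    class UVPrior(nn.Module):
        """Stand-in prior: decodes a latent code into a dense UV position map (H, W, 3),
        i.e. a 3D point for every UV pixel of the garment panel."""
        def __init__(self, latent_dim=64, size=64):
            super().__init__()
            self.size = size
            self.net = nn.Sequential(
                nn.Linear(latent_dim, 512), nn.ReLU(),
                nn.Linear(512, size * size * 3),
            )

        def forward(self, z):
            return self.net(z).view(self.size, self.size, 3)

    def complete_uv_map(partial_uv, observed_mask, prior, steps=300, lr=1e-2):
        """partial_uv: (H, W, 3) 3D positions of points projected into UV space;
        observed_mask: (H, W) bool marking UV pixels covered by the point cloud.
        Fit the prior's latent code to the observed pixels and return a complete map."""
        z = torch.zeros(1, 64, requires_grad=True)
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            full = prior(z)
            data_term = ((full - partial_uv)[observed_mask] ** 2).mean()
            (data_term + 1e-3 * z.norm()).backward()
            opt.step()
        return prior(z).detach()  # completed map; unobserved pixels are filled in by the prior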
This list is automatically generated from the titles and abstracts of the papers on this site.