DIG: Draping Implicit Garment over the Human Body
- URL: http://arxiv.org/abs/2209.10845v2
- Date: Sat, 24 Sep 2022 12:53:32 GMT
- Title: DIG: Draping Implicit Garment over the Human Body
- Authors: Ren Li, Benoît Guillard, Edoardo Remelli, Pascal Fua
- Abstract summary: We propose an end-to-end differentiable pipeline that represents garments using implicit surfaces and learns a skinning field conditioned on the shape and pose parameters of an articulated body model.
We show that, thanks to its end-to-end differentiability, our method can recover body and garment parameters jointly from image observations.
- Score: 56.68349332089129
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing data-driven methods for draping garments over human bodies, despite
being effective, cannot handle garments of arbitrary topology and are typically
not end-to-end differentiable. To address these limitations, we propose an
end-to-end differentiable pipeline that represents garments using implicit
surfaces and learns a skinning field conditioned on shape and pose parameters
of an articulated body model. To limit body-garment interpenetrations and
artifacts, we propose an interpenetration-aware pre-processing strategy for the
training data and a novel training loss that penalizes self-intersections while
draping garments. We demonstrate that our method yields more accurate garment
reconstructions and deformations than state-of-the-art methods. Furthermore,
we show that our method, thanks to its end-to-end differentiability, can
recover body and garment parameters jointly from image observations, something
that previous work could not do.
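The abstract's draping pipeline rests on two concrete ingredients: posing garment surface points with blend skinning driven by a (learned) skinning field, and penalizing points that end up inside the body. A minimal NumPy sketch of both steps, under the assumption of standard linear blend skinning and a hinge-style penetration penalty; all function and array names here are illustrative, not from the paper:

```python
import numpy as np

def blend_skinning(points, skin_weights, joint_transforms):
    """Deform garment surface points with linear blend skinning (LBS).

    points:           (N, 3) points sampled on the garment surface
    skin_weights:     (N, J) per-point weights over J body joints, rows sum to 1
                      (in a learned setting these would come from a skinning
                      field conditioned on body shape and pose; here they are
                      just an input array)
    joint_transforms: (J, 4, 4) rigid transforms posing each joint
    """
    n = points.shape[0]
    homo = np.concatenate([points, np.ones((n, 1))], axis=1)            # (N, 4)
    # Per-point blended transform: sum_j w_ij * T_j
    blended = np.einsum("nj,jab->nab", skin_weights, joint_transforms)  # (N, 4, 4)
    posed = np.einsum("nab,nb->na", blended, homo)                      # (N, 4)
    return posed[:, :3]

def interpenetration_loss(garment_points, body_sdf):
    """Hinge penalty on garment points that fall inside the body.

    body_sdf: callable mapping (N, 3) points to (N,) signed distances,
              negative inside the body.
    """
    d = body_sdf(garment_points)
    return np.maximum(-d, 0.0).sum()
```

With identity joint transforms the skinning step leaves points unchanged, and the loss is zero whenever every garment point lies outside the body; both functions are differentiable almost everywhere, which is what allows gradients from image observations to flow back to body and garment parameters.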
Related papers
- Neural Garment Dynamics via Manifold-Aware Transformers [26.01911475040001]
We take a different approach and model the dynamics of a garment by exploiting its local interactions with the underlying human body.
Specifically, as the body moves, we detect local garment-body collisions, which drive the deformation of the garment.
At the core of our approach is a mesh-agnostic garment representation and a manifold-aware transformer network design.
arXiv Detail & Related papers (2024-05-13T11:05:52Z)
- Towards Loose-Fitting Garment Animation via Generative Model of Deformation Decomposition [4.627632792164547]
We develop a garment generative model based on deformation decomposition to efficiently simulate loose garment deformation without using linear skinning.
We demonstrate our method outperforms state-of-the-art data-driven alternatives through extensive experiments and show qualitative and quantitative analysis of results.
arXiv Detail & Related papers (2023-12-22T11:26:51Z)
- GAPS: Geometry-Aware, Physics-Based, Self-Supervised Neural Garment Draping [8.60320342646772]
Recent neural, physics-based modeling of garment deformations enables faster and more visually pleasing results.
The formulation uses material-specific parameters to control garment inextensibility.
We propose a geometry-aware garment skinning method by defining a body-garment closeness measure.
arXiv Detail & Related papers (2023-12-03T19:21:53Z)
- Garment Recovery with Shape and Deformation Priors [51.41962835642731]
We propose a method that delivers realistic garment models from real-world images, regardless of garment shape or deformation.
Not only does our approach recover the garment geometry accurately, it also yields models that can be directly used by downstream applications such as animation and simulation.
arXiv Detail & Related papers (2023-11-17T07:06:21Z)
- HOOD: Hierarchical Graphs for Generalized Modelling of Clothing Dynamics [84.29846699151288]
Our method is agnostic to body shape and applies to tight-fitting garments as well as loose, free-flowing clothing.
As one key contribution, we propose a hierarchical message-passing scheme that efficiently propagates stiff stretching modes.
arXiv Detail & Related papers (2022-12-14T14:24:00Z)
- DrapeNet: Garment Generation and Self-Supervised Draping [95.0315186890655]
We rely on self-supervision to train a single network to drape multiple garments.
This is achieved by predicting a 3D deformation field conditioned on the latent codes of a generative network.
Our pipeline can generate and drape previously unseen garments of any topology.
arXiv Detail & Related papers (2022-11-21T09:13:53Z)
- Towards Scalable Unpaired Virtual Try-On via Patch-Routed Spatially-Adaptive GAN [66.3650689395967]
We propose a texture-preserving end-to-end network, the PAtch-routed SpaTially-Adaptive GAN (PASTA-GAN), that facilitates real-world unpaired virtual try-on.
To disentangle the style and spatial information of each garment, PASTA-GAN consists of an innovative patch-routed disentanglement module.
arXiv Detail & Related papers (2021-11-20T08:36:12Z)
- BCNet: Learning Body and Cloth Shape from A Single Image [56.486796244320125]
We propose a layered garment representation on top of SMPL and, as a novelty, make the garment's skinning weights independent of the body mesh.
Compared with existing methods, our method can support more garment categories and recover more accurate geometry.
arXiv Detail & Related papers (2020-04-01T03:41:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.