ISP: Multi-Layered Garment Draping with Implicit Sewing Patterns
- URL: http://arxiv.org/abs/2305.14100v2
- Date: Sat, 14 Oct 2023 15:09:43 GMT
- Title: ISP: Multi-Layered Garment Draping with Implicit Sewing Patterns
- Authors: Ren Li, Benoît Guillard, Pascal Fua
- Abstract summary: We introduce a garment representation model that addresses limitations of current approaches.
It is faster and yields higher quality reconstructions than purely implicit surface representations.
It supports rapid editing of garment shapes and texture by modifying individual 2D panels.
- Score: 57.176642106425895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many approaches to draping individual garments on human body models are
realistic, fast, and yield outputs that are differentiable with respect to the
body shape on which they are draped. However, they are either unable to handle
multi-layered clothing, which is prevalent in everyday dress, or restricted to
bodies in T-pose. In this paper, we introduce a parametric garment
representation model that addresses these limitations. As in models used by
clothing designers, each garment consists of individual 2D panels. Their 2D
shape is defined by a Signed Distance Function and 3D shape by a 2D to 3D
mapping. The 2D parameterization enables easy detection of potential collisions
and the 3D parameterization handles complex shapes effectively. We show that
this combination is faster and yields higher quality reconstructions than
purely implicit surface representations, and makes the recovery of layered
garments from images possible thanks to its differentiability. Furthermore, it
supports rapid editing of garment shapes and texture by modifying individual 2D
panels.
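The abstract's core idea, a 2D panel whose shape is given by a Signed Distance Function plus a 2D-to-3D mapping that lifts the panel into space, can be illustrated with a minimal sketch. This is not the paper's implementation: in ISP both the SDF and the mapping are learned networks, whereas here a circular panel and a fixed analytic bump stand in for them.

```python
import numpy as np

def circle_panel_sdf(uv, radius=0.5):
    """Toy panel SDF: negative inside the 2D panel, positive outside.
    (In ISP the panel SDF is learned, not a fixed circle.)"""
    return np.linalg.norm(uv, axis=-1) - radius

def lift_to_3d(uv, bulge=0.2):
    """Toy 2D-to-3D mapping: lift panel points onto a smooth bump.
    (In ISP this mapping is also a learned network.)"""
    r2 = np.sum(uv ** 2, axis=-1)
    z = bulge * np.exp(-4.0 * r2)
    return np.concatenate([uv, z[..., None]], axis=-1)

# Sample a grid of 2D (u, v) points and keep those inside the panel (SDF < 0);
# the surviving points are then mapped to a 3D surface patch.
u = np.linspace(-1.0, 1.0, 64)
grid = np.stack(np.meshgrid(u, u), axis=-1).reshape(-1, 2)
inside = grid[circle_panel_sdf(grid) < 0]
points_3d = lift_to_3d(inside)
```

The split mirrors the abstract's claim: proximity tests (e.g. collision checks) stay cheap in the flat 2D parameterization, while the lifting function alone carries the 3D complexity.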
Related papers
- Exploring Shape Embedding for Cloth-Changing Person Re-Identification
via 2D-3D Correspondences [9.487097819140653]
We propose a new shape embedding paradigm for cloth-changing ReID.
The shape embedding paradigm based on 2D-3D correspondences remarkably enhances the model's global understanding of human body shape.
To promote the study of ReID under clothing change, we construct 3D Dense Persons (DP3D), which is the first large-scale cloth-changing ReID dataset.
arXiv Detail & Related papers (2023-10-27T19:26:30Z) - Cloth2Body: Generating 3D Human Body Mesh from 2D Clothing [54.29207348918216]
Cloth2Body needs to address new and emerging challenges raised by the partial observation of the input and the high diversity of the output.
We propose an end-to-end framework that can accurately estimate 3D body mesh parameterized by pose and shape from a 2D clothing image.
As shown by experimental results, the proposed framework achieves state-of-the-art performance and can effectively recover natural and diverse 3D body meshes from 2D images.
arXiv Detail & Related papers (2023-09-28T06:18:38Z) - PersonalTailor: Personalizing 2D Pattern Design from 3D Garment Point Clouds [59.617014796845865]
Garment pattern design aims to convert a 3D garment to the corresponding 2D panels and their sewing structure.
PersonalTailor is a personalized 2D pattern design method, where the user can input specific constraints or demands.
It first learns multi-modal panel embeddings based on unsupervised cross-modal association and attentive fusion.
It then predicts binary panel masks individually using a transformer encoder-decoder framework.
arXiv Detail & Related papers (2023-03-17T00:03:38Z) - ECON: Explicit Clothed humans Optimized via Normal integration [54.51948104460489]
We present ECON, a method for creating 3D humans in loose clothes.
It infers detailed 2D maps for the front and back side of a clothed person.
It "inpaints" the missing geometry between d-BiNI surfaces.
arXiv Detail & Related papers (2022-12-14T18:59:19Z) - Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z) - Realistic, Animatable Human Reconstructions for Virtual Fit-On [0.7649716717097428]
We present an end-to-end virtual try-on pipeline that can fit different clothes on a personalized 3-D human model.
Our main idea is to construct an animatable 3-D human model and try-on different clothes in a 3-D virtual environment.
arXiv Detail & Related papers (2022-10-16T13:36:24Z) - Cross-Modal 3D Shape Generation and Manipulation [62.50628361920725]
We propose a generic multi-modal generative model that couples the 2D modalities and implicit 3D representations through shared latent spaces.
We evaluate our framework on two representative 2D modalities of grayscale line sketches and rendered color images.
arXiv Detail & Related papers (2022-07-24T19:22:57Z) - Garment4D: Garment Reconstruction from Point Cloud Sequences [12.86951061306046]
Learning to reconstruct 3D garments is important for dressing 3D human bodies of different shapes in different poses.
Previous works typically rely on 2D images as input, which, however, suffer from scale and pose ambiguities.
We propose a principled framework, Garment4D, that uses 3D point cloud sequences of dressed humans for garment reconstruction.
arXiv Detail & Related papers (2021-12-08T08:15:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.