Single View Garment Reconstruction Using Diffusion Mapping Via Pattern Coordinates
- URL: http://arxiv.org/abs/2504.08353v1
- Date: Fri, 11 Apr 2025 08:39:18 GMT
- Title: Single View Garment Reconstruction Using Diffusion Mapping Via Pattern Coordinates
- Authors: Ren Li, Cong Cao, Corentin Dumery, Yingxuan You, Hao Li, Pascal Fua
- Abstract summary: Reconstructing 3D clothed humans from images is fundamental to applications like virtual try-on, avatar creation, and mixed reality. We present a novel method for high-fidelity 3D garment reconstruction from single images that bridges 2D and 3D representations.
- Score: 45.48311596587306
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Reconstructing 3D clothed humans from images is fundamental to applications like virtual try-on, avatar creation, and mixed reality. While recent advances have enhanced human body recovery, accurate reconstruction of garment geometry -- especially for loose-fitting clothing -- remains an open challenge. We present a novel method for high-fidelity 3D garment reconstruction from single images that bridges 2D and 3D representations. Our approach combines Implicit Sewing Patterns (ISP) with a generative diffusion model to learn rich garment shape priors in a 2D UV space. A key innovation is our mapping model that establishes correspondences between 2D image pixels, UV pattern coordinates, and 3D geometry, enabling joint optimization of both 3D garment meshes and the corresponding 2D patterns by aligning learned priors with image observations. Despite training exclusively on synthetically simulated cloth data, our method generalizes effectively to real-world images, outperforming existing approaches on both tight- and loose-fitting garments. The reconstructed garments maintain physical plausibility while capturing fine geometric details, enabling downstream applications including garment retargeting and texture manipulation.
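The abstract describes a mapping model that links 2D image pixels to UV pattern coordinates and then to 3D geometry. The toy sketch below illustrates that chained-correspondence idea only; it is not the paper's actual model. The function names, the linear pixel-to-UV mapping, and the cylindrical UV-to-3D lift are all illustrative assumptions standing in for the learned mapping model and the ISP parameterization.

```python
import math

def pixel_to_uv(pixel):
    """Toy stand-in for the learned mapping model: image pixel -> UV coords.
    Assumes a 256x256 image mapped linearly onto the unit UV square."""
    x, y = pixel
    return (x / 255.0, y / 255.0)

def uv_to_3d(uv):
    """Toy stand-in for an ISP-style UV -> 3D parameterization:
    wraps the 2D pattern onto a cylinder as a placeholder garment surface."""
    u, v = uv
    theta = 2.0 * math.pi * u
    return (math.cos(theta), v, math.sin(theta))

def reconstruct(pixels):
    """Chain the two mappings: pixels -> UV pattern coords -> 3D points.
    In the paper, both stages would instead be optimized jointly against
    image observations and the learned 2D shape prior."""
    return [uv_to_3d(pixel_to_uv(p)) for p in pixels]

points = reconstruct([(0, 0), (128, 128), (255, 255)])
```

The point of factoring the pipeline this way is that the intermediate UV coordinates tie each image observation to a location on the 2D sewing pattern, which is what allows the mesh and the pattern to be optimized together.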
Related papers
- DiffusedWrinkles: A Diffusion-Based Model for Data-Driven Garment Animation [10.9550231281676]
We present a data-driven method for learning to generate animations of 3D garments using a 2D image diffusion model.
Our approach is able to synthesize high-quality 3D animations for a wide variety of garments and body shapes.
arXiv Detail & Related papers (2025-03-24T06:08:26Z) - Dress-1-to-3: Single Image to Simulation-Ready 3D Outfit with Diffusion Prior and Differentiable Physics [27.697150953628572]
This paper focuses on 3D garment generation, a key area for applications like virtual try-on with dynamic garment animations. We introduce Dress-1-to-3, a novel pipeline that reconstructs physics-plausible, simulation-ready separated garments with sewing patterns and humans from an in-the-wild image.
arXiv Detail & Related papers (2025-02-05T18:49:03Z) - AG3D: Learning to Generate 3D Avatars from 2D Image Collections [96.28021214088746]
We propose a new adversarial generative model of realistic 3D people from 2D images.
Our method captures shape and deformation of the body and loose clothing by adopting a holistic 3D generator.
We experimentally find that our method outperforms previous 3D- and articulation-aware methods in terms of geometry and appearance.
arXiv Detail & Related papers (2023-05-03T17:56:24Z) - USR: Unsupervised Separated 3D Garment and Human Reconstruction via Geometry and Semantic Consistency [41.89803177312638]
We propose an unsupervised separated 3D garment and human reconstruction model (USR), which reconstructs the human body and authentic textured clothes in layers without 3D supervision.
Our method proposes a generalized surface-aware neural radiance field to learn the mapping between sparse multi-view images and geometries of the dressed people.
arXiv Detail & Related papers (2023-02-21T08:48:27Z) - Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z) - 3D Magic Mirror: Clothing Reconstruction from a Single Image via a Causal Perspective [96.65476492200648]
This research studies a self-supervised 3D clothing reconstruction method that recovers the geometry and texture of human clothing from a single 2D image.
arXiv Detail & Related papers (2022-04-27T17:46:55Z) - Garment4D: Garment Reconstruction from Point Cloud Sequences [12.86951061306046]
Learning to reconstruct 3D garments is important for dressing 3D human bodies of different shapes in different poses.
Previous works typically rely on 2D images as input, which, however, suffer from scale and pose ambiguities.
We propose a principled framework, Garment4D, that uses 3D point cloud sequences of dressed humans for garment reconstruction.
arXiv Detail & Related papers (2021-12-08T08:15:20Z) - Deep Fashion3D: A Dataset and Benchmark for 3D Garment Reconstruction from Single Images [50.34202789543989]
Deep Fashion3D is the largest collection to date of 3D garment models.
It provides rich annotations including 3D feature lines, 3D body pose, and the corresponding multi-view real images.
A novel adaptable template is proposed to enable the learning of all types of clothing in a single network.
arXiv Detail & Related papers (2020-03-28T09:20:04Z) - Learning to Transfer Texture from Clothing Images to 3D Humans [50.838970996234465]
We present a method to automatically transfer textures of clothing images to 3D garments worn on top of SMPL, in real time.
We first compute training pairs of images with aligned 3D garments using a custom non-rigid 3D to 2D registration method, which is accurate but slow.
Our model opens the door to applications such as virtual try-on, and allows for the generation of 3D humans with varied textures, which is necessary for learning.
arXiv Detail & Related papers (2020-03-04T12:53:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.