SPnet: Estimating Garment Sewing Patterns from a Single Image
- URL: http://arxiv.org/abs/2312.16264v1
- Date: Tue, 26 Dec 2023 09:51:25 GMT
- Title: SPnet: Estimating Garment Sewing Patterns from a Single Image
- Authors: Seungchan Lim, Sumin Kim, Sung-Hee Lee
- Abstract summary: This paper presents a novel method for reconstructing 3D garment models from a single image of a posed user.
By inferring the fundamental shape of the garment through sewing patterns from a single image, we can generate 3D garments that can adaptively deform to arbitrary poses.
- Score: 10.604555099281173
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a novel method for reconstructing 3D garment models from
a single image of a posed user. Previous studies, which primarily focused on
accurately reconstructing garment geometry to match the input image, often
produce garments that look unnatural when deformed to new poses. To overcome
this limitation, we take a different approach: we infer the fundamental shape
of the garment through sewing patterns from a single image, rather than
directly reconstructing 3D garments. Our method consists of
two stages. First, given a single image of a posed user, it predicts the
garment image as worn in a T-pose, representing the baseline form of the garment.
Then, it estimates the sewing pattern parameters based on the T-pose garment
image. By simulating the stitching and draping of the sewing pattern using
physics simulation, we can generate 3D garments that can adaptively deform to
arbitrary poses. The effectiveness of our method is validated through ablation
studies on the major components and a comparison with other approaches.
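To make the two-stage design concrete, the following is a minimal PyTorch sketch of the pipeline as the abstract describes it. The network architectures, the image resolution, and the size of the sewing pattern parameter vector are placeholder assumptions (the paper's actual networks are not specified here), and the physics-based stitching and draping stage is only indicated by a comment.
```python
# Sketch of the two-stage pipeline from the abstract. All module names,
# the 10-parameter pattern vector, and the resolution are illustrative
# assumptions, not the authors' actual architecture.
import torch
import torch.nn as nn

class TPoseGarmentPredictor(nn.Module):
    """Stage 1: posed garment image -> garment image as worn in a T-pose."""
    def __init__(self):
        super().__init__()
        # A toy encoder-decoder standing in for the real image-to-image net.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, posed_image):
        return self.decoder(self.encoder(posed_image))

class SewingPatternRegressor(nn.Module):
    """Stage 2: T-pose garment image -> sewing pattern parameters."""
    def __init__(self, num_params=10):  # assumed parameter count
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_params),
        )
    def forward(self, tpose_image):
        return self.backbone(tpose_image)

posed = torch.rand(1, 3, 256, 256)         # single image of a posed user
tpose = TPoseGarmentPredictor()(posed)     # stage 1
pattern = SewingPatternRegressor()(tpose)  # stage 2
# The predicted parameters would then drive a cloth simulator that
# stitches and drapes the pattern (not shown here).
print(pattern.shape)  # torch.Size([1, 10])
```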
Related papers
- Reconstruction of Manipulated Garment with Guided Deformation Prior [46.225397746681736]
We leverage the implicit sewing patterns (ISP) model for garment modeling and extend it by adding a diffusion-based deformation prior to represent shapes.
To recover 3D garment shapes from incomplete 3D point clouds acquired when the garment is folded, we map the points to UV space, in which our priors are learned, to produce partial UV maps, and then fit the priors to recover complete UV maps and 2D to 3D mappings.
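As a rough illustration of the fitting step described above (not the authors' actual diffusion-based prior), the sketch below optimizes the latent code of a toy UV-map decoder so that its output matches a partial UV map on the observed pixels, letting the learned prior fill in the missing regions.
```python
# Fit a learned prior to a partial UV map. The decoder is a toy stand-in
# for the real deformation prior; its architecture is assumed.
import torch
import torch.nn as nn

class UVPriorDecoder(nn.Module):
    """Toy decoder: latent code -> dense UV position map (3 channels = xyz)."""
    def __init__(self, latent_dim=64, uv_res=32):
        super().__init__()
        self.uv_res = uv_res
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 3 * uv_res * uv_res),
        )
    def forward(self, z):
        return self.net(z).view(-1, 3, self.uv_res, self.uv_res)

decoder = UVPriorDecoder()  # pretend this was trained as a shape prior

# Partial observation: xyz values at some UV pixels, plus a validity mask
# (in practice produced by mapping the observed points to UV space).
partial_uv = torch.randn(1, 3, 32, 32)
mask = (torch.rand(1, 1, 32, 32) > 0.7).float()  # ~30% of pixels observed

z = torch.zeros(1, 64, requires_grad=True)
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    pred = decoder(z)
    # Fit only where the partial UV map has data; the prior fills the rest.
    loss = ((pred - partial_uv) ** 2 * mask).sum() / mask.sum().clamp(min=1)
    loss.backward()
    opt.step()

complete_uv = decoder(z).detach()  # completed UV map -> 3D garment surface
```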
arXiv Detail & Related papers (2024-05-17T17:39:29Z)
- Garment Recovery with Shape and Deformation Priors [51.41962835642731]
We propose a method that delivers realistic garment models from real-world images, regardless of garment shape or deformation.
Not only does our approach recover the garment geometry accurately, it also yields models that can be directly used by downstream applications such as animation and simulation.
arXiv Detail & Related papers (2023-11-17T07:06:21Z)
- Towards Garment Sewing Pattern Reconstruction from a Single Image [76.97825595711444]
A garment sewing pattern represents the intrinsic rest shape of a garment and is the core of many applications like fashion design, virtual try-on, and digital avatars.
We first synthesize a versatile dataset, named SewFactory, which consists of around 1M images and ground-truth sewing patterns.
We then propose a two-level Transformer network called Sewformer, which significantly improves the sewing pattern prediction performance.
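The summary does not spell out Sewformer's architecture, but a two-level Transformer for sewing pattern prediction could plausibly decode panel tokens first and edge parameters second. The PyTorch sketch below is an assumption-laden illustration of that idea; the query counts, dimensions, and edge parameterization are all invented for the example.
```python
# Two-level decoding: panel queries attend over image features, then edge
# queries attend over each panel token. Purely illustrative dimensions.
import torch
import torch.nn as nn

class TwoLevelSewformerSketch(nn.Module):
    def __init__(self, d=128, num_panels=8, edges_per_panel=6, edge_dim=4):
        super().__init__()
        self.panel_queries = nn.Parameter(torch.randn(num_panels, d))
        self.edge_queries = nn.Parameter(torch.randn(edges_per_panel, d))
        panel_layer = nn.TransformerDecoderLayer(d, nhead=4, batch_first=True)
        edge_layer = nn.TransformerDecoderLayer(d, nhead=4, batch_first=True)
        self.panel_decoder = nn.TransformerDecoder(panel_layer, num_layers=2)
        self.edge_decoder = nn.TransformerDecoder(edge_layer, num_layers=2)
        self.edge_head = nn.Linear(d, edge_dim)  # e.g. endpoints + curvature

    def forward(self, image_tokens):  # (B, N, d) patch features from a backbone
        B = image_tokens.size(0)
        # Level 1: panel queries attend over image features -> one token/panel.
        panels = self.panel_decoder(
            self.panel_queries.expand(B, -1, -1), image_tokens)
        # Level 2: edge queries attend over each panel token independently.
        P = panels.size(1)
        edges = self.edge_decoder(
            self.edge_queries.expand(B * P, -1, -1),
            panels.reshape(B * P, 1, -1))
        out = self.edge_head(edges)              # (B*P, E, edge_dim)
        return out.view(B, P, -1, out.size(-1))  # (B, P, E, edge_dim)

model = TwoLevelSewformerSketch()
tokens = torch.randn(2, 196, 128)  # e.g. a 14x14 grid of image patch features
print(model(tokens).shape)         # torch.Size([2, 8, 6, 4])
```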
arXiv Detail & Related papers (2023-11-07T18:59:51Z)
- DeepIron: Predicting Unwarped Garment Texture from a Single Image [9.427635404752934]
This paper presents a novel framework that reconstructs the texture map of a 3D garment from a single image of a posed person.
A key component of our framework, the Texture Unwarper, infers the original texture image from the input clothing image.
By inferring the unwarped original texture of the input garment, our method helps reconstruct 3D garment models that can show high-quality texture images realistically deformed for new poses.
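One way to see why an unwarped texture helps: once the flat texture map is recovered, re-texturing the garment in a new pose reduces to a UV lookup. The sketch below illustrates this with torch.nn.functional.grid_sample and a made-up UV map; it does not reproduce the Texture Unwarper network itself.
```python
# Re-texture a reposed garment by sampling the flat texture at per-pixel
# UV coordinates. All tensors here are placeholders.
import torch
import torch.nn.functional as F

texture = torch.rand(1, 3, 512, 512)  # unwarped texture (assumed output)
# Per-pixel UV coordinates of the garment rendered in a new pose, in
# [-1, 1] as grid_sample expects; normally produced by a renderer.
uv = torch.rand(1, 256, 256, 2) * 2 - 1
reposed = F.grid_sample(texture, uv, align_corners=True)
print(reposed.shape)  # torch.Size([1, 3, 256, 256])
```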
arXiv Detail & Related papers (2023-10-24T01:44:11Z)
- Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z)
- NeuralTailor: Reconstructing Sewing Pattern Structures from 3D Point Clouds of Garments [7.331799534004012]
We propose to use a garment sewing pattern to facilitate the intrinsic garment shape estimation.
We introduce NeuralTailor, a novel architecture based on point-level attention for set regression with variable cardinality.
Our experiments show that NeuralTailor successfully reconstructs sewing patterns and generalizes to garment types with pattern topologies unseen during training.
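As one hedged reading of "point-level attention for set regression with variable cardinality", the sketch below lets learned queries attend over per-point features and gives each query an existence probability so the predicted set can vary in size; it is an illustration, not NeuralTailor's actual head.
```python
# Set regression with variable cardinality via learned queries plus a
# per-query existence probability. Dimensions and the panel
# parameterization are assumptions.
import torch
import torch.nn as nn

class SetRegressionHead(nn.Module):
    def __init__(self, d=128, max_elems=12, out_dim=16):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(max_elems, d))
        self.attn = nn.MultiheadAttention(d, num_heads=4, batch_first=True)
        self.param_head = nn.Linear(d, out_dim)  # e.g. panel edge parameters
        self.exist_head = nn.Linear(d, 1)        # is this element present?

    def forward(self, point_feats):  # (B, N, d) per-point features
        B = point_feats.size(0)
        q = self.queries.expand(B, -1, -1)
        pooled, _ = self.attn(q, point_feats, point_feats)
        return self.param_head(pooled), torch.sigmoid(self.exist_head(pooled))

head = SetRegressionHead()
feats = torch.randn(2, 2048, 128)  # e.g. PointNet-style point features
params, exists = head(feats)
# Keep only elements the model believes exist -> variable-cardinality set.
print(params.shape, (exists[0, :, 0] > 0.5).sum().item())
```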
arXiv Detail & Related papers (2022-01-31T08:33:49Z)
- Neural 3D Clothes Retargeting from a Single Image [91.5030622330039]
We present a method of clothes retargeting: generating the potential poses and deformations of a given 3D clothing template model to fit onto a person in a single RGB image.
The problem is fundamentally ill-posed because the ground-truth data is unattainable, i.e., images of people wearing different 3D clothing template models in exactly the same pose.
We propose a semi-supervised learning framework that validates the physical plausibility of the 3D deformation by matching it with the prescribed body-to-cloth contact points and by fitting the clothing silhouette to the unlabeled image silhouette.
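To illustrate the two supervision signals just mentioned (not the paper's exact formulation), the sketch below combines a contact term that pulls prescribed cloth vertices onto their body contact points with a soft-IoU silhouette term; the differentiable rendering step is replaced by placeholder masks.
```python
# Two plausibility losses for fitting a cloth template to an image.
import torch

def contact_loss(cloth_verts, contact_idx, body_points):
    # cloth_verts: (V, 3); contact_idx: (K,) indices into the cloth mesh;
    # body_points: (K, 3) prescribed body-to-cloth contact locations.
    return ((cloth_verts[contact_idx] - body_points) ** 2).sum(-1).mean()

def silhouette_loss(pred_mask, image_mask):
    # Soft IoU between the rendered cloth mask and the observed silhouette.
    inter = (pred_mask * image_mask).sum()
    union = (pred_mask + image_mask - pred_mask * image_mask).sum()
    return 1.0 - inter / union.clamp(min=1e-6)

cloth_verts = torch.randn(500, 3, requires_grad=True)
contact_idx = torch.arange(10)    # prescribed cloth vertex ids (toy)
body_points = torch.randn(10, 3)  # their target locations on the body
rendered = torch.rand(128, 128)   # stand-in for a differentiable render
observed = (torch.rand(128, 128) > 0.5).float()  # unlabeled silhouette

loss = contact_loss(cloth_verts, contact_idx, body_points) \
       + silhouette_loss(rendered, observed)
loss.backward()  # gradients would drive the template's pose/deformation
```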
arXiv Detail & Related papers (2021-01-29T20:50:34Z)
- BCNet: Learning Body and Cloth Shape from A Single Image [56.486796244320125]
We propose a layered garment representation on top of SMPL and, as a novelty, make the garment's skinning weights independent of the body mesh.
Compared with existing methods, our method can support more garment categories and recover more accurate geometry.
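The sketch below shows what garment skinning weights that are independent of the body mesh mean in practice: standard linear blend skinning in which the garment layer carries its own per-vertex weight matrix instead of inheriting weights from nearby body vertices. The joint count and bone transforms are toy placeholders; SMPL itself is not loaded.
```python
# Linear blend skinning with garment-specific weights.
import torch

def linear_blend_skinning(verts, weights, bone_mats):
    # verts: (V, 3); weights: (V, J) per-garment-vertex skinning weights;
    # bone_mats: (J, 4, 4) rigid transforms per joint.
    V = verts.shape[0]
    homo = torch.cat([verts, torch.ones(V, 1)], dim=1)          # (V, 4)
    per_vert = torch.einsum("vj,jab->vab", weights, bone_mats)  # (V, 4, 4)
    return torch.einsum("vab,vb->va", per_vert, homo)[:, :3]

V, J = 1000, 24  # garment vertices, SMPL-like joint count
garment_verts = torch.randn(V, 3)
# Learned per-garment weights, softmax-normalized so each row sums to one;
# crucially they need not match any body vertex's weights.
garment_weights = torch.softmax(torch.randn(V, J), dim=1)
bones = torch.eye(4).expand(J, 4, 4).clone()  # identity pose for the demo
posed = linear_blend_skinning(garment_verts, garment_weights, bones)
print(posed.shape)  # torch.Size([1000, 3])
```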
arXiv Detail & Related papers (2020-04-01T03:41:36Z)
- Deep Fashion3D: A Dataset and Benchmark for 3D Garment Reconstruction from Single Images [50.34202789543989]
Deep Fashion3D is the largest collection to date of 3D garment models.
It provides rich annotations including 3D feature lines, 3D body pose, and the corresponding multi-view real images.
A novel adaptable template is proposed to enable the learning of all types of clothing in a single network.
arXiv Detail & Related papers (2020-03-28T09:20:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.