Generating Datasets of 3D Garments with Sewing Patterns
- URL: http://arxiv.org/abs/2109.05633v1
- Date: Sun, 12 Sep 2021 23:03:48 GMT
- Title: Generating Datasets of 3D Garments with Sewing Patterns
- Authors: Maria Korosteleva, Sung-Hee Lee
- Abstract summary: We create the first large-scale synthetic dataset of 3D garment models with their sewing patterns.
The dataset contains more than 20,000 garment design variations produced from 19 different base types.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Garments are ubiquitous in the real world and in many virtual ones.
They are highly deformable objects that exhibit an immense variety of designs
and shapes, and yet most garments are created from a set of regularly shaped
flat pieces. Exploring garment structure presents a peculiar case for object
structure estimation and might prove useful for downstream tasks of neural 3D
garment modeling and reconstruction by providing a strong prior on garment
shapes. To facilitate research in these directions, we propose a method for
generating large synthetic datasets of 3D garment designs and their sewing
patterns. Our method consists of a flexible description structure for
specifying parametric sewing pattern templates and an automatic generation
pipeline that produces 3D garment models with little to no manual intervention.
To add realism, the pipeline additionally creates corrupted versions of the
final meshes that imitate artifacts of 3D scanning.
With this pipeline, we created the first large-scale synthetic dataset of 3D
garment models with their sewing patterns. The dataset contains more than
20,000 garment design variations produced from 19 different base types. Seven
of these garment types are specifically designed to evaluate generalization
across sewing pattern topologies.
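The core of the method is the template description structure, so a concrete picture helps. Below is a minimal Python sketch of what a parametric sewing pattern template can look like: panels are flat 2D polygons with optionally curved edges, stitches pair up edges across panels, and named parameters with sampling ranges drive the design variations. The field names are illustrative assumptions, not the dataset's exact JSON schema.

```python
# A minimal, illustrative parametric sewing pattern template.
# NOTE: field names and value conventions are assumptions for
# illustration only; the dataset defines its own JSON schema.
skirt_template = {
    # Sampling a value from each parameter's range yields one
    # design variation of this base garment type.
    "parameters": {
        "length": {"range": [0.4, 0.9]},
        "waist_width": {"range": [0.8, 1.2]},
    },
    "pattern": {
        "panels": {
            "front": {
                # 2D vertices of the flat panel, in metres.
                "vertices": [[0.0, 0.0], [0.35, 0.0], [0.3, 0.6], [0.05, 0.6]],
                # Edges connect vertex indices; an optional control point
                # turns a straight edge into a curve.
                "edges": [
                    {"endpoints": [0, 1]},
                    {"endpoints": [1, 2]},
                    {"endpoints": [2, 3], "curvature": [0.5, 0.1]},
                    {"endpoints": [3, 0]},
                ],
                # Rough placement of the panel around the body for draping.
                "translation": [0.0, 1.0, 0.06],
                "rotation": [0.0, 0.0, 0.0],
            },
            # A real template would also define a "back" panel, sleeves, etc.
        },
        # Stitches pair up panel edges that get sewn together in simulation.
        "stitches": [
            [{"panel": "front", "edge": 1}, {"panel": "front", "edge": 3}],
        ],
    },
}
```

A generation pipeline can then sample "parameters", regenerate the panel geometry, simulate draping on a body model, and export the mesh together with its ground-truth pattern.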
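The scan-artifact corruption step can be pictured in the same spirit. The sketch below jitters vertex positions and drops random faces from a triangle mesh, plausible stand-ins for the surface noise and occlusion holes that 3D scanners produce; it is not the authors' actual corruption procedure.

```python
import numpy as np

def corrupt_mesh(vertices, faces, noise_std=0.002, hole_fraction=0.05,
                 rng=None):
    """Imitate 3D-scanning artifacts on a triangle mesh (a rough sketch).

    vertices: (N, 3) float array of positions; faces: (M, 3) int array
    of vertex indices. Returns a noisy copy of the mesh with holes.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gaussian jitter on vertex positions imitates scanner surface noise.
    noisy_vertices = vertices + rng.normal(0.0, noise_std, vertices.shape)
    # Dropping a random subset of faces imitates occlusion holes.
    keep = rng.random(len(faces)) >= hole_fraction
    return noisy_vertices, faces[keep]

# Example: corrupt a unit quad made of two triangles.
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
tris = np.array([[0, 1, 2], [0, 2, 3]])
noisy_verts, holey_tris = corrupt_mesh(verts, tris, hole_fraction=0.5)
```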
Related papers
- FabricDiffusion: High-Fidelity Texture Transfer for 3D Garments Generation from In-The-Wild Clothing Images [56.63824638417697]
FabricDiffusion is a method for transferring fabric textures from a single clothing image to 3D garments of arbitrary shapes.
We show that FabricDiffusion can transfer various features from a single clothing image including texture patterns, material properties, and detailed prints and logos.
arXiv Detail & Related papers (2024-10-02T17:57:12Z)
- GarmentCodeData: A Dataset of 3D Made-to-Measure Garments With Sewing Patterns [18.513707884523072]
We present the first large-scale synthetic dataset of 3D made-to-measure garments with sewing patterns.
GarmentCodeData contains 115,000 data points that cover a variety of designs in many common garment categories.
We propose an automatic, open-source 3D garment draping pipeline based on a fast XPBD simulator.
arXiv Detail & Related papers (2024-05-27T19:14:46Z)
- Garment3DGen: 3D Garment Stylization and Texture Generation [11.836357439129301]
Garment3DGen is a new method to synthesize 3D garment assets from a base mesh given a single input image as guidance.
We leverage the recent progress of image-to-3D diffusion methods to generate 3D garment geometries.
We generate high-fidelity texture maps that are globally and locally consistent and faithfully capture the input guidance.
arXiv Detail & Related papers (2024-03-27T17:59:33Z)
- Towards Garment Sewing Pattern Reconstruction from a Single Image [76.97825595711444]
Garment sewing pattern represents the intrinsic rest shape of a garment, and is the core for many applications like fashion design, virtual try-on, and digital avatars.
We first synthesize a versatile dataset, named SewFactory, which consists of around 1M images and ground-truth sewing patterns.
We then propose a two-level Transformer network called Sewformer, which significantly improves the sewing pattern prediction performance.
arXiv Detail & Related papers (2023-11-07T18:59:51Z)
- ISP: Multi-Layered Garment Draping with Implicit Sewing Patterns [57.176642106425895]
We introduce a garment representation model that addresses limitations of current approaches.
It is faster and yields higher quality reconstructions than purely implicit surface representations.
It supports rapid editing of garment shapes and texture by modifying individual 2D panels.
arXiv Detail & Related papers (2023-05-23T14:23:48Z)
- PersonalTailor: Personalizing 2D Pattern Design from 3D Garment Point Clouds [59.617014796845865]
Garment pattern design aims to convert a 3D garment to the corresponding 2D panels and their sewing structure.
PersonalTailor is a personalized 2D pattern design method, where the user can input specific constraints or demands.
It first learns multi-modal panel embeddings based on unsupervised cross-modal association and attentive fusion.
It then predicts binary panel masks individually using a transformer encoder-decoder framework.
arXiv Detail & Related papers (2023-03-17T00:03:38Z)
- Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z)
- NeuralTailor: Reconstructing Sewing Pattern Structures from 3D Point Clouds of Garments [7.331799534004012]
We propose to use the garment sewing pattern to facilitate intrinsic garment shape estimation.
We introduce NeuralTailor, a novel architecture based on point-level attention for set regression with variable cardinality.
Our experiments show that NeuralTailor successfully reconstructs sewing patterns and generalizes to garment types with pattern topologies unseen during training (a schematic sketch of this set-regression idea follows the list).
arXiv Detail & Related papers (2022-01-31T08:33:49Z)
- Robust 3D Garment Digitization from Monocular 2D Images for 3D Virtual Try-On Systems [1.7394606468019056]
We develop a robust 3D garment digitization solution that can generalize well on real-world fashion catalog images.
To train the supervised deep networks for landmark prediction & texture inpainting tasks, we generated a large set of synthetic data.
For fine-tuning, we manually annotated a small set of fashion catalog images crawled from online fashion e-commerce platforms.
arXiv Detail & Related papers (2021-11-30T05:49:23Z)
- Deep Fashion3D: A Dataset and Benchmark for 3D Garment Reconstruction from Single Images [50.34202789543989]
Deep Fashion3D is the largest collection to date of 3D garment models.
It provides rich annotations including 3D feature lines, 3D body pose, and the corresponding multi-view real images.
A novel adaptable template is proposed to enable the learning of all types of clothing in a single network.
arXiv Detail & Related papers (2020-03-28T09:20:04Z)
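As promised in the NeuralTailor entry above, here is a rough sketch of "point-level attention for set regression with variable cardinality": a fixed budget of learned queries cross-attends over per-point features, each query yields one candidate panel latent plus an existence score, and thresholding the scores gives an output set of variable size. This is a generic DETR-style illustration under assumed shapes, not NeuralTailor's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def set_regression_head(point_feats, queries, W_q, W_k, W_v, w_exist):
    """Cross-attention from K learned queries over N point features.

    point_feats: (N, d) per-point features from a garment point cloud.
    queries:     (K, d) learned query vectors, one per candidate panel.
    Returns (K, d) panel latents and (K,) existence logits; thresholding
    the logits yields an output set of variable cardinality.
    """
    d = queries.shape[-1]
    Q = queries @ W_q                               # (K, d) query projections
    K_ = point_feats @ W_k                          # (N, d) key projections
    V = point_feats @ W_v                           # (N, d) value projections
    attn = softmax(Q @ K_.T / np.sqrt(d), axis=-1)  # (K, N) attention weights
    latents = attn @ V                              # (K, d) panel latents
    exist_logits = latents @ w_exist                # (K,) presence scores
    return latents, exist_logits

# Toy usage with random (untrained) weights.
rng = np.random.default_rng(0)
d, n_points, n_queries = 32, 500, 8
feats = rng.normal(size=(n_points, d))
latents, logits = set_regression_head(
    feats, rng.normal(size=(n_queries, d)),
    *(rng.normal(size=(d, d)) for _ in range(3)), rng.normal(size=d))
predicted_panels = latents[logits > 0.0]  # variable-size set of panels
```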