GarmageNet: A Dataset and Scalable Representation for Generic Garment Modeling
- URL: http://arxiv.org/abs/2504.01483v1
- Date: Wed, 02 Apr 2025 08:37:32 GMT
- Title: GarmageNet: A Dataset and Scalable Representation for Generic Garment Modeling
- Authors: Siran Li, Ruiyang Liu, Chen Liu, Zhendong Wang, Gaofeng He, Yong-Lu Li, Xiaogang Jin, Huamin Wang
- Abstract summary: Garmage is a neural-network-and-CG-friendly representation for complex multi-layered garments. GarmageNet produces detailed garments with body-conforming initial geometries and intricate sewing patterns. We release an industrial-standard, large-scale, high-fidelity garment dataset.
- Score: 31.086617193645022
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: High-fidelity garment modeling remains challenging due to the lack of large-scale, high-quality datasets and efficient representations capable of handling non-watertight, multi-layer geometries. In this work, we introduce Garmage, a neural-network-and-CG-friendly garment representation that seamlessly encodes the accurate geometry and sewing pattern of complex multi-layered garments as a structured set of per-panel geometry images. As a dual-2D-3D representation, Garmage achieves an unprecedented integration of 2D image-based algorithms with 3D modeling workflows, enabling high-fidelity, non-watertight, multi-layered garment geometries with direct compatibility for industrial-grade simulations. Built upon this representation, we present GarmageNet, a novel generation framework capable of producing detailed multi-layered garments with body-conforming initial geometries and intricate sewing patterns, based on user prompts or existing in-the-wild sewing patterns. Furthermore, we introduce a robust stitching algorithm that recovers per-vertex stitches, ensuring seamless integration into flexible simulation pipelines for downstream editing of sewing patterns, material properties, and dynamic simulations. Finally, we release an industrial-standard, large-scale, high-fidelity garment dataset featuring detailed annotations, vertex-wise correspondences, and a robust pipeline for converting unstructured production sewing patterns into GarmageNet-standard structural assets, paving the way for large-scale, industrial-grade garment generation systems.
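To make the per-panel geometry-image idea concrete, the sketch below decodes one panel into a triangle mesh, assuming a panel is stored as an H×W×3 array of 3D positions with a boolean mask of valid pixels. The array layout, mask convention, and function name are illustrative assumptions for this sketch, not the paper's actual data format or API.

```python
import numpy as np

def garmage_panel_to_mesh(positions: np.ndarray, mask: np.ndarray):
    """Decode a single per-panel geometry image into a triangle mesh.

    Assumed (illustrative) layout:
      positions: (H, W, 3) float array of 3D surface points, one per pixel
      mask:      (H, W) bool array marking pixels that lie inside the panel
    Returns (vertices, faces): vertices (N, 3) floats, faces (M, 3) vertex indices.
    """
    H, W, _ = positions.shape
    # Assign a vertex index to every valid pixel; -1 marks pixels outside the panel.
    index = -np.ones((H, W), dtype=np.int64)
    index[mask] = np.arange(int(mask.sum()))
    vertices = positions[mask]

    faces = []
    for y in range(H - 1):
        for x in range(W - 1):
            a, b = index[y, x], index[y, x + 1]
            c, d = index[y + 1, x], index[y + 1, x + 1]
            # Triangulate a 2x2 pixel quad only when all four corners are valid,
            # so the panel boundary stays open (non-watertight).
            if min(a, b, c, d) >= 0:
                faces.append((a, b, d))
                faces.append((a, d, c))
    return vertices, np.asarray(faces, dtype=np.int64)

if __name__ == "__main__":
    # Toy example: a flat 4x4 panel with one corner pixel masked out.
    u, v = np.meshgrid(np.linspace(0.0, 1.0, 4), np.linspace(0.0, 1.0, 4))
    positions = np.dstack([u, v, np.zeros_like(u)])
    mask = np.ones((4, 4), dtype=bool)
    mask[0, 0] = False
    verts, faces = garmage_panel_to_mesh(positions, mask)
    print(verts.shape, faces.shape)  # (15, 3) (16, 3)
```

Because only fully valid pixel quads are triangulated, panel boundaries remain open, consistent with the non-watertight geometries the abstract emphasizes.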
Related papers
- GarmentX: Autoregressive Parametric Representations for High-Fidelity 3D Garment Generation [15.345904761472106]
GarmentX is a novel framework for generating diverse, high-fidelity, and wearable 3D garments from a single input image.
We introduce GarmentX dataset, a large-scale dataset of 378,682 garment parameter-image pairs.
arXiv Detail & Related papers (2025-04-29T04:15:33Z)
- DiffusedWrinkles: A Diffusion-Based Model for Data-Driven Garment Animation [10.9550231281676]
We present a data-driven method for learning to generate animations of 3D garments using a 2D image diffusion model.
Our approach is able to synthesize high-quality 3D animations for a wide variety of garments and body shapes.
arXiv Detail & Related papers (2025-03-24T06:08:26Z)
- SuperCarver: Texture-Consistent 3D Geometry Super-Resolution for High-Fidelity Surface Detail Generation [70.76810765911499]
SuperCarver is a 3D geometry framework specifically tailored for adding texture-consistent surface details to given coarse meshes.
To achieve geometric detail generation, we develop a deterministic prior-guided normal diffusion model fine-tuned on a dataset of paired low-poly and high-poly normal renderings.
To optimize mesh structures from potentially imperfect normal map predictions, we design a simple yet effective noise-resistant inverse rendering scheme.
arXiv Detail & Related papers (2025-03-12T14:38:45Z)
- Multimodal Latent Diffusion Model for Complex Sewing Pattern Generation [52.13927859375693]
We propose SewingLDM, a multi-modal generative model that generates sewing patterns controlled by text prompts, body shapes, and garment sketches.
To learn the sewing pattern distribution in the latent space, we design a two-step training strategy.
Comprehensive qualitative and quantitative experiments show the effectiveness of our proposed method.
arXiv Detail & Related papers (2024-12-19T02:05:28Z)
- FabricDiffusion: High-Fidelity Texture Transfer for 3D Garments Generation from In-The-Wild Clothing Images [56.63824638417697]
FabricDiffusion is a method for transferring fabric textures from a single clothing image to 3D garments of arbitrary shapes.
We show that FabricDiffusion can transfer various features from a single clothing image including texture patterns, material properties, and detailed prints and logos.
arXiv Detail & Related papers (2024-10-02T17:57:12Z)
- Gaussian Garments: Reconstructing Simulation-Ready Clothing with Photorealistic Appearance from Multi-View Video [66.98046635045685]
We introduce a novel approach for reconstructing realistic simulation-ready garment assets from multi-view videos.
Our method represents garments with a combination of a 3D mesh and a Gaussian texture that encodes both the color and high-frequency surface details.
This representation enables accurate registration of garment geometries to multi-view videos and helps disentangle albedo textures from lighting effects.
arXiv Detail & Related papers (2024-09-12T16:26:47Z)
- GarmentCodeData: A Dataset of 3D Made-to-Measure Garments With Sewing Patterns [18.513707884523072]
We present the first large-scale synthetic dataset of 3D made-to-measure garments with sewing patterns.
GarmentCodeData contains 115,000 data points that cover a variety of designs in many common garment categories.
We propose an automatic, open-source 3D garment draping pipeline based on a fast XPBD simulator.
arXiv Detail & Related papers (2024-05-27T19:14:46Z)
- Ghost on the Shell: An Expressive Representation of General 3D Shapes [97.76840585617907]
Meshes are appealing since they enable fast physics-based rendering with realistic material and lighting.
Recent work on reconstructing and statistically modeling 3D shapes has critiqued meshes as being topologically inflexible.
We parameterize open surfaces by defining a manifold signed distance field on watertight surfaces.
G-Shell achieves state-of-the-art performance on non-watertight mesh reconstruction and generation tasks.
arXiv Detail & Related papers (2023-10-23T17:59:52Z)
- Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z)
- Generating Datasets of 3D Garments with Sewing Patterns [10.729374293332281]
We create the first large-scale synthetic dataset of 3D garment models with their sewing patterns.
The dataset contains more than 20,000 garment design variations produced from 19 different base types.
arXiv Detail & Related papers (2021-09-12T23:03:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.