GarmageNet: A Multimodal Generative Framework for Sewing Pattern Design and Generic Garment Modeling
- URL: http://arxiv.org/abs/2504.01483v3
- Date: Mon, 09 Jun 2025 11:06:19 GMT
- Title: GarmageNet: A Multimodal Generative Framework for Sewing Pattern Design and Generic Garment Modeling
- Authors: Siran Li, Chen Liu, Ruiyang Liu, Zhendong Wang, Gaofeng He, Yong-Lu Li, Xiaogang Jin, Huamin Wang
- Abstract summary: GarmageNet is a generative framework that automates the creation of 2D sewing patterns. Garmage is a novel garment representation that encodes each panel as a structured geometry image. GarmageSet is a large-scale dataset comprising over 10,000 professionally designed garments.
- Score: 31.086617193645022
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Realistic digital garment modeling remains a labor-intensive task due to the intricate process of translating 2D sewing patterns into high-fidelity, simulation-ready 3D garments. We introduce GarmageNet, a unified generative framework that automates the creation of 2D sewing patterns, the construction of sewing relationships, and the synthesis of 3D garment initializations compatible with physics-based simulation. Central to our approach is Garmage, a novel garment representation that encodes each panel as a structured geometry image, effectively bridging the semantic and geometric gap between 2D structural patterns and 3D garment shapes. GarmageNet employs a latent diffusion transformer to synthesize panel-wise geometry images and integrates GarmageJigsaw, a neural module for predicting point-to-point sewing connections along panel contours. To support training and evaluation, we build GarmageSet, a large-scale dataset comprising over 10,000 professionally designed garments with detailed structural and style annotations. Our method demonstrates versatility and efficacy across multiple application scenarios, including scalable garment generation from multi-modal design concepts (text prompts, sketches, photographs), automatic modeling from raw flat sewing patterns, pattern recovery from unstructured point clouds, and progressive garment editing using conventional instructions-laying the foundation for fully automated, production-ready pipelines in digital fashion. Project page: https://style3d.github.io/garmagenet.
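The core of the Garmage representation is the geometry-image idea: each panel's 3D surface is resampled over a regular 2D grid so that pixel neighborhoods correspond to surface neighborhoods. A minimal sketch of that sampling, using a hypothetical `cylindrical_drape` function as a stand-in for a physically draped panel (the actual Garmage encoding and resolution are not specified here):

```python
import numpy as np

def panel_geometry_image(deform, resolution=32):
    """Sample a panel's 3D surface over a regular UV grid.

    `deform` maps normalized (u, v) in [0, 1]^2 to a 3D point on the
    draped panel; the result is an (H, W, 3) "geometry image" in which
    neighboring pixels are neighboring points on the surface.
    """
    u, v = np.meshgrid(
        np.linspace(0.0, 1.0, resolution),
        np.linspace(0.0, 1.0, resolution),
        indexing="ij",
    )
    return deform(u, v)  # shape (resolution, resolution, 3)

# Toy deformation (an assumption for illustration): a flat rectangular
# panel bent into a cylindrical arc, standing in for a draped panel.
def cylindrical_drape(u, v, radius=0.3):
    theta = (u - 0.5) * np.pi
    return np.stack(
        [radius * np.sin(theta), v, radius * (1.0 - np.cos(theta))],
        axis=-1,
    )

gimg = panel_geometry_image(cylindrical_drape, resolution=32)
```

Because the image is structured, a latent diffusion model can synthesize such panels with ordinary 2D convolutional or transformer backbones, which is what makes the representation attractive.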
Related papers
- GarmentDiffusion: 3D Garment Sewing Pattern Generation with Multimodal Diffusion Transformers [9.228577662928673]
GarmentDiffusion is a new generative model capable of producing centimeter-precise, vectorized 3D sewing patterns from multimodal inputs. Our method efficiently encodes 3D sewing pattern parameters into compact edge token representations. With all design combinations of our model, sewing pattern generation is accelerated by 100 times compared to SewingGPT.
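The edge-token idea can be sketched as follows. The 6-dimensional layout used here (start point, end point, quadratic Bezier control offset) is an illustrative assumption, not the paper's actual parameterization:

```python
import numpy as np

def edges_to_tokens(panels):
    """Flatten a sewing pattern into a sequence of per-edge tokens.

    Each panel is a list of edges; each edge is encoded as a 6-dim
    token: 2D start point, 2D end point, and a 2D Bezier control
    offset (zero for straight edges). A diffusion transformer can
    then operate on this fixed-width token sequence.
    """
    tokens = []
    for panel in panels:
        for start, end, ctrl in panel:
            tokens.append(np.concatenate([start, end, ctrl]))
    return np.stack(tokens)  # shape (num_edges, 6)

# A single rectangular panel: four straight edges.
square = [
    (np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.zeros(2)),
    (np.array([1.0, 0.0]), np.array([1.0, 1.0]), np.zeros(2)),
    (np.array([1.0, 1.0]), np.array([0.0, 1.0]), np.zeros(2)),
    (np.array([0.0, 1.0]), np.array([0.0, 0.0]), np.zeros(2)),
]
tok = edges_to_tokens([square])
```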
arXiv Detail & Related papers (2025-04-30T09:56:59Z) - GarmentX: Autoregressive Parametric Representations for High-Fidelity 3D Garment Generation [15.345904761472106]
GarmentX is a novel framework for generating diverse, high-fidelity, and wearable 3D garments from a single input image.
We introduce GarmentX dataset, a large-scale dataset of 378,682 garment parameter-image pairs.
arXiv Detail & Related papers (2025-04-29T04:15:33Z) - DiffusedWrinkles: A Diffusion-Based Model for Data-Driven Garment Animation [10.9550231281676]
We present a data-driven method for learning to generate animations of 3D garments using a 2D image diffusion model. Our approach is able to synthesize high-quality 3D animations for a wide variety of garments and body shapes.
arXiv Detail & Related papers (2025-03-24T06:08:26Z) - SuperCarver: Texture-Consistent 3D Geometry Super-Resolution for High-Fidelity Surface Detail Generation [70.76810765911499]
SuperCarver is a 3D geometry framework specifically tailored for adding texture-consistent surface details to given coarse meshes. To achieve geometric detail generation, we develop a deterministic prior-guided normal diffusion model fine-tuned on a dataset of paired low-poly and high-poly normal renderings. To optimize mesh structures from potentially imperfect normal map predictions, we design a simple yet effective noise-resistant inverse rendering scheme.
arXiv Detail & Related papers (2025-03-12T14:38:45Z) - ChatGarment: Garment Estimation, Generation and Editing via Large Language Models [79.46056192947924]
ChatGarment is a novel approach that leverages large vision-language models (VLMs) to automate the estimation, generation, and editing of 3D garments. It can estimate sewing patterns from in-the-wild images or sketches, generate them from text descriptions, and edit garments based on user instructions.
arXiv Detail & Related papers (2024-12-23T18:59:28Z) - Multimodal Latent Diffusion Model for Complex Sewing Pattern Generation [52.13927859375693]
We propose SewingLDM, a multi-modal generative model that generates sewing patterns controlled by text prompts, body shapes, and garment sketches. To learn the sewing pattern distribution in the latent space, we design a two-step training strategy. Comprehensive qualitative and quantitative experiments show the effectiveness of our proposed method.
arXiv Detail & Related papers (2024-12-19T02:05:28Z) - Design2GarmentCode: Turning Design Concepts to Tangible Garments Through Program Synthesis [27.1965932507935]
We propose a novel sewing pattern generation approach based on Large Multimodal Models (LMMs). LMMs offer an intuitive interface for interpreting diverse design inputs, while pattern-making programs can serve as well-structured and semantically meaningful representations of sewing patterns. Our method can flexibly handle various complex design expressions such as images, textual descriptions, designer sketches, or their combinations, and convert them into size-precise sewing patterns with correct stitches.
arXiv Detail & Related papers (2024-12-11T18:26:45Z) - FabricDiffusion: High-Fidelity Texture Transfer for 3D Garments Generation from In-The-Wild Clothing Images [56.63824638417697]
FabricDiffusion is a method for transferring fabric textures from a single clothing image to 3D garments of arbitrary shapes.
We show that FabricDiffusion can transfer various features from a single clothing image including texture patterns, material properties, and detailed prints and logos.
arXiv Detail & Related papers (2024-10-02T17:57:12Z) - Gaussian Garments: Reconstructing Simulation-Ready Clothing with Photorealistic Appearance from Multi-View Video [66.98046635045685]
We introduce a novel approach for reconstructing realistic simulation-ready garment assets from multi-view videos.
Our method represents garments with a combination of a 3D mesh and a Gaussian texture that encodes both the color and high-frequency surface details.
This representation enables accurate registration of garment geometries to multi-view videos and helps disentangle albedo textures from lighting effects.
arXiv Detail & Related papers (2024-09-12T16:26:47Z) - GarmentCodeData: A Dataset of 3D Made-to-Measure Garments With Sewing Patterns [18.513707884523072]
We present the first large-scale synthetic dataset of 3D made-to-measure garments with sewing patterns.
GarmentCodeData contains 115,000 data points that cover a variety of designs in many common garment categories.
We propose an automatic, open-source 3D garment draping pipeline based on a fast XPBD simulator.
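XPBD (extended position-based dynamics), the simulation method behind such draping pipelines, resolves each constraint with a compliance-weighted Lagrange multiplier update. A minimal single-constraint sketch following the standard XPBD update (the function and variable names are ours, not from the GarmentCodeData pipeline):

```python
import numpy as np

def xpbd_distance_step(x0, x1, w0, w1, rest_len, lam, compliance, dt):
    """One XPBD solver iteration for a single distance constraint.

    Standard XPBD update:
      C(x) = |x0 - x1| - rest_len
      dlam = (-C - alpha_t * lam) / (w0 + w1 + alpha_t)
    where alpha_t = compliance / dt^2 and w0, w1 are inverse masses.
    """
    d = x0 - x1
    dist = np.linalg.norm(d)
    n = d / dist
    c = dist - rest_len
    alpha_t = compliance / (dt * dt)
    dlam = (-c - alpha_t * lam) / (w0 + w1 + alpha_t)
    return x0 + w0 * dlam * n, x1 - w1 * dlam * n, lam + dlam

# Two unit-mass particles stretched to 1.5x rest length; a rigid
# (zero-compliance) constraint restores the rest distance in one step.
p0, p1, lam = np.array([0.0, 0.0, 0.0]), np.array([1.5, 0.0, 0.0]), 0.0
p0, p1, lam = xpbd_distance_step(p0, p1, 1.0, 1.0, 1.0, lam, 0.0, 1 / 60)
```

A cloth simulator applies thousands of such constraints (stretch, shear, bending) per substep; the compliance parameter is what makes XPBD's stiffness independent of iteration count and timestep.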
arXiv Detail & Related papers (2024-05-27T19:14:46Z) - DressCode: Autoregressively Sewing and Generating Garments from Text Guidance [61.48120090970027]
DressCode aims to democratize design for novices and offer immense potential in fashion design, virtual try-on, and digital human creation.
We first introduce SewingGPT, a GPT-based architecture integrating cross-attention with text-conditioned embedding to generate sewing patterns.
We then tailor a pre-trained Stable Diffusion to generate tile-based Physically-based Rendering (PBR) textures for the garments.
arXiv Detail & Related papers (2024-01-29T16:24:21Z) - Towards Garment Sewing Pattern Reconstruction from a Single Image [76.97825595711444]
A garment sewing pattern represents the intrinsic rest shape of a garment and is the core of many applications such as fashion design, virtual try-on, and digital avatars.
We first synthesize a versatile dataset, named SewFactory, which consists of around 1M images and ground-truth sewing patterns.
We then propose a two-level Transformer network called Sewformer, which significantly improves the sewing pattern prediction performance.
arXiv Detail & Related papers (2023-11-07T18:59:51Z) - Ghost on the Shell: An Expressive Representation of General 3D Shapes [97.76840585617907]
Meshes are appealing since they enable fast physics-based rendering with realistic material and lighting.
Recent work on reconstructing and statistically modeling 3D shapes has critiqued meshes as being topologically inflexible.
We parameterize open surfaces by defining a manifold signed distance field on watertight surfaces.
G-Shell achieves state-of-the-art performance on non-watertight mesh reconstruction and generation tasks.
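The open-surface construction can be illustrated crudely: define a scalar field on a watertight mesh and keep only the region where the field is negative, opening a boundary along the zero level set. This discrete face-culling sketch is a simplification of G-Shell's manifold signed distance field, with names and the cutting rule chosen for illustration:

```python
import numpy as np

def cut_open_surface(vertices, faces, field):
    """Carve an open surface out of a watertight mesh.

    `field` is a per-vertex scalar (a crude stand-in for a manifold
    signed distance field); faces whose vertices all lie in the
    negative region are kept, so a boundary opens where the field
    changes sign.
    """
    keep = np.all(field[faces] < 0.0, axis=1)
    return faces[keep]

# Watertight octahedron; cutting with field = z - 0.5 keeps the four
# lower faces, leaving an open surface with a boundary loop.
V = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
              [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
F = np.array([[0, 2, 4], [2, 1, 4], [1, 3, 4], [3, 0, 4],
              [2, 0, 5], [1, 2, 5], [3, 1, 5], [0, 3, 5]])
open_faces = cut_open_surface(V, F, V[:, 2] - 0.5)
```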
arXiv Detail & Related papers (2023-10-23T17:59:52Z) - PersonalTailor: Personalizing 2D Pattern Design from 3D Garment Point Clouds [59.617014796845865]
Garment pattern design aims to convert a 3D garment to the corresponding 2D panels and their sewing structure.
PersonalTailor is a personalized 2D pattern design method, where the user can input specific constraints or demands.
It first learns multi-modal panel embeddings based on unsupervised cross-modal association and attentive fusion.
It then predicts binary panel masks individually using a transformer encoder-decoder framework.
arXiv Detail & Related papers (2023-03-17T00:03:38Z) - Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z) - Generating Datasets of 3D Garments with Sewing Patterns [10.729374293332281]
We create the first large-scale synthetic dataset of 3D garment models with their sewing patterns.
The dataset contains more than 20,000 garment design variations produced from 19 different base types.
arXiv Detail & Related papers (2021-09-12T23:03:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.