DeepIron: Predicting Unwarped Garment Texture from a Single Image
- URL: http://arxiv.org/abs/2310.15447v2
- Date: Thu, 26 Oct 2023 05:07:55 GMT
- Title: DeepIron: Predicting Unwarped Garment Texture from a Single Image
- Authors: Hyun-Song Kwon, Sung-Hee Lee
- Abstract summary: This paper presents a novel framework that reconstructs the texture map for 3D garments from a single image of a posed user.
A key component of our framework, the Texture Unwarper, infers the original texture image from the input clothing image.
By inferring the unwarped original texture of the input garment, our method helps reconstruct 3D garment models that display high-quality textures realistically deformed for new poses.
- Score: 9.427635404752934
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Realistic reconstruction of 3D clothing from an image has wide applications, such as avatar creation and virtual try-on. This paper presents a novel framework that reconstructs the texture map for 3D garments from a single image of a posed user. Assuming that 3D garments are modeled by stitching 2D garment sewing patterns, our specific goal is to generate a texture image for the sewing patterns. A key component of our framework, the Texture Unwarper, infers the original texture image from the input clothing image, which exhibits warping and occlusion of texture due to the user's body shape and pose. The Texture Unwarper transforms between the input and output images by mapping the latent spaces of the two. By inferring the unwarped original texture of the input garment, our method helps reconstruct 3D garment models that display high-quality textures realistically deformed for new poses. We validate the effectiveness of our approach through comparisons with other methods and ablation studies.
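The core idea, encoding the posed, warped garment image and mapping its latent code to the latent code of the flat texture before decoding, can be sketched in a few lines. Below is a minimal PyTorch illustration of that latent-space mapping; the module names, layer sizes, and MLP mapper are illustrative assumptions, not the paper's published architecture.

```python
# Minimal sketch of the Texture Unwarper's latent-space mapping idea.
# All layer sizes and names are assumptions for illustration only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Encodes a 256x256 RGB garment image into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # -> 128x128
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # -> 64x64
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # -> 32x32
            nn.Flatten(),
            nn.Linear(128 * 32 * 32, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Decodes a latent vector back into a 256x256 texture image."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 32 * 32)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 32, 32)
        return self.net(h)

class TextureUnwarper(nn.Module):
    """Maps the latent of a warped garment image to the latent of its
    unwarped texture, then decodes it in texture space."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.encode_warped = Encoder(latent_dim)
        self.map_latent = nn.Sequential(  # cross-domain latent mapping
            nn.Linear(latent_dim, latent_dim), nn.ReLU(),
            nn.Linear(latent_dim, latent_dim),
        )
        self.decode_texture = Decoder(latent_dim)

    def forward(self, warped_image):
        z_warped = self.encode_warped(warped_image)
        z_texture = self.map_latent(z_warped)
        return self.decode_texture(z_texture)

# Usage: predict an unwarped texture from a single posed garment image.
model = TextureUnwarper()
warped = torch.rand(1, 3, 256, 256)  # stand-in for a garment crop
texture = model(warped)              # (1, 3, 256, 256) texture map
```

In the paper's setting the decoded texture would live in the sewing pattern's UV space; the sketch keeps both domains at the same 256x256 image resolution for simplicity.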
Related papers
- FabricDiffusion: High-Fidelity Texture Transfer for 3D Garments Generation from In-The-Wild Clothing Images [56.63824638417697]
FabricDiffusion is a method for transferring fabric textures from a single clothing image to 3D garments of arbitrary shapes.
We show that FabricDiffusion can transfer various features from a single clothing image including texture patterns, material properties, and detailed prints and logos.
arXiv Detail & Related papers (2024-10-02T17:57:12Z)
- TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion [64.49276500129092]
TextureDreamer is an image-guided texture synthesis method.
It can transfer relightable textures from a small number of input images to target 3D shapes across arbitrary categories.
arXiv Detail & Related papers (2024-01-17T18:55:49Z)
- SPnet: Estimating Garment Sewing Patterns from a Single Image [10.604555099281173]
This paper presents a novel method for reconstructing 3D garment models from a single image of a posed user.
By inferring the fundamental shape of the garment through sewing patterns from a single image, we can generate 3D garments that can adaptively deform to arbitrary poses.
arXiv Detail & Related papers (2023-12-26T09:51:25Z)
- Generating Animatable 3D Cartoon Faces from Single Portraits [51.15618892675337]
We present a novel framework to generate animatable 3D cartoon faces from a single portrait image.
We propose a two-stage reconstruction method to recover the 3D cartoon face with detailed texture.
Finally, we propose a semantic-preserving face rigging method based on manually created templates and deformation transfer.
arXiv Detail & Related papers (2023-07-04T04:12:50Z)
- TEXTure: Text-Guided Texturing of 3D Shapes [71.13116133846084]
We present TEXTure, a novel method for text-guided generation, editing, and transfer of textures for 3D shapes.
We define a trimap partitioning process that generates seamless 3D textures without requiring explicit surface textures.
arXiv Detail & Related papers (2023-02-03T13:18:45Z)
- Structure-Preserving 3D Garment Modeling with Neural Sewing Machines [190.70647799442565]
We propose a novel Neural Sewing Machine (NSM), a learning-based framework for structure-preserving 3D garment modeling.
NSM is capable of representing 3D garments under diverse garment shapes and topologies, realistically reconstructing 3D garments from 2D images with the preserved structure, and accurately manipulating the 3D garment categories, shapes, and topologies.
arXiv Detail & Related papers (2022-11-12T16:43:29Z)
- Robust 3D Garment Digitization from Monocular 2D Images for 3D Virtual Try-On Systems [1.7394606468019056]
We develop a robust 3D garment digitization solution that can generalize well on real-world fashion catalog images.
To train the supervised deep networks for landmark prediction & texture inpainting tasks, we generated a large set of synthetic data.
We manually annotated a small set of fashion catalog images crawled from online fashion e-commerce platforms to fine-tune the networks.
arXiv Detail & Related papers (2021-11-30T05:49:23Z)
- Learning to Transfer Texture from Clothing Images to 3D Humans [50.838970996234465]
We present a method to automatically transfer textures of clothing images to 3D garments worn on top of SMPL, in real time.
We first compute training pairs of images with aligned 3D garments using a custom non-rigid 3D to 2D registration method, which is accurate but slow.
Our model opens the door to applications such as virtual try-on, and enables the generation of 3D humans with varied textures, which is necessary for learning.
arXiv Detail & Related papers (2020-03-04T12:53:58Z)