Neural Garment Dynamics via Manifold-Aware Transformers
- URL: http://arxiv.org/abs/2407.06101v1
- Date: Mon, 13 May 2024 11:05:52 GMT
- Title: Neural Garment Dynamics via Manifold-Aware Transformers
- Authors: Peizhuo Li, Tuanfeng Y. Wang, Timur Levent Kesdogan, Duygu Ceylan, Olga Sorkine-Hornung
- Abstract summary: We take a different approach and model the dynamics of a garment by exploiting its local interactions with the underlying human body.
Specifically, as the body moves, we detect local garment-body collisions, which drive the deformation of the garment.
At the core of our approach is a mesh-agnostic garment representation and a manifold-aware transformer network design.
- Score: 26.01911475040001
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Data-driven and learning-based solutions for modeling dynamic garments have significantly advanced, especially in the context of digital humans. However, existing approaches often focus on modeling garments with respect to a fixed parametric human body model and are limited to garment geometries that were seen during training. In this work, we take a different approach and model the dynamics of a garment by exploiting its local interactions with the underlying human body. Specifically, as the body moves, we detect local garment-body collisions, which drive the deformation of the garment. At the core of our approach is a mesh-agnostic garment representation and a manifold-aware transformer network design, which together enable our method to generalize to unseen garment and body geometries. We evaluate our approach on a wide variety of garment types and motion sequences and provide competitive qualitative and quantitative results with respect to the state of the art.
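To make the collision-driven idea concrete, below is a minimal sketch of how per-vertex garment-body collision signals might be computed. This is an illustrative assumption, not the authors' pipeline: it approximates the body by its vertex cloud and uses a nearest-neighbor signed-offset test; the function name `collision_features`, the `eps` contact threshold, and the feature layout are hypothetical, and in the paper such collision signals feed a manifold-aware transformer rather than being used directly.

```python
import numpy as np
from scipy.spatial import cKDTree

def collision_features(garment_verts, body_verts, body_normals, eps=5e-3):
    """Per-garment-vertex collision features against a body point cloud.

    For each garment vertex, find the nearest body vertex and project the
    offset onto that vertex's outward normal; a projection below eps marks
    the garment vertex as colliding (inside, or nearly touching, the body).
    """
    tree = cKDTree(body_verts)
    dist, idx = tree.query(garment_verts)      # nearest body vertex per garment vertex
    offset = garment_verts - body_verts[idx]   # vector from body point to garment vertex
    signed = np.einsum("ij,ij->i", offset, body_normals[idx])
    colliding = signed < eps
    # Feature per vertex: [unsigned distance, signed offset, collision flag].
    return np.stack([dist, signed, colliding.astype(np.float64)], axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    body = rng.normal(size=(500, 3))
    body /= np.linalg.norm(body, axis=1, keepdims=True)   # unit-sphere "body"
    normals = body.copy()                                 # outward sphere normals
    garment = rng.normal(size=(200, 3))
    garment = 1.05 * garment / np.linalg.norm(garment, axis=1, keepdims=True)
    garment[:10] *= 0.85                                  # push a few vertices inside
    feats = collision_features(garment, body, normals)
    print("colliding vertices:", int(feats[:, 2].sum()))
```

In practice one would test against body triangles rather than vertices, but features of this kind (where the contact is and how deep it goes) are the sort of local signal that can drive a learned deformation.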
Related papers
- HUMOS: Human Motion Model Conditioned on Body Shape [54.20419874234214]
We introduce a new approach to developing a generative motion model conditioned on body shape.
We show that it is possible to train this model using unpaired data.
The resulting model generates diverse, physically plausible, and dynamically stable human motions.
arXiv Detail & Related papers (2024-09-05T23:50:57Z)
- PICA: Physics-Integrated Clothed Avatar [30.277983921620663]
We introduce PICA, a novel representation for high-fidelity animatable clothed human avatars with physics-accurate dynamics, even for loose clothing.
Our method achieves high-fidelity rendering of human bodies in complex and novel driving poses, significantly outperforming previous methods under the same settings.
arXiv Detail & Related papers (2024-07-07T10:23:21Z)
- Neural-ABC: Neural Parametric Models for Articulated Body with Clothes [29.04941764336255]
We introduce Neural-ABC, a novel model that can represent clothed human bodies with disentangled latent spaces for identity, clothing, shape, and pose.
Our model excels at disentangling clothing and identity across different shapes and poses while preserving the style of the clothing.
Compared to other state-of-the-art parametric models, Neural-ABC demonstrates powerful advantages in the reconstruction of clothed human bodies.
arXiv Detail & Related papers (2024-04-06T16:29:10Z)
- AniDress: Animatable Loose-Dressed Avatar from Sparse Views Using Garment Rigging Model [58.035758145894846]
We introduce AniDress, a novel method for generating animatable human avatars in loose clothes using very sparse multi-view videos.
A pose-driven deformable neural radiance field conditioned on both body and garment motions is introduced, providing explicit control of both parts.
Our method renders natural garment dynamics that deviate significantly from the body and generalizes well to both unseen views and poses.
arXiv Detail & Related papers (2024-01-27T08:48:18Z)
- Towards Loose-Fitting Garment Animation via Generative Model of Deformation Decomposition [4.627632792164547]
We develop a garment generative model based on deformation decomposition to efficiently simulate loose garment deformation without using linear skinning.
We demonstrate our method outperforms state-of-the-art data-driven alternatives through extensive experiments and show qualitative and quantitative analysis of results.
arXiv Detail & Related papers (2023-12-22T11:26:51Z)
- Garment Recovery with Shape and Deformation Priors [51.41962835642731]
We propose a method that delivers realistic garment models from real-world images, regardless of garment shape or deformation.
Not only does our approach recover the garment geometry accurately, it also yields models that can be directly used by downstream applications such as animation and simulation.
arXiv Detail & Related papers (2023-11-17T07:06:21Z)
- SwinGar: Spectrum-Inspired Neural Dynamic Deformation for Free-Swinging Garments [6.821050909555717]
We present a spectrum-inspired learning-based approach for generating clothing deformations with dynamic effects and personalized details.
Our method overcomes the limitations of existing approaches by providing a unified framework that predicts dynamic behavior across different garments.
We develop a dynamic clothing deformation estimator that integrates frequency-controllable attention mechanisms with long short-term memory.
arXiv Detail & Related papers (2023-08-05T09:09:50Z)
- HOOD: Hierarchical Graphs for Generalized Modelling of Clothing Dynamics [84.29846699151288]
Our method is agnostic to body shape and applies to tight-fitting garments as well as loose, free-flowing clothing.
As one key contribution, we propose a hierarchical message-passing scheme that efficiently propagates stiff stretching modes.
arXiv Detail & Related papers (2022-12-14T14:24:00Z)
- DIG: Draping Implicit Garment over the Human Body [56.68349332089129]
We propose an end-to-end differentiable pipeline that represents garments using implicit surfaces and learns a skinning field conditioned on shape and pose parameters of an articulated body model.
We show that our method, thanks to its end-to-end differentiability, makes it possible to recover body and garment parameters jointly from image observations.
arXiv Detail & Related papers (2022-09-22T08:13:59Z)
- Deep Physics-aware Inference of Cloth Deformation for Monocular Human Performance Capture [84.73946704272113]
We show how integrating physics into the training process improves the learned cloth deformations and allows modeling clothing as a separate piece of geometry.
Our approach leads to a significant improvement over current state-of-the-art methods and is thus a clear step towards realistic monocular capture of the entire deforming surface of a clothed human; a toy sketch of a physics-aware training loss appears after this entry.
arXiv Detail & Related papers (2020-11-25T16:46:00Z)
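The physics-aware training idea above can be illustrated with a toy loss. The sketch below is an assumption for illustration, not the paper's formulation: it regularizes an L2 vertex data term with a mass-spring energy over garment mesh edges, penalizing predictions whose edge lengths deviate from their rest lengths; the name `physics_aware_loss` and the weights `k` and `w` are hypothetical.

```python
import numpy as np

def physics_aware_loss(pred_verts, target_verts, edges, rest_lengths, k=1.0, w=0.1):
    """L2 data term plus a mass-spring edge energy (illustrative only)."""
    # Data term: mean squared vertex error against the target mesh.
    data = np.mean(np.sum((pred_verts - target_verts) ** 2, axis=1))
    # Physics term: penalize edge lengths that deviate from their rest lengths.
    d = pred_verts[edges[:, 0]] - pred_verts[edges[:, 1]]
    lengths = np.linalg.norm(d, axis=1)
    spring = np.mean(0.5 * k * (lengths - rest_lengths) ** 2)
    return data + w * spring

# Toy usage: one stretched edge raises the loss above the pure data term.
verts = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
target = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
edges = np.array([[0, 1]])
rest = np.array([1.0])
print(physics_aware_loss(verts, target, edges, rest))
```

A term like this keeps a network's outputs closer to physically plausible configurations even where ground-truth supervision is sparse.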