A Physics-embedded Deep Learning Framework for Cloth Simulation
- URL: http://arxiv.org/abs/2403.12820v2
- Date: Wed, 27 Mar 2024 07:35:47 GMT
- Title: A Physics-embedded Deep Learning Framework for Cloth Simulation
- Authors: Zhiwei Zhao,
- Abstract summary: This paper proposes a physics-embedded learning framework that directly encodes physical features of cloth simulation.
The framework can also integrate external forces and collision handling through either traditional simulators or sub-neural networks.
- Score: 6.8806198396336935
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Delicate cloth simulations have long been desired in computer graphics. Various methods have been proposed to improve force interactions, collision handling, and numerical integration. Deep learning has the potential to achieve fast and real-time simulation, but common neural network structures often demand many parameters to capture cloth dynamics. This paper proposes a physics-embedded learning framework that directly encodes physical features of cloth simulation. A convolutional neural network is used to represent spatial correlations of the mass-spring system, after which three branches are designed to learn linear, nonlinear, and time-derivative features of cloth physics. The framework can also integrate external forces and collision handling through either traditional simulators or sub-neural networks. The model is tested across different cloth animation cases without retraining on new data. Agreement with baselines and predictive realism validate its generalization ability. The proposed model also outperforms traditional physics simulation in inference efficiency. The framework is designed to integrate easily with other visual refinement techniques such as wrinkle carving, which leaves significant room to incorporate prevailing machine learning techniques in 3D cloth animation.
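The abstract describes the architecture only at a high level; the following is a minimal PyTorch sketch, under our own assumptions rather than the authors' code, of that description: a convolution captures spatial correlations of the mass-spring grid, and three branches model linear, nonlinear, and time-derivative terms. All names (e.g. PhysicsEmbeddedNet, the branch layers, the grid size) are hypothetical.

```python
# Minimal sketch of the described three-branch design (assumed details, not the
# authors' implementation).
import torch
import torch.nn as nn

class PhysicsEmbeddedNet(nn.Module):  # hypothetical name
    def __init__(self, channels=3, hidden=32):
        super().__init__()
        # Spatial correlations between neighboring masses (springs) via convolution.
        self.spatial = nn.Conv2d(channels, hidden, kernel_size=3, padding=1)
        # Linear branch: e.g. a Hooke-like elastic response.
        self.linear_branch = nn.Conv2d(hidden, channels, kernel_size=1)
        # Nonlinear branch: e.g. large-deformation effects.
        self.nonlinear_branch = nn.Sequential(
            nn.Conv2d(hidden, hidden, kernel_size=1), nn.Tanh(),
            nn.Conv2d(hidden, channels, kernel_size=1),
        )
        # Time-derivative branch: e.g. damping driven by vertex velocities.
        self.time_branch = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, pos, vel, dt=1.0 / 60.0):
        # pos, vel: (batch, 3, H, W) vertex positions/velocities on a cloth grid.
        feat = self.spatial(pos)
        accel = (self.linear_branch(feat)
                 + self.nonlinear_branch(feat)
                 + self.time_branch(vel))
        # Explicit (symplectic Euler) update; external forces and collision
        # handling would be added here by a traditional simulator or sub-network.
        new_vel = vel + dt * accel
        new_pos = pos + dt * new_vel
        return new_pos, new_vel

# Usage: one step on a 32x32 cloth grid.
pos = torch.zeros(1, 3, 32, 32)
vel = torch.zeros(1, 3, 32, 32)
pos, vel = PhysicsEmbeddedNet()(pos, vel)
```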
Related papers
- Learned Neural Physics Simulation for Articulated 3D Human Pose Reconstruction [30.51621591645056]
We propose a novel neural network approach, LARP, to model the dynamics of articulated human motion with contact.
Our neural architecture supports features typically found in traditional physics simulators.
To demonstrate the value of LARP, we use it as a drop-in replacement for a state-of-the-art classical non-differentiable simulator in an existing video-based reconstruction framework.
arXiv Detail & Related papers (2024-10-15T19:42:45Z) - Neural Material Adaptor for Visual Grounding of Intrinsic Dynamics [48.99021224773799]
We propose the Neural Material Adaptor (NeuMA), which integrates existing physical laws with learned corrections.
We also propose Particle-GS, a particle-driven 3D Gaussian Splatting variant that bridges simulation and observed images.
arXiv Detail & Related papers (2024-10-10T17:43:36Z) - Latent Intuitive Physics: Learning to Transfer Hidden Physics from A 3D Video [58.043569985784806]
We introduce latent intuitive physics, a transfer learning framework for physics simulation.
It can infer hidden properties of fluids from a single 3D video and simulate the observed fluid in novel scenes.
We validate our model in three ways: (i) novel scene simulation with the learned visual-world physics, (ii) future prediction of the observed fluid dynamics, and (iii) supervised particle simulation.
arXiv Detail & Related papers (2024-06-18T16:37:44Z) - NeuralClothSim: Neural Deformation Fields Meet the Thin Shell Theory [70.10550467873499]
We propose NeuralClothSim, a new quasistatic cloth simulator using thin shells.
Our memory-efficient solver operates on a new continuous coordinate-based surface representation called neural deformation fields.
arXiv Detail & Related papers (2023-08-24T17:59:54Z) - PhysGraph: Physics-Based Integration Using Graph Neural Networks [9.016253794897874]
We focus on the detail enhancement of coarse clothing geometry, which has many applications including computer games, virtual reality, and virtual try-on.
Our contribution is based on a simple observation: evaluating forces is computationally relatively cheap for traditional simulation methods.
We demonstrate that this idea leads to a learnable module that can be trained on basic internal forces of small mesh patches.
arXiv Detail & Related papers (2023-01-27T16:47:10Z) - Neural Cloth Simulation [41.42019559241777]
We present a framework for the garment animation problem through unsupervised deep learning inspired by physically based simulation.
We propose the first methodology able to learn realistic cloth dynamics in an unsupervised manner.
We show it also allows controlling the level of motion in the predictions.
arXiv Detail & Related papers (2022-12-13T16:05:59Z) - Scene Synthesis via Uncertainty-Driven Attribute Synchronization [52.31834816911887]
This paper introduces a novel neural scene synthesis approach that can capture diverse feature patterns of 3D scenes.
Our method combines the strength of both neural network-based and conventional scene synthesis approaches.
arXiv Detail & Related papers (2021-08-30T19:45:07Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Learning Mesh-Based Simulation with Graph Networks [20.29893312074383]
We introduce MeshGraphNets, a framework for learning mesh-based simulations using graph neural networks.
Our results show it can accurately predict the dynamics of a wide range of physical systems, including aerodynamics, structural mechanics, and cloth.
arXiv Detail & Related papers (2020-10-07T13:34:49Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message passing (a minimal sketch of this idea appears after this list).
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
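As a rough illustration of the GNS-style approach summarized above, here is a minimal message-passing step written in plain PyTorch. It is our own simplification under stated assumptions, not the authors' implementation; names such as GNSStep and edge_index are hypothetical. Particle states are node features, and learned edge and node updates predict a per-particle acceleration.

```python
# Hand-rolled sketch of one GNS-style message-passing step (our simplification,
# not the authors' code): particles are graph nodes; learned edge and node MLPs
# map relative positions and velocities to per-particle accelerations.
import torch
import torch.nn as nn

def mlp(inp, out, hidden=64):
    return nn.Sequential(nn.Linear(inp, hidden), nn.ReLU(), nn.Linear(hidden, out))

class GNSStep(nn.Module):  # hypothetical name
    def __init__(self, dim=3, latent=64):
        super().__init__()
        self.edge_mlp = mlp(2 * dim + dim, latent)  # sender/receiver velocity + relative position
        self.node_mlp = mlp(dim + latent, dim)      # velocity + aggregated messages -> acceleration

    def forward(self, pos, vel, edge_index):
        # pos, vel: (N, 3); edge_index: (2, E) with rows (sender, receiver).
        send, recv = edge_index
        edge_feat = torch.cat([vel[send], vel[recv], pos[send] - pos[recv]], dim=-1)
        msg = self.edge_mlp(edge_feat)
        # Sum incoming messages at each receiver node.
        agg = torch.zeros(pos.shape[0], msg.shape[-1]).index_add_(0, recv, msg)
        return self.node_mlp(torch.cat([vel, agg], dim=-1))  # predicted acceleration

# Usage: 4 particles connected in a chain (edges listed in both directions).
pos, vel = torch.rand(4, 3), torch.zeros(4, 3)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
accel = GNSStep()(pos, vel, edge_index)
```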
This list is automatically generated from the titles and abstracts of the papers on this site.