Fine-grained differentiable physics: a yarn-level model for fabrics
- URL: http://arxiv.org/abs/2202.00504v1
- Date: Tue, 1 Feb 2022 16:01:01 GMT
- Title: Fine-grained differentiable physics: a yarn-level model for fabrics
- Authors: Deshan Gong, Zhanxing Zhu, Andrew J. Bulpitt, He Wang
- Abstract summary: Differentiable physics modeling combines physics models with gradient-based learning to provide model explicability and data efficiency.
We propose a new differentiable fabrics model for composite materials such as cloths.
- Score: 33.01541119342456
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Differentiable physics modeling combines physics models with gradient-based
learning to provide model explicability and data efficiency. It has been used
to learn dynamics, solve inverse problems and facilitate design, and its impact
is only beginning to emerge. Current successes have concentrated on general physics
models such as rigid bodies, deformable sheets, etc., assuming relatively
simple structures and forces. Their granularity is intrinsically coarse and
therefore incapable of modelling complex physical phenomena. Fine-grained
models are still to be developed to incorporate sophisticated material
structures and force interactions with gradient-based learning. Following this
motivation, we propose a new differentiable fabrics model for composite
materials such as cloths, where we dive into the granularity of yarns and model
individual yarn physics and yarn-to-yarn interactions. To this end, we propose
several differentiable forces, whose counterparts in empirical physics are
non-differentiable, to facilitate gradient-based learning. These forces, albeit
applied to cloths, are ubiquitous in various physical systems. Through
comprehensive evaluation and comparison, we demonstrate our model's
explicability in learning meaningful physical parameters, versatility in
incorporating complex physical structures and heterogeneous materials,
data efficiency in learning, and high fidelity in capturing subtle dynamics.
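The abstract's central mechanism is a simulator whose internal forces stay differentiable, so yarn-level physical parameters can be recovered by backpropagating through simulated trajectories. The sketch below is only a rough illustration of that idea, not the paper's model or code: a single yarn is approximated as a pinned mass-spring chain in JAX, and an assumed scalar stiffness is learned by differentiating a trajectory-matching loss through the rollout. Every name, constant, and the learning-rate choice is an illustrative assumption.

```python
# Minimal sketch (not the paper's yarn model): a toy "yarn" as a pinned
# mass-spring chain, with its stiffness recovered by gradient descent
# through a differentiable rollout.
import jax
import jax.numpy as jnp

N, DT, STEPS = 8, 1e-2, 100           # particles per yarn, time step, rollout length
REST = 0.1                             # rest length between neighbouring particles
GRAVITY = jnp.array([0.0, -9.8])

def step(stiffness, pos, vel):
    """One semi-implicit Euler step of the spring-chain dynamics."""
    seg = pos[1:] - pos[:-1]                          # segment vectors
    length = jnp.linalg.norm(seg, axis=1, keepdims=True)
    direction = seg / (length + 1e-9)
    f_spring = stiffness * (length - REST) * direction  # Hooke's law per segment
    force = jnp.zeros_like(pos).at[:-1].add(f_spring).at[1:].add(-f_spring)
    force = force + GRAVITY
    vel = 0.995 * vel + DT * force                    # mild damping keeps the toy stable
    vel = vel.at[0].set(0.0)                          # pin the first particle
    pos = pos + DT * vel
    return pos, vel

def rollout(stiffness, pos, vel):
    def body(carry, _):
        p, v = step(stiffness, *carry)
        return (p, v), p
    _, traj = jax.lax.scan(body, (pos, vel), None, length=STEPS)
    return traj                                        # (STEPS, N, 2) positions

pos0 = jnp.stack([jnp.linspace(0.0, REST * (N - 1), N), jnp.zeros(N)], axis=1)
vel0 = jnp.zeros_like(pos0)

true_stiffness = 50.0
target = rollout(true_stiffness, pos0, vel0)           # "observed" trajectory

def loss(stiffness):
    return jnp.mean((rollout(stiffness, pos0, vel0) - target) ** 2)

grad_fn = jax.jit(jax.grad(loss))                      # gradient through the simulator
stiffness = 10.0                                       # initial guess
for _ in range(300):
    stiffness = stiffness - 50.0 * grad_fn(stiffness)  # plain gradient descent
print(stiffness)  # the estimate should move toward true_stiffness; a real
                  # setup would use an optimiser such as Adam
```

The same pattern, gradients of a simulation loss with respect to material parameters, is what makes the fine-grained yarn forces in the paper useful for inverse problems; only the simulator here is deliberately simplistic.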
Related papers
- PhysRig: Differentiable Physics-Based Skinning and Rigging Framework for Realistic Articulated Object Modeling [36.27177429446227]
Skinning and rigging are fundamental components in animation, articulated object reconstruction, motion transfer, and 4D generation.
Existing approaches predominantly rely on Linear Blend Skinning (LBS), due to its simplicity and differentiability.
We propose PhysRig: a differentiable physics-based skinning and rigging framework.
arXiv Detail & Related papers (2025-06-26T01:58:09Z) - PhysGaia: A Physics-Aware Dataset of Multi-Body Interactions for Dynamic Novel View Synthesis [62.283499219361595]
PhysGaia is a physics-aware dataset specifically designed for Dynamic Novel View Synthesis (DyNVS).
Our dataset provides complex dynamic scenarios with rich interactions among multiple objects.
PhysGaia will significantly advance research in dynamic view synthesis, physics-based scene understanding, and deep learning models integrated with physical simulation.
arXiv Detail & Related papers (2025-06-03T12:19:18Z) - ContPhy: Continuum Physical Concept Learning and Reasoning from Videos [86.63174804149216]
ContPhy is a novel benchmark for assessing machine physical commonsense.
We evaluated a range of AI models and found that they still struggle to achieve satisfactory performance on ContPhy.
We also introduce an oracle model (ContPRO) that marries the particle-based physical dynamic models with the recent large language models.
arXiv Detail & Related papers (2024-02-09T01:09:21Z) - Physics-Encoded Graph Neural Networks for Deformation Prediction under Contact [87.69278096528156]
In robotics, it's crucial to understand object deformation during tactile interactions.
We introduce a method using Physics-Encoded Graph Neural Networks (GNNs) for such predictions.
We've made our code and dataset public to advance research in robotic simulation and grasping.
arXiv Detail & Related papers (2024-02-05T19:21:52Z) - MeLM, a generative pretrained language modeling framework that solves forward and inverse mechanics problems [0.0]
We report a flexible multi-modal mechanics language model, MeLM, applied to solve various nonlinear forward and inverse problems.
The framework is applied to various examples including bio-inspired hierarchical honeycomb design and carbon nanotube mechanics.
arXiv Detail & Related papers (2023-06-30T10:28:20Z) - Latent Traversals in Generative Models as Potential Flows [113.4232528843775]
We propose to model latent structures with a learned dynamic potential landscape.
Inspired by physics, optimal transport, and neuroscience, these potential landscapes are learned as physically realistic partial differential equations.
Our method achieves more qualitatively and quantitatively disentangled trajectories than state-of-the-art baselines.
arXiv Detail & Related papers (2023-04-25T15:53:45Z) - PhysGraph: Physics-Based Integration Using Graph Neural Networks [9.016253794897874]
We focus on the detail enhancement of coarse clothing geometry, which has many applications including computer games, virtual reality and virtual try-on.
Our contribution is based on a simple observation: evaluating forces is computationally relatively cheap for traditional simulation methods.
We demonstrate that this idea leads to a learnable module that can be trained on basic internal forces of small mesh patches.
arXiv Detail & Related papers (2023-01-27T16:47:10Z) - Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z) - Dynamic Visual Reasoning by Learning Differentiable Physics Models from Video and Language [92.7638697243969]
We propose a unified framework that can jointly learn visual concepts and infer physics models of objects from videos and language.
This is achieved by seamlessly integrating three components: a visual perception module, a concept learner, and a differentiable physics engine.
arXiv Detail & Related papers (2021-10-28T17:59:13Z) - Physics-Guided Deep Learning for Dynamical Systems: A survey [5.733401663293044]
Traditional physics-based models are interpretable but rely on rigid assumptions.
Deep learning provides novel alternatives for efficiently recognizing complex patterns and emulating nonlinear dynamics.
Physics-guided deep learning aims to take the best from both physics-based modeling and state-of-the-art DL models to better solve scientific problems.
arXiv Detail & Related papers (2021-07-02T20:59:03Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Augmenting Physical Models with Deep Networks for Complex Dynamics Forecasting [34.61959169976758]
APHYNITY is a principled approach for augmenting incomplete physical dynamics described by differential equations with deep data-driven models.
It decomposes the dynamics into two components: a physical component accounting for the dynamics for which we have some prior knowledge, and a data-driven component accounting for errors of the physical model (a rough sketch of this decomposition follows the list).
arXiv Detail & Related papers (2020-10-09T09:31:03Z)
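The APHYNITY entry above rests on a simple additive decomposition, dx/dt = f_physics(x) + f_residual(x), where the residual absorbs whatever the physical prior cannot explain. The sketch below illustrates that structure only; it is not the paper's code, and the pendulum prior, the tiny MLP, and all parameter names are illustrative assumptions.

```python
# Rough sketch of an additive physics + data-driven decomposition:
# the time derivative is a known physical term plus a learned residual.
import jax
import jax.numpy as jnp

def f_physics(state, omega0=2.0):
    """Known prior: frictionless pendulum, state = (angle, angular velocity)."""
    theta, dtheta = state
    return jnp.array([dtheta, -(omega0 ** 2) * jnp.sin(theta)])

def residual(params, state):
    """Data-driven correction, e.g. unmodelled friction (one hidden layer)."""
    h = jnp.tanh(params["w1"] @ state + params["b1"])
    return params["w2"] @ h + params["b2"]

def f_total(params, state):
    return f_physics(state) + residual(params, state)

def rollout(params, state, dt=1e-2, steps=500):
    def body(s, _):
        s = s + dt * f_total(params, s)   # explicit Euler integration
        return s, s
    _, traj = jax.lax.scan(body, state, None, length=steps)
    return traj

key1, key2 = jax.random.split(jax.random.PRNGKey(0))
params = {
    "w1": 0.01 * jax.random.normal(key1, (16, 2)), "b1": jnp.zeros(16),
    "w2": 0.01 * jax.random.normal(key2, (2, 16)), "b2": jnp.zeros(2),
}
traj = rollout(params, jnp.array([1.0, 0.0]))
# Training (omitted) would fit `params` to observed trajectories while a
# regulariser keeps the residual small, so the physical term explains as
# much of the dynamics as possible.
```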
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.