Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics
- URL: http://arxiv.org/abs/2304.14369v2
- Date: Fri, 16 Jun 2023 00:16:00 GMT
- Authors: Pingchuan Ma, Peter Yichen Chen, Bolei Deng, Joshua B. Tenenbaum, Tao
Du, Chuang Gan, Wojciech Matusik
- Abstract summary: Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a hybrid neural network (NN) and PDE approach for learning
generalizable PDE dynamics from motion observations. Many NN approaches learn
an end-to-end model that implicitly models both the governing PDE and
constitutive models (or material models). Without explicit PDE knowledge, these
approaches cannot guarantee physical correctness and have limited
generalizability. We argue that the governing PDEs are often well-known and
should be explicitly enforced rather than learned. Instead, constitutive models
are particularly suitable for learning due to their data-fitting nature. To
this end, we introduce a new framework termed "Neural Constitutive Laws"
(NCLaw), which utilizes a network architecture that strictly guarantees
standard constitutive priors, including rotation equivariance and undeformed
state equilibrium. We embed this network inside a differentiable simulation and
train the model by minimizing a loss function based on the difference between
the simulation and the motion observation. We validate NCLaw on various
large-deformation dynamical systems, ranging from solids to fluids. After
training on a single motion trajectory, our method generalizes to new
geometries, initial/boundary conditions, temporal ranges, and even
multi-physics systems. On these extremely out-of-distribution generalization
tasks, NCLaw is orders-of-magnitude more accurate than previous NN approaches.
Real-world experiments demonstrate our method's ability to learn constitutive
laws from videos.
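The paper's exact architecture is not reproduced here, but the two constitutive priors it names can be guaranteed by construction rather than learned. A minimal NumPy sketch under that assumption (the toy two-layer map and all names are illustrative): write the first Piola-Kirchhoff stress as P(F) = F·S(FᵀF), where S is a symmetric-matrix-valued network shifted to vanish at the identity. Because FᵀF is unchanged by rotations of F, rotation equivariance and undeformed-state equilibrium then hold for any network parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny random "network": maps the 6 unique entries of a symmetric 3x3
# matrix to 6 outputs that are reassembled into a symmetric matrix.
W1, b1 = rng.normal(size=(16, 6)), rng.normal(size=16)
W2, b2 = rng.normal(size=(6, 16)), rng.normal(size=6)
IDX = [(0, 0), (1, 1), (2, 2), (0, 1), (0, 2), (1, 2)]

def net(C):
    h = np.tanh(W1 @ np.array([C[i, j] for i, j in IDX]) + b1)
    v = W2 @ h + b2
    S = np.zeros((3, 3))
    for k, (i, j) in enumerate(IDX):
        S[i, j] = S[j, i] = v[k]
    return S

def piola_stress(F):
    C = F.T @ F                   # right Cauchy-Green tensor: invariant to rotations of F
    S = net(C) - net(np.eye(3))   # vanishes identically at the undeformed state C = I
    return F @ S                  # stress of the form P(F) = F S(C)

F = np.eye(3) + 0.1 * rng.normal(size=(3, 3))   # a random small deformation
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))    # a random orthogonal matrix
if np.linalg.det(R) < 0:
    R[:, 0] *= -1                               # make it a proper rotation

assert np.allclose(piola_stress(R @ F), R @ piola_stress(F))  # rotation equivariance
assert np.allclose(piola_stress(np.eye(3)), 0.0)              # undeformed equilibrium
```

In the full method, such a network would sit inside a differentiable simulator, with gradients of the trajectory-matching loss flowing through the solver into the constitutive parameters.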
Related papers
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN)
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z) - Improving PINNs By Algebraic Inclusion of Boundary and Initial Conditions [0.1874930567916036]
"AI for Science" aims to solve fundamental scientific problems using AI techniques.
In this work we explore the possibility of changing the trained model from a plain neural network to a non-linear transformation of it.
This reduces the number of terms in the loss function compared to standard PINN losses.
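The paper's specific transformation is not given in the abstract; a common instance of this idea, shown here as an assumed illustration, is to wrap the network so Dirichlet boundary conditions hold exactly, removing the boundary term from the PINN loss:

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=8), rng.normal(size=8)   # toy one-hidden-layer network
W2, b2 = rng.normal(size=8), rng.normal()

def nn(x):
    return W2 @ np.tanh(W1 * x + b1) + b2

A, B = 2.0, -1.0   # illustrative Dirichlet data: u(0) = A, u(1) = B

def u_tilde(x):
    # g(x) = A(1-x) + Bx interpolates the boundary data; the factor x(1-x)
    # vanishes at both ends, so the boundary conditions hold for any network
    # parameters and the boundary loss term can be dropped entirely.
    return A * (1 - x) + B * x + x * (1 - x) * nn(x)

assert abs(u_tilde(0.0) - A) < 1e-12
assert abs(u_tilde(1.0) - B) < 1e-12
```

Only the PDE residual then needs to be minimized over interior collocation points.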
arXiv Detail & Related papers (2024-07-30T11:19:48Z) - Joint torques prediction of a robotic arm using neural networks [4.019105975232108]
Traditional approaches to deriving dynamic models are based on the application of Lagrangian or Newtonian mechanics.
A popular alternative is the application of Machine Learning (ML) techniques in the context of a "black-box" methodology.
This paper reports on our experience with this approach for a real-life 6 degrees of freedom (DoF) manipulator.
arXiv Detail & Related papers (2024-03-28T09:38:26Z) - ConCerNet: A Contrastive Learning Based Framework for Automated
Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
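The claim that message passing subsumes classical schemes can be checked on a toy case. A sketch (an assumed minimal setup, not the paper's solver): one round of message passing on a 1D chain graph, with messages u_j − u_i summed over neighbors, reproduces the second-order finite-difference Laplacian exactly.

```python
import numpy as np

n, dx = 64, 1.0 / 64
u = np.sin(2 * np.pi * np.arange(n) * dx)

# Edges of a 1D chain graph (both directions), interior nodes only.
edges = [(i, j) for i in range(1, n - 1) for j in (i - 1, i + 1)]

# One message-passing round: message = u[j] - u[i], aggregation = sum.
agg = np.zeros(n)
for i, j in edges:
    agg[i] += u[j] - u[i]
laplacian_mp = agg[1:-1] / dx**2

# Classical second-order central stencil on the same interior nodes.
laplacian_fd = (u[:-2] - 2 * u[1:-1] + u[2:]) / dx**2

assert np.allclose(laplacian_mp, laplacian_fd)
```

A learned solver replaces the fixed message and aggregation functions with trained networks, which is where the representational gain over the hand-designed stencil comes from.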
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - NeuralPDE: Modelling Dynamical Systems from Data [0.44259821861543996]
We propose NeuralPDE, a model which combines convolutional neural networks (CNNs) with differentiable ODE solvers to model dynamical systems.
We show that the Method of Lines used in standard PDE solvers can be represented using convolutions which makes CNNs the natural choice to parametrize arbitrary PDE dynamics.
Our model can be applied to any data without requiring any prior knowledge about the governing PDE.
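The Method-of-Lines-as-convolution observation can be made concrete with a hand-written example (an assumed illustration, not NeuralPDE's trained model): a fixed 3-tap kernel computes the spatial Laplacian, leaving an ODE in time that any ODE solver can integrate; a CNN simply replaces the fixed kernel with learned ones.

```python
import numpy as np

n, dx, nu, dt = 128, 1.0 / 128, 0.01, 1e-4
x = np.arange(n) * dx
u = np.exp(-100 * (x - 0.5) ** 2)          # initial heat bump

# Spatial derivative as a convolution: the second-difference stencil
# is a fixed 3-tap kernel.
kernel = np.array([1.0, -2.0, 1.0]) / dx**2
lap = np.convolve(u, kernel, mode="same")  # boundary taps are inexact; interior matches the stencil
assert np.allclose(lap[1:-1], (u[:-2] - 2 * u[1:-1] + u[2:]) / dx**2)

# Method of Lines: space is discretized, leaving the ODE du/dt = nu * lap(u),
# here advanced by one explicit Euler step.
u_next = u + dt * nu * lap
assert u_next.shape == u.shape
```

In NeuralPDE the convolution kernels are learned and the Euler step is replaced by a differentiable ODE solver, but the structure is the same.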
arXiv Detail & Related papers (2021-11-15T10:59:52Z) - Characterizing possible failure modes in physics-informed neural
networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Go with the Flow: Adaptive Control for Neural ODEs [10.265713480189484]
We describe a new module called neurally controlled ODE (N-CODE) designed to improve the expressivity of NODEs.
N-CODE modules are dynamic variables governed by a trainable map from initial or current activation state.
A single module is sufficient for learning a distribution on non-autonomous flows that adaptively drive neural representations.
arXiv Detail & Related papers (2020-06-16T22:21:15Z)