Analytically Integratable Zero-restlength Springs for Capturing Dynamic
Modes unrepresented by Quasistatic Neural Networks
- URL: http://arxiv.org/abs/2201.10122v1
- Date: Tue, 25 Jan 2022 06:44:15 GMT
- Title: Analytically Integratable Zero-restlength Springs for Capturing Dynamic
Modes unrepresented by Quasistatic Neural Networks
- Authors: Yongxu Jin, Yushan Han, Zhenglin Geng, Joseph Teran, Ronald Fedkiw
- Abstract summary: We present a novel paradigm for modeling certain types of dynamic simulation in real-time with the aid of neural networks.
We augment our quasistatic neural network (QNN) inference with a (real-time) dynamic simulation layer.
We demonstrate that the spring parameters can be robustly learned from a surprisingly small amount of dynamic simulation data.
- Score: 6.601755525003559
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel paradigm for modeling certain types of dynamic simulation
in real-time with the aid of neural networks. In order to significantly reduce
the requirements on data (especially time-dependent data), as well as decrease
generalization error, our approach utilizes a data-driven neural network only
to capture quasistatic information (instead of dynamic or time-dependent
information). Subsequently, we augment our quasistatic neural network (QNN)
inference with a (real-time) dynamic simulation layer. Our key insight is that
the dynamic modes lost when using a QNN approximation can be captured with a
quite simple (and decoupled) zero-restlength spring model, which can be
integrated analytically (as opposed to numerically) and thus has no time-step
stability restrictions. Additionally, we demonstrate that the spring
constitutive parameters can be robustly learned from a surprisingly small
amount of dynamic simulation data. Although we illustrate the efficacy of our
approach by considering soft-tissue dynamics on animated human bodies, the
paradigm is extensible to many different simulation frameworks.
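The key mechanism in the abstract is that each decoupled zero-restlength spring can be advanced in closed form, since its motion reduces to a damped harmonic oscillator per degree of freedom. The sketch below is a minimal illustration of that idea, assuming an underdamped per-vertex spring whose anchor is the quasistatic (QNN) prediction; the function name, parameter names, and the underdamped assumption are ours, not taken from the paper.

```python
import numpy as np

def advance_spring(x, v, dt, k, c, m=1.0):
    """Analytically advance a damped zero-restlength spring by dt.

    x, v : displacement and velocity of the dynamic state relative to the
           quasistatic (QNN-predicted) target, i.e. x = p_dyn - p_qnn.
    k, c : spring stiffness and damping coefficient (underdamped assumed).
    The update uses the exact solution of m x'' + c x' + k x = 0, so it is
    valid for any dt -- no time-step stability restriction.
    """
    w0 = np.sqrt(k / m)                   # undamped natural frequency
    zeta = c / (2.0 * np.sqrt(k * m))     # damping ratio (< 1 assumed)
    wd = w0 * np.sqrt(1.0 - zeta**2)      # damped frequency
    a = zeta * w0                         # exponential decay rate

    A = x
    B = (v + a * x) / wd
    e = np.exp(-a * dt)
    c_, s_ = np.cos(wd * dt), np.sin(wd * dt)

    x_new = e * (A * c_ + B * s_)
    v_new = e * (-a * (A * c_ + B * s_) + wd * (-A * s_ + B * c_))
    return x_new, v_new

# Hypothetical per-frame usage: add the dynamic residual back onto the
# quasistatic QNN output before rendering.
# x, v = advance_spring(x, v, dt=1.0 / 30.0, k=40.0, c=2.0)
# p_render = p_qnn + x
```

Because the update is exact rather than numerically integrated, such a spring layer can run at render rate on top of the QNN inference regardless of the frame time step, which is the property the abstract emphasizes.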
Related papers
- A Neural-Network-Based Approach for Loose-Fitting Clothing [2.910739621411222]
We show how to approximate dynamic modes in loose-fitting clothing using a real-time numerical algorithm.
We also use skinning to reconstruct a rough approximation to a desirable mesh.
In contrast to recurrent neural networks that require a plethora of training data, QNNs perform well with significantly less training data.
arXiv Detail & Related papers (2024-04-25T05:52:20Z) - On the Trade-off Between Efficiency and Precision of Neural Abstraction [62.046646433536104]
Neural abstractions have been recently introduced as formal approximations of complex, nonlinear dynamical models.
We employ formal inductive synthesis procedures to generate neural abstractions that result in dynamical models with these semantics.
arXiv Detail & Related papers (2023-07-28T13:22:32Z) - Do We Need an Encoder-Decoder to Model Dynamical Systems on Networks? [18.92828441607381]
We show that embeddings induce a model that fits observations well but simultaneously has incorrect dynamical behaviours.
We propose a simple embedding-free alternative based on parametrising two additive vector-field components.
arXiv Detail & Related papers (2023-05-20T12:41:47Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - An advanced spatio-temporal convolutional recurrent neural network for
storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - Learning Stochastic Dynamics with Statistics-Informed Neural Network [0.4297070083645049]
We introduce a machine-learning framework named statistics-informed neural network (SINN) for learning dynamics from data.
We devise mechanisms for training the neural network model to reproduce the correct statistical behavior of a target process.
We show that the obtained reduced-order model can be trained on temporally coarse-grained data and hence is well suited for rare-event simulations.
arXiv Detail & Related papers (2022-02-24T18:21:01Z) - A Note on Learning Rare Events in Molecular Dynamics using LSTM and
Transformer [4.80427355202687]
Recent successful examples of learning slow dynamics with LSTMs use simulation data of a low-dimensional reaction coordinate.
We show that three key factors significantly affect the performance of language-model learning: the dimensionality of the reaction coordinates, the temporal resolution, and the state partition.
arXiv Detail & Related papers (2021-07-14T09:26:36Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Stochastic Recurrent Neural Network for Multistep Time Series
Forecasting [0.0]
We leverage advances in deep generative models and the concept of state space models to propose an adaptation of the recurrent neural network for time series forecasting.
Our model preserves the architectural workings of a recurrent neural network for which all relevant information is encapsulated in its hidden states, and this flexibility allows our model to be easily integrated into any deep architecture for sequential modelling.
arXiv Detail & Related papers (2021-04-26T01:43:43Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)