Newton-Cotes Graph Neural Networks: On the Time Evolution of Dynamic
Systems
- URL: http://arxiv.org/abs/2305.14642v3
- Date: Fri, 20 Oct 2023 01:32:45 GMT
- Authors: Lingbing Guo, Weiqing Wang, Zhuo Chen, Ningyu Zhang, Zequn Sun, Yixuan
Lai, Qiang Zhang, and Huajun Chen
- Abstract summary: We propose a new approach to predict the integration based on several velocity estimations with Newton-Cotes formulas.
Experiments on several benchmarks empirically demonstrate consistent and significant improvement compared with the state-of-the-art methods.
- Score: 49.50674348130157
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reasoning about system dynamics is one of the most important
analytical approaches for many scientific studies. With the initial state of a
system as input, recent graph neural network (GNN)-based methods can predict a
future state distant in time with high accuracy. Although these methods have
diverse designs in modeling the coordinates and interacting forces of the
system, we show that they actually share a common paradigm that learns the
integration of the velocity over the interval between the initial and terminal
coordinates. However, their integrand is constant w.r.t. time. Inspired by this
observation, we propose a new approach to predict the integration based on
several velocity estimations with Newton-Cotes formulas and prove its
effectiveness theoretically. Extensive experiments on several benchmarks
empirically demonstrate consistent and significant improvement compared with
the state-of-the-art methods.
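The core idea, predicting the displacement integral from several intermediate velocity estimates weighted by Newton-Cotes coefficients, can be sketched numerically (a minimal illustration; in the paper the velocity estimates come from a GNN, here they are plain samples):

```python
import numpy as np

def newton_cotes_displacement(velocities, h):
    """Approximate the integral of velocity over an interval from
    equally spaced velocity estimates using closed Newton-Cotes
    weights: trapezoidal (2 samples), Simpson (3), Simpson 3/8 (4)."""
    weights = {
        2: np.array([1.0, 1.0]) / 2.0,
        3: np.array([1.0, 4.0, 1.0]) / 3.0,
        4: np.array([1.0, 3.0, 3.0, 1.0]) * 3.0 / 8.0,
    }
    v = np.asarray(velocities, dtype=float)
    return h * float(weights[len(v)] @ v)

# Velocity v(t) = t^2 sampled at t = 0, 1, 2; the exact displacement
# over [0, 2] is 8/3, which Simpson's rule reproduces exactly.
approx = newton_cotes_displacement([0.0, 1.0, 4.0], h=1.0)
```

By contrast, the shared paradigm the abstract identifies, a time-constant integrand, corresponds to a single rectangle rule v(t0) * (t1 - t0).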
Related papers
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN).
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - GDBN: a Graph Neural Network Approach to Dynamic Bayesian Network [7.876789380671075]
We propose a graph neural network approach with score-based method aiming at learning a sparse DAG.
We demonstrate that graph neural network-based methods significantly outperform other state-of-the-art methods in dynamic Bayesian network inference.
arXiv Detail & Related papers (2023-01-28T02:49:13Z) - On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver [59.13397937903832]
We introduce a deep learning-based corrector called Neural Vector (NeurVec).
NeurVec can compensate for integration errors and enable larger time step sizes in simulations.
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability.
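The corrector idea can be sketched as a coarse explicit-Euler step plus a learned additive correction (a minimal sketch; the `corrector` callable and its interface are assumptions standing in for the trained NeurVec network):

```python
import numpy as np

def corrected_step(x, f, dt, corrector=None):
    """One coarse Euler step x + dt * f(x), augmented by an additive
    correction term intended to absorb the local truncation error so
    that larger step sizes dt remain accurate."""
    x = np.asarray(x, dtype=float)
    step = x + dt * f(x)
    return step if corrector is None else step + corrector(x, dt)

# Linear decay dx/dt = -x: plain Euler vs. a hand-made "ideal" corrector
# that supplies exactly the missing error (the target a learned
# corrector would approximate).
f = lambda x: -x
ideal = lambda x, dt: (np.exp(-dt) - (1.0 - dt)) * x
x0 = np.array([1.0])
euler = corrected_step(x0, f, dt=0.5)
fixed = corrected_step(x0, f, dt=0.5, corrector=ideal)
```

With the correction, the large step dt = 0.5 recovers the exact decay exp(-0.5), which plain Euler misses.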
arXiv Detail & Related papers (2022-08-07T09:02:18Z) - Learning effective dynamics from data-driven stochastic systems [2.4578723416255754]
This work is devoted to investigating the effective dynamics for slow-fast dynamical systems.
We propose a novel algorithm including a neural network called Auto-SDE to learn invariant slow manifolds.
arXiv Detail & Related papers (2022-05-09T09:56:58Z) - Neural Dynamical Systems: Balancing Structure and Flexibility in
Physical Prediction [14.788494279754481]
We introduce Neural Dynamical Systems (NDS), a method of learning dynamical models in various gray-box settings.
NDS uses neural networks to estimate free parameters of the system, predicts residual terms, and numerically integrates over time to predict future states.
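The gray-box recipe described here, known equations with network-estimated free parameters plus a learned residual, integrated forward in time, can be sketched as follows (the callables `param_net` and `residual_net` are hypothetical stand-ins for trained networks):

```python
import numpy as np

def nds_rollout(x0, known_rhs, param_net, residual_net, dt, steps):
    """Explicit-Euler rollout of dx/dt = known_rhs(x, theta) + r(x),
    where theta comes from a parameter-estimation network and r is a
    learned residual term."""
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = xs[-1]
        theta = param_net(x)          # estimated free parameters
        dx = known_rhs(x, theta) + residual_net(x)
        xs.append(x + dt * dx)
    return np.stack(xs)

# Known physics dx/dt = theta * x, with theta estimated as -1 and the
# residual set to zero: two Euler steps of simple exponential decay.
traj = nds_rollout(
    x0=[1.0],
    known_rhs=lambda x, th: th * x,
    param_net=lambda x: -1.0,
    residual_net=lambda x: 0.0,
    dt=0.1,
    steps=2,
)
```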
arXiv Detail & Related papers (2020-06-23T00:50:48Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Variational Integrator Graph Networks for Learning Energy Conserving
Dynamical Systems [1.2522889958051286]
Recent advances show that neural networks embedded with physics-informed priors significantly outperform vanilla neural networks in learning.
We propose a novel method that unifies the strengths of existing approaches by combining energy constraints, high-order symplectic variational integrators, and graph neural networks.
arXiv Detail & Related papers (2020-04-28T17:42:47Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.