Physics-informed nonlinear vector autoregressive models for the prediction of dynamical systems
- URL: http://arxiv.org/abs/2407.18057v1
- Date: Thu, 25 Jul 2024 14:10:42 GMT
- Title: Physics-informed nonlinear vector autoregressive models for the prediction of dynamical systems
- Authors: James H. Adler, Samuel Hocking, Xiaozhe Hu, Shafiqul Islam
- Abstract summary: We focus on one class of models called nonlinear vector autoregression (NVAR) to solve ordinary differential equations (ODEs).
Motivated by connections to numerical integration and physics-informed neural networks, we explicitly derive the physics-informed NVAR (piNVAR).
Because NVAR and piNVAR completely share their learned parameters, we propose an augmented procedure to jointly train the two models.
We evaluate the ability of the piNVAR model to predict solutions to various ODE systems, such as the undamped spring, a Lotka-Volterra predator-prey nonlinear model, and the chaotic Lorenz system.
- Score: 0.36248657646376703
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning techniques have recently been of great interest for solving differential equations. Training these models is classically a data-fitting task, but knowledge of the expression of the differential equation can be used to supplement the training objective, leading to the development of physics-informed scientific machine learning. In this article, we focus on one class of models called nonlinear vector autoregression (NVAR) to solve ordinary differential equations (ODEs). Motivated by connections to numerical integration and physics-informed neural networks, we explicitly derive the physics-informed NVAR (piNVAR) which enforces the right-hand side of the underlying differential equation regardless of NVAR construction. Because NVAR and piNVAR completely share their learned parameters, we propose an augmented procedure to jointly train the two models. Then, using both data-driven and ODE-driven metrics, we evaluate the ability of the piNVAR model to predict solutions to various ODE systems, such as the undamped spring, a Lotka-Volterra predator-prey nonlinear model, and the chaotic Lorenz system.
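As a rough illustration of the NVAR construction described in the abstract, the sketch below builds delay-embedded polynomial features and fits a ridge-regressed readout to one-step increments of a Lotka-Volterra trajectory. This is not the authors' exact piNVAR implementation; the delay depth, quadratic feature library, ridge penalty, and Lotka-Volterra parameters are all illustrative assumptions.

```python
# Minimal NVAR sketch (assumed design choices, not the paper's piNVAR): a ridge-regressed
# readout over delay-embedded states plus quadratic monomials, trained to map the current
# feature vector to the next-step state increment.
import numpy as np
from scipy.integrate import solve_ivp

def lotka_volterra(t, z, a=1.1, b=0.4, c=0.1, d=0.4):
    x, y = z
    return [a * x - b * x * y, c * x * y - d * y]

# Generate training data on a uniform time grid.
dt = 0.01
sol = solve_ivp(lotka_volterra, (0.0, 50.0), [10.0, 5.0],
                t_eval=np.arange(0.0, 50.0, dt), rtol=1e-9, atol=1e-9)
X = sol.y.T                                   # shape (T, 2)

def nvar_features(X, k=2):
    """Stack k delayed copies of the state, then append a bias term and all
    unique quadratic monomials of the delay-embedded vector."""
    T, d = X.shape
    lin = np.hstack([X[k - 1 - j: T - j] for j in range(k)])   # (T-k+1, k*d)
    quad = np.einsum('ti,tj->tij', lin, lin)
    iu = np.triu_indices(k * d)
    quad = quad[:, iu[0], iu[1]]                               # unique quadratic terms
    ones = np.ones((lin.shape[0], 1))
    return np.hstack([ones, lin, quad])

k = 2
Phi = nvar_features(X, k)[:-1]                # features at times k-1 .. T-2
dX = X[k:] - X[k - 1:-1]                      # one-step increments to learn
lam = 1e-6                                    # ridge regularization strength
W = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ dX)

# One-step prediction error on the training trajectory (sanity check).
pred = X[k - 1:-1] + Phi @ W
print("one-step RMSE:", np.sqrt(np.mean((pred - X[k:]) ** 2)))
```

The physics-informed variant described in the abstract would additionally evaluate the learned readout against the right-hand side of the underlying ODE during training; that joint objective is not reproduced here.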
Related papers
- Deep Generative Modeling for Identification of Noisy, Non-Stationary Dynamical Systems [3.1484174280822845]
We focus on finding parsimonious ordinary differential equation (ODE) models for nonlinear, noisy, and non-autonomous dynamical systems.
Our method, dynamic SINDy, combines variational inference with SINDy (sparse identification of nonlinear dynamics) to model time-varying coefficients of sparse ODEs.
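For context on the SINDy component mentioned above, here is a minimal sequentially thresholded least-squares sketch over a fixed polynomial library. The variational-inference machinery that dynamic SINDy adds for time-varying coefficients is not shown, and the library, threshold, and Lotka-Volterra test problem are illustrative choices.

```python
# Plain SINDy sketch: sparse regression of estimated derivatives onto a candidate library.
import numpy as np
from scipy.integrate import solve_ivp

def poly_library(X):
    """Candidate terms 1, x, y, x^2, xy, y^2 for a 2-D state X of shape (T, 2)."""
    x, y = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])

def sindy_stlsq(Theta, dXdt, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares: fit, zero small coefficients, refit."""
    Xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for j in range(dXdt.shape[1]):
            big = ~small[:, j]
            if big.any():
                Xi[big, j] = np.linalg.lstsq(Theta[:, big], dXdt[:, j], rcond=None)[0]
    return Xi

# Recover Lotka-Volterra structure from a simulated trajectory.
dt = 0.01
f = lambda t, z: [1.1 * z[0] - 0.4 * z[0] * z[1], 0.1 * z[0] * z[1] - 0.4 * z[1]]
sol = solve_ivp(f, (0, 30), [10.0, 5.0], t_eval=np.arange(0, 30, dt), rtol=1e-9)
X = sol.y.T
dXdt = (X[2:] - X[:-2]) / (2 * dt)              # centred finite-difference derivatives
Xi = sindy_stlsq(poly_library(X[1:-1]), dXdt)
print(np.round(Xi, 3))                          # nonzero rows: x, xy for dx/dt; y, xy for dy/dt
```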
arXiv Detail & Related papers (2024-10-02T23:00:00Z) - Nonlinear Schrödinger Network [0.8249694498830558]
Deep neural networks (DNNs) have achieved exceptional performance across various fields by learning complex nonlinear mappings from large-scale datasets.
To address these issues, hybrid approaches that integrate physics with AI are gaining interest.
This paper introduces a novel physics-based AI model called the "Nonlinear Schr"odinger Network"
arXiv Detail & Related papers (2024-07-19T17:58:00Z) - Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
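The reduction of a linear ODE to a linear program can be illustrated with a toy example; this is only a schematic of the idea that inspired NeuRLP, not the paper's relaxed solver. Discretizing dx/dt = a*x with the trapezoidal rule yields equality constraints, and an LP solver with a zero objective returns the unique point satisfying them.

```python
# Toy "linear ODE as linear program": dx/dt = a*x on [0, 1] with x(0) = 1,
# trapezoidal-rule constraints handed to scipy's LP solver with a zero objective.
import numpy as np
from scipy.optimize import linprog

a, h, N = -1.0, 0.05, 20                     # decay rate, step size, number of steps
A_eq = np.zeros((N + 1, N + 1))
b_eq = np.zeros(N + 1)
A_eq[0, 0], b_eq[0] = 1.0, 1.0               # initial condition x(0) = 1
for k in range(N):
    # trapezoidal step: x_{k+1} - x_k = (h/2) * (a*x_k + a*x_{k+1})
    A_eq[k + 1, k] = -1.0 - 0.5 * h * a
    A_eq[k + 1, k + 1] = 1.0 - 0.5 * h * a
res = linprog(c=np.zeros(N + 1), A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * (N + 1), method="highs")
print("x(1) ~", res.x[-1], " exact:", np.exp(a * 1.0))
```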
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk and read back for training.
This work proposes an open-source framework for training deep surrogate models online, as the solver generates data, rather than from data stored on disk.
arXiv Detail & Related papers (2023-06-28T12:02:27Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
Neural networks are a recently employed machine learning tool for studying dynamics, usable for data-driven solution finding or for the discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z) - Optimizing differential equations to fit data and predict outcomes [0.0]
Recent technical advances in automatic differentiation through numerical differential equation solvers potentially turn the fitting process into a relatively easy problem.
This article illustrates how to overcome a variety of common challenges, using the classic ecological data for oscillations in hare and lynx populations.
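A stripped-down version of this fitting task can be sketched as follows. Note that SciPy's least_squares uses finite-difference Jacobians here rather than the automatic differentiation through the solver that the paper discusses, and the synthetic hare-lynx-style Lotka-Volterra data, initial guess, and parameter bounds are illustrative assumptions.

```python
# Fit ODE parameters to noisy observations by nonlinear least squares around an ODE solve.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_obs = np.linspace(0, 20, 80)

def simulate(theta):
    a, b, c, d = theta
    f = lambda t, z: [a * z[0] - b * z[0] * z[1], c * z[0] * z[1] - d * z[1]]
    sol = solve_ivp(f, (t_obs[0], t_obs[-1]), [10.0, 5.0], t_eval=t_obs, rtol=1e-8)
    return sol.y.T

true_theta = np.array([1.1, 0.4, 0.1, 0.4])
rng = np.random.default_rng(1)
data = simulate(true_theta) + rng.normal(0, 0.1, (len(t_obs), 2))   # synthetic noisy data

residuals = lambda theta: (simulate(theta) - data).ravel()
fit = least_squares(residuals, x0=[1.0, 0.5, 0.2, 0.5], bounds=(0.0, 3.0))
print("estimated parameters:", np.round(fit.x, 3))
```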
arXiv Detail & Related papers (2022-04-16T16:08:08Z) - Gradient-Based Trajectory Optimization With Learned Dynamics [80.41791191022139]
We use machine learning techniques to learn a differentiable dynamics model of the system from data.
We show that a neural network can model highly nonlinear behaviors accurately for large time horizons.
In our hardware experiments, we demonstrate that our learned model can represent complex dynamics for both a Spot quadruped and a radio-controlled (RC) car.
arXiv Detail & Related papers (2022-04-09T22:07:34Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
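The spatial-encoding half of that design can be illustrated in isolation; the sketch below is only the general idea, not the paper's architecture. A 2-D field (here a Taylor-Green-like velocity component chosen for illustration) is compressed to its lowest-frequency discrete cosine transform coefficients, the kind of reduced representation a recurrent network could then evolve in time.

```python
# Compress a 2-D field by keeping only its low-frequency DCT coefficients.
import numpy as np
from scipy.fft import dctn, idctn

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.cos(X) * np.sin(Y)             # illustrative Taylor-Green-like field

coeffs = dctn(u, norm="ortho")        # full DCT-II coefficients
k = 8
compressed = np.zeros_like(coeffs)
compressed[:k, :k] = coeffs[:k, :k]   # keep only the k x k lowest frequencies
u_rec = idctn(compressed, norm="ortho")

print("relative L2 error:", np.linalg.norm(u_rec - u) / np.linalg.norm(u))
```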
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Artificial neural network as a universal model of nonlinear dynamical systems [0.0]
The map is built as an artificial neural network whose weights encode a modeled system.
We consider the Lorenz system, the Rössler system, and the Hindmarsh-Rose neuron model.
High similarity is observed for visual images of attractors, power spectra, bifurcation diagrams, and Lyapunov exponents.
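As a rough sketch of the "network as a map" idea (not the paper's architecture or training procedure), the snippet below trains a small tanh network by plain full-batch gradient descent to act as a one-step map for the Lorenz system; the hidden width, learning rate, and step count are illustrative assumptions.

```python
# One-hidden-layer tanh network trained to predict x(t + dt) from x(t) for the Lorenz system.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

dt = 0.01
ts = np.arange(0.0, 60.0, dt)
sol = solve_ivp(lorenz, (ts[0], ts[-1]), [1.0, 1.0, 1.0], t_eval=ts, rtol=1e-9, atol=1e-9)
X, Y = sol.y.T[:-1], sol.y.T[1:]           # input states and next states
mu, sd = X.mean(0), X.std(0)
Xn, Yn = (X - mu) / sd, (Y - mu) / sd      # normalize for stable training

rng = np.random.default_rng(0)
H, lr = 64, 0.05
W1 = rng.normal(0, 0.5, (3, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 3)); b2 = np.zeros(3)
for step in range(2000):
    A = np.tanh(Xn @ W1 + b1)              # hidden activations
    P = A @ W2 + b2                        # predicted next (normalized) state
    err = P - Yn
    # Backpropagation for the squared-error loss (constant factors absorbed into lr).
    gW2 = A.T @ err / len(Xn); gb2 = err.mean(0)
    dA = (err @ W2.T) * (1 - A ** 2)
    gW1 = Xn.T @ dA / len(Xn); gb1 = dA.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", np.mean((np.tanh(Xn @ W1 + b1) @ W2 + b2 - Yn) ** 2))
```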
arXiv Detail & Related papers (2021-03-06T16:02:41Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)