Constraining Chaos: Enforcing dynamical invariants in the training of
recurrent neural networks
- URL: http://arxiv.org/abs/2304.12865v1
- Date: Mon, 24 Apr 2023 00:33:47 GMT
- Title: Constraining Chaos: Enforcing dynamical invariants in the training of
recurrent neural networks
- Authors: Jason A. Platt and Stephen G. Penny and Timothy A. Smith and Tse-Chun
Chen and Henry D. I. Abarbanel
- Abstract summary: We introduce a novel training method for machine learning-based forecasting of chaotic dynamical systems.
The training enforces dynamical invariants--such as the Lyapunov exponent spectrum and fractal dimension--in the systems of interest, enabling longer and more stable forecasts when operating with limited data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Drawing on ergodic theory, we introduce a novel training method for
machine learning-based forecasting of chaotic dynamical systems. The training
enforces dynamical invariants--such as the Lyapunov exponent spectrum and
fractal dimension--in the systems of interest, enabling longer and more stable
forecasts when operating with limited data. The technique is demonstrated in
detail using the recurrent neural network architecture of reservoir computing.
Results are given for the Lorenz 1996 chaotic dynamical system and a spectral
quasi-geostrophic model, both typical test cases for numerical weather
prediction.
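The Lyapunov exponent spectrum named in the abstract is exactly the kind of invariant the training constrains. As a companion, here is a minimal sketch of how that spectrum is commonly estimated for the Lorenz 1996 system via the tangent-space QR method; the forcing F = 8, the dimension, the step size, and the run lengths are conventional defaults, not values taken from the paper.

```python
import numpy as np

def lorenz96(x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, indices cyclic
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def jac(x):
    # Jacobian of the Lorenz 1996 vector field at state x
    n = x.size
    J = -np.eye(n)
    for i in range(n):
        J[i, (i + 1) % n] += x[(i - 1) % n]
        J[i, (i - 2) % n] -= x[(i - 1) % n]
        J[i, (i - 1) % n] += x[(i + 1) % n] - x[(i - 2) % n]
    return J

def rk4(x, dt):
    # one fourth-order Runge-Kutta step of the full nonlinear model
    k1 = lorenz96(x)
    k2 = lorenz96(x + 0.5 * dt * k1)
    k3 = lorenz96(x + 0.5 * dt * k2)
    k4 = lorenz96(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

n, dt, transient, steps = 5, 0.01, 5000, 40000
rng = np.random.default_rng(0)
x = 8.0 + 0.01 * rng.standard_normal(n)
for _ in range(transient):              # settle onto the attractor first
    x = rk4(x, dt)

Q = np.eye(n)                           # orthonormal tangent vectors
log_r = np.zeros(n)
for _ in range(steps):
    x = rk4(x, dt)
    Q = Q + dt * jac(x) @ Q             # first-order tangent propagation
    Q, R = np.linalg.qr(Q)              # re-orthonormalize, log the stretching
    log_r += np.log(np.abs(np.diag(R)))

print("Lyapunov spectrum estimate:", log_r / (steps * dt))
```

A forecast model trained under the paper's constraint would be asked to reproduce such a spectrum, rather than only to minimize one-step forecast error.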
Related papers
- Gradient-free training of recurrent neural networks [3.272216546040443]
We introduce a computational approach to construct all weights and biases of a recurrent neural network without using gradient-based methods.
The approach is based on a combination of random feature networks and Koopman operator theory for dynamical systems.
In computational experiments on time series forecasting for chaotic dynamical systems and on control problems, we observe improved training times and forecasting accuracy for the recurrent neural networks we construct.
arXiv Detail & Related papers (2024-10-30T21:24:34Z)
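The entry above trains without gradients; a hedged sketch of that flavor of approach follows, using fixed random recurrent features with a ridge-regression readout, so the only "training" is one linear solve. The toy series, network sizes, and regularization are illustrative assumptions; the paper's Koopman-operator-based construction differs in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar time series standing in for real data: a noisy sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.01 * rng.standard_normal(t.size)

# Fixed random recurrent weights; these are never trained.
n_res, leak = 300, 0.5
w_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1

# Drive the random network with the series and collect hidden states.
h = np.zeros(n_res)
H = np.empty((series.size - 1, n_res))
for k in range(series.size - 1):
    h = (1 - leak) * h + leak * np.tanh(W @ h + w_in * series[k])
    H[k] = h

# Gradient-free "training": one ridge-regression solve for the readout
# that maps the hidden state to the next value of the series.
y = series[1:]
beta = 1e-6
w_out = np.linalg.solve(H.T @ H + beta * np.eye(n_res), H.T @ y)

pred = H @ w_out
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```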
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
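A minimal sketch of the recover-parameters-by-gradients idea in the entry above. The closed-form "surrogate" J*sin(q)^2 is a hypothetical stand-in for the trained network, and the gradient is written out by hand where, in the paper's framework, automatic differentiation through the network would supply it.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate(q, J):
    # hypothetical stand-in for a network trained to mimic simulated data
    return J * np.sin(q) ** 2

# Synthetic "experimental" measurements with a known true parameter.
q = np.linspace(0.1, 3.0, 200)
J_true = 2.5
data = surrogate(q, J_true) + 0.05 * rng.standard_normal(q.size)

J, lr = 0.5, 0.1                                  # initial guess, step size
for _ in range(200):
    resid = surrogate(q, J) - data                # model-data mismatch
    grad = 2.0 * np.mean(resid * np.sin(q) ** 2)  # dL/dJ, analytic here
    J -= lr * grad                                # gradient step on parameter

print(f"recovered J = {J:.3f} (true value {J_true})")
```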
- A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
arXiv Detail & Related papers (2023-01-24T08:39:20Z)
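A minimal sketch of the POD reduction step in the entry above, run on a synthetic snapshot matrix; the grid, the wave, and the mode count are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Synthetic snapshot matrix (space x time), standing in for flow snapshots.
nx, nt = 200, 400
xg = np.linspace(0, 2 * np.pi, nx)
tg = np.linspace(0, 10, nt)
snapshots = (np.sin(xg[:, None] - 1.3 * tg[None, :])
             + 0.1 * np.cos(3 * (xg[:, None] + tg[None, :])))

# POD via SVD: columns of U are spatial modes; rows of diag(s) @ Vt are
# the corresponding temporal coefficients.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 4                                    # keep a few energetic modes
coeffs = np.diag(s[:r]) @ Vt[:r]         # r x nt temporal coefficients
recon = U[:, :r] @ coeffs                # low-order reconstruction

err = np.linalg.norm(recon - snapshots) / np.linalg.norm(snapshots)
print(f"relative reconstruction error with {r} modes: {err:.2e}")
```

In the hybrid ROM, deep learning models are then trained to advance the temporal coefficients in time instead of integrating the full-order model.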
- Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
Neural networks are a recently employed machine learning tool for studying dynamics, used either for data-driven solution finding or for the discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Locally-symplectic neural networks for learning volume-preserving dynamics [0.0]
We propose locally-symplectic neural networks (LocSympNets) for learning volume-preserving dynamics.
The construction of LocSympNets stems from the theorem of local Hamiltonian description of the vector field of a volume-preserving dynamical system.
arXiv Detail & Related papers (2021-09-19T15:58:09Z)
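A hedged illustration of the kind of volume-preserving module such networks stack: a shear layer that updates one part of the state from the rest has a block-triangular Jacobian with identity diagonal blocks, hence unit determinant. This is a simplified SympNet-style layer with random untrained weights, not the paper's exact construction from the local Hamiltonian description.

```python
import numpy as np

rng = np.random.default_rng(2)

# One shear-type layer: split the state z into (p, q) and update only p
# with a function of q. The Jacobian is block triangular with identity
# blocks on the diagonal, so det = 1 and phase-space volume is preserved.
class ShearLayer:
    def __init__(self, d_p, d_q, width=16):
        self.A = rng.standard_normal((width, d_q)) * 0.5
        self.b = rng.standard_normal(width) * 0.1
        self.C = rng.standard_normal((d_p, width)) * 0.5

    def __call__(self, p, q):
        return p + self.C @ np.tanh(self.A @ q + self.b), q

# Stack layers, alternating which half is updated; each layer is
# volume-preserving, hence so is the composition.
layers = [ShearLayer(2, 1), ShearLayer(1, 2)]

def net(z):
    p, q = z[:2], z[2:]
    p, q = layers[0](p, q)     # update p from q
    q, p = layers[1](q, p)     # update q from p (roles swapped)
    return np.concatenate([p, q])

# Numerical check of volume preservation: |det(dnet/dz)| should be ~1.
z0 = rng.standard_normal(3)
eps = 1e-6
Jnum = np.column_stack([(net(z0 + eps * e) - net(z0 - eps * e)) / (2 * eps)
                        for e in np.eye(3)])
print("det(J) =", np.linalg.det(Jnum))   # ~1 up to finite-difference error
```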
- Constrained Block Nonlinear Neural Dynamical Models [1.3163098563588727]
Neural network modules conditioned by known priors can be effectively trained and combined to represent systems with nonlinear dynamics.
The proposed method consists of neural network blocks that represent input, state, and output dynamics with constraints placed on the network weights and system variables.
We evaluate the performance of the proposed architecture and training methods on system identification tasks for three nonlinear systems.
arXiv Detail & Related papers (2021-01-06T04:27:54Z)
- Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear Dynamics [49.41640137945938]
We propose a neural dynamic mode decomposition for estimating a lift function based on neural networks.
With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition.
Our experiments demonstrate the effectiveness of our proposed method in terms of eigenvalue estimation and forecast performance.
arXiv Detail & Related papers (2020-12-11T08:34:26Z)
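For contrast with the neural variant above, classical dynamic mode decomposition estimates eigenvalues from a least-squares linear fit between snapshot pairs. A minimal sketch on synthetic data whose spectrum is known (all values illustrative):

```python
import numpy as np

# Synthetic data from a known linear system, so the true spectrum is known:
# a decaying rotation with eigenvalues 0.98 * exp(+/- i * 0.1).
theta = 0.1
A_true = 0.98 * np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
X = np.empty((2, 201))
X[:, 0] = [1.0, 0.0]
for k in range(200):
    X[:, k + 1] = A_true @ X[:, k]

# Classical DMD: least-squares fit of a linear map between snapshot pairs.
X0, X1 = X[:, :-1], X[:, 1:]
A_dmd = X1 @ np.linalg.pinv(X0)
print("true eigenvalues:", np.linalg.eigvals(A_true))
print("DMD eigenvalues: ", np.linalg.eigvals(A_dmd))
```

The neural DMD of the entry above replaces the identity observables used here with a learned lift function and backpropagates forecast error through the spectral decomposition.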
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
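A simplified sketch of the forced-linear-system idea in the entry above: stack the state and the forcing, then fit the state and input matrices jointly with one least-squares solve (a DMD-with-control-style fit on synthetic data; the paper's stochastically forced ensemble formulation adds considerably more machinery).

```python
import numpy as np

# Synthetic near-periodic "load" driven by a known forcing signal.
A_true = np.array([[0.99, -0.10], [0.10, 0.99]])   # slowly rotating mode
B_true = np.array([[0.0], [0.5]])
u = np.sin(0.05 * np.arange(300))[None, :]          # exogenous forcing
X = np.zeros((2, 301))
X[:, 0] = [1.0, 0.0]
for k in range(300):
    X[:, k + 1] = A_true @ X[:, k] + B_true[:, 0] * u[0, k]

# Fit [A B] jointly by least squares on snapshot pairs.
X0, X1 = X[:, :-1], X[:, 1:]
G = np.vstack([X0, u])                 # stack state and forcing
AB = X1 @ np.linalg.pinv(G)
A_fit, B_fit = AB[:, :2], AB[:, 2:]
print("A error:", np.linalg.norm(A_fit - A_true))
print("B error:", np.linalg.norm(B_fit - B_true))
```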
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
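A minimal single-unit sketch of the liquid time-constant form dx/dt = -(1/tau + f(x, u)) x + f(x, u) A, where the gate f makes the effective time constant depend on the state and input; the weights below are arbitrary untrained values chosen for illustration.

```python
import numpy as np

# Sigmoid gate f(x, u): its output modulates the unit's time constant.
def gate(x, u, w=1.5, v=1.0, b=0.0):
    return 1.0 / (1.0 + np.exp(-(w * u + v * x + b)))

tau, A, dt, steps = 1.0, 2.0, 0.01, 2000
x = 0.0
xs = np.empty(steps)
for k in range(steps):
    u = 1.0 if (k * dt) % 4 < 2 else -1.0       # square-wave input
    f = gate(x, u)
    x += dt * (-(1.0 / tau + f) * x + f * A)    # explicit Euler step
    xs[k] = x

# The state remains bounded between 0 and A regardless of the input,
# one of the stability properties highlighted for LTC networks.
print("state range:", xs.min(), xs.max())
```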
This list is automatically generated from the titles and abstracts of the papers on this site.