Long-term prediction of chaotic systems with recurrent neural networks
- URL: http://arxiv.org/abs/2004.01258v1
- Date: Fri, 6 Mar 2020 18:59:44 GMT
- Title: Long-term prediction of chaotic systems with recurrent neural networks
- Authors: Huawei Fan, Junjie Jiang, Chun Zhang, Xingang Wang, and Ying-Cheng Lai
- Abstract summary: Reservoir computing systems have recently been exploited for model-free, data-based prediction of the state evolution of a variety of chaotic dynamical systems.
We articulate a scheme incorporating time-dependent but sparse data inputs into reservoir computing and demonstrate that such rare "updates" of the actual state practically enable an arbitrarily long prediction horizon.
- Score: 1.1499361198674167
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing systems, a class of recurrent neural networks, have
recently been exploited for model-free, data-based prediction of the state
evolution of a variety of chaotic dynamical systems. The prediction horizon
demonstrated has been about half a dozen Lyapunov times. Is it possible to
significantly extend the prediction time beyond what has been achieved so far?
We articulate a scheme incorporating time-dependent but sparse data inputs into
reservoir computing and demonstrate that such rare "updates" of the actual
state practically enable an arbitrarily long prediction horizon for a variety
of chaotic systems. A physical understanding based on the theory of temporal
synchronization is developed.
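A minimal sketch of the scheme described in the abstract, not the authors' implementation: a standard echo state network is trained by teacher forcing and then run in closed-loop prediction, except that every `update_every` steps the fed-back prediction is replaced by the observed state. The reservoir size, spectral radius, ridge parameter, Euler-integrated Lorenz data, and the update interval are all illustrative assumptions.

```python
# Sketch of reservoir computing with sparse true-state updates (illustrative
# hyperparameters and data; not the authors' code).
import numpy as np

rng = np.random.default_rng(0)

def lorenz_data(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    """Integrate the Lorenz system with forward Euler (coarse but simple)."""
    x = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        traj[i] = x
    return traj

# Random reservoir: fixed input weights, recurrent weights rescaled to a
# spectral radius below 1 (a common echo state property heuristic).
n_res, n_in = 300, 3
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def step(r, u):
    """One reservoir update driven by input u."""
    return np.tanh(W @ r + W_in @ u)

# Training: teacher-forced pass, then ridge regression for the linear readout.
data = lorenz_data(6000)
train, test = data[:5000], data[5000:]
r = np.zeros(n_res)
states = np.empty((len(train) - 1, n_res))
for t in range(len(train) - 1):
    r = step(r, train[t])
    states[t] = r
targets = train[1:]
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ targets).T

# Closed-loop prediction with rare "updates" of the actual state: the
# prediction is fed back as input, except every update_every steps the
# observed state is substituted, which resynchronizes the reservoir.
update_every = 50
u = train[-1]
preds = np.empty_like(test)
for t in range(len(test)):
    r = step(r, u)
    u = W_out @ r
    if (t + 1) % update_every == 0:
        u = test[t]
    preds[t] = u

print("RMSE with sparse updates:", np.sqrt(np.mean((preds - test) ** 2)))
```

With `update_every` large, the run is almost entirely model-free prediction; the occasional true-state inputs play the role of the paper's sparse data updates, which the authors explain via temporal synchronization of the reservoir to the true trajectory.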
Related papers
- Constraining Chaos: Enforcing dynamical invariants in the training of recurrent neural networks [0.0]
We introduce a novel training method for machine-learning-based forecasting of chaotic dynamical systems.
The training enforces dynamical invariants of the systems of interest, such as the Lyapunov exponent spectrum and fractal dimension, enabling longer and more stable forecasts when operating with limited data.
arXiv Detail & Related papers (2023-04-24T00:33:47Z)
- Deep learning delay coordinate dynamics for chaotic attractors from partial observable data [0.0]
We utilize deep artificial neural networks to learn discrete-time maps and continuous-time flows of the partial state.
We demonstrate the capacity of deep ANNs to predict chaotic behavior from a scalar observation on a manifold of dimension three via the Lorenz system.
arXiv Detail & Related papers (2022-11-20T19:25:02Z)
- Interpretable Self-Aware Neural Networks for Robust Trajectory Prediction [50.79827516897913]
We introduce an interpretable paradigm for trajectory prediction that distributes the uncertainty among semantic concepts.
We validate our approach on real-world autonomous driving data, demonstrating superior performance over state-of-the-art baselines.
arXiv Detail & Related papers (2022-11-16T06:28:20Z)
- Delay Embedded Echo-State Network: A Predictor for Partially Observed Systems [0.0]
A predictor for partial observations is developed using an echo-state network (ESN) and time delay embedding of the partially observed state.
The proposed method is theoretically justified with Takens' embedding theorem and strong observability of a nonlinear system (a minimal delay-embedding sketch appears after this list).
arXiv Detail & Related papers (2022-11-11T04:13:55Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Forecasting large-scale circulation regimes using deformable convolutional neural networks and global spatiotemporal climate data [86.1450118623908]
We investigate a supervised machine learning approach based on deformable convolutional neural networks (deCNNs).
We forecast the North Atlantic-European weather regimes during extended boreal winter for 1 to 15 days into the future.
Due to its wider field of view, we also observe the deCNN achieving considerably better performance than regular convolutional neural networks at lead times beyond 5-6 days.
arXiv Detail & Related papers (2022-02-10T11:37:00Z)
- Probabilistic Time Series Forecasting with Implicit Quantile Networks [0.7249731529275341]
We combine an autoregressive recurrent neural network to model temporal dynamics with Implicit Quantile Networks to learn a large class of distributions over a time-series target.
Our approach is favorable in terms of point-wise prediction accuracy as well as on estimating the underlying temporal distribution.
arXiv Detail & Related papers (2021-07-08T10:37:24Z)
- Stochastic Recurrent Neural Network for Multistep Time Series Forecasting [0.0]
We leverage advances in deep generative models and the concept of state space models to propose an adaptation of the recurrent neural network for time series forecasting.
Our model preserves the architectural workings of a recurrent neural network for which all relevant information is encapsulated in its hidden states, and this flexibility allows our model to be easily integrated into any deep architecture for sequential modelling.
arXiv Detail & Related papers (2021-04-26T01:43:43Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
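The Delay Embedded Echo-State Network entry above rests on Takens' embedding theorem: a scalar observable, stacked with its own time-delayed copies, can reconstruct the underlying attractor. A minimal sketch of that delay-coordinate construction, with illustrative function name and parameters (not the paper's code):

```python
# Time-delay embedding (Takens): lift a scalar series s into vectors
# [s(t), s(t+tau), ..., s(t+(m-1)*tau)]; for suitable m and tau these
# reconstruct the underlying attractor. Names and values are illustrative.
import numpy as np

def delay_embed(s, m=3, tau=10):
    """Return the m-dimensional delay-coordinate vectors of a scalar series."""
    n = len(s) - (m - 1) * tau          # number of complete embedding vectors
    return np.stack([s[i * tau : i * tau + n] for i in range(m)], axis=1)

# usage: embed a scalar observable (a quasi-periodic stand-in signal here)
t = np.linspace(0, 50, 2000)
s = np.sin(t) + 0.3 * np.sin(np.sqrt(2) * t)
X = delay_embed(s, m=3, tau=10)
print(X.shape)                          # (1980, 3)
```

The embedded rows can then be fed to any predictor, such as the reservoir sketch above, in place of the unobserved full state.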
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.