Can neural networks predict dynamics they have never seen?
- URL: http://arxiv.org/abs/2111.06783v1
- Date: Fri, 12 Nov 2021 15:49:34 GMT
- Title: Can neural networks predict dynamics they have never seen?
- Authors: Anton Pershin, Cedric Beaume, Kuan Li, Steven M. Tobias
- Abstract summary: Neural networks have proven to be remarkably successful for a wide range of complicated tasks.
One of their successes is the skill in prediction of future dynamics given a suitable training set of data.
Previous studies have shown how Echo State Networks (ESNs), a subset of Recurrent Neural Networks, can successfully predict even chaotic systems for times longer than the Lyapunov time.
This study shows that, remarkably, ESNs can successfully predict dynamical behavior that is qualitatively different from any behavior contained in the training set.
- Score: 0.4588028371034407
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks have proven to be remarkably successful for a wide range of
complicated tasks, from image recognition and object detection to speech
recognition and machine translation. One of their successes is the skill in
prediction of future dynamics given a suitable training set of data. Previous
studies have shown how Echo State Networks (ESNs), a subset of Recurrent Neural
Networks, can successfully predict even chaotic systems for times longer than
the Lyapunov time. This study shows that, remarkably, ESNs can successfully
predict dynamical behavior that is qualitatively different from any behavior
contained in the training set. Evidence is provided for a fluid dynamics
problem where the flow can transition between laminar (ordered) and turbulent
(disordered) regimes. Despite being trained on the turbulent regime only, ESNs
are found to predict laminar behavior. Moreover, the statistics of
turbulent-to-laminar and laminar-to-turbulent transitions are also predicted
successfully, and the utility of ESNs in acting as an early-warning system for
transition is discussed. These results are expected to be widely applicable to
data-driven modelling of temporal behaviour in a range of physical, climate,
biological, ecological and finance models characterized by the presence of
tipping points and sudden transitions between several competing states.
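To make the architecture concrete, the following is a minimal sketch of an Echo State Network: a fixed random reservoir driven by the input, with only a linear readout trained by ridge regression. The reservoir size, spectral radius, regularization strength, and the toy sine-wave task are illustrative assumptions, not the setup used in the paper.

```python
import numpy as np

# Minimal echo state network sketch. All hyperparameters and the toy task
# are illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)

n_res, n_in = 200, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed input weights (untrained)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed recurrent weights (untrained)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius < 1 (echo state property)

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 40 * np.pi, 4000)
u = np.sin(t)
X = run_reservoir(u[:-1])[200:]   # drop a washout transient of 200 steps
Y = u[1:][200:]                   # targets: the next value of the series

# Ridge-regression readout: the only trained part of an ESN.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = X @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```

Because the recurrent weights are never trained, fitting reduces to a single linear solve, which is what makes ESNs cheap to train on long time series.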
Related papers
- Learning Continuous Network Emerging Dynamics from Scarce Observations via Data-Adaptive Stochastic Processes [11.494631894700253]
We introduce ODE Processes for Network Dynamics (NDP4ND), a new class of processes governed by data-adaptive network dynamics.
We show that the proposed method has excellent data and computational efficiency, and can adapt to unseen network emerging dynamics.
arXiv Detail & Related papers (2023-10-25T08:44:05Z)
- Deep Neural Networks Tend To Extrapolate Predictably [51.303814412294514]
Neural network predictions tend to be unpredictable and overconfident when faced with out-of-distribution (OOD) inputs.
We observe that neural network predictions often tend towards a constant value as input data becomes increasingly OOD.
We show how one can leverage our insights in practice to enable risk-sensitive decision-making in the presence of OOD inputs.
arXiv Detail & Related papers (2023-10-02T03:25:32Z)
- Variability of echo state network prediction horizon for partially observed dynamical systems [0.0]
We study an echo state network (ESN) framework with partial state input with partial or full state output.
We show that the ESN is capable of making short-term predictions up to a few Lyapunov times.
We show that the ESN can effectively learn the system's dynamics even when trained with noisy numerical or experimental datasets.
arXiv Detail & Related papers (2023-06-19T09:37:18Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Do We Need an Encoder-Decoder to Model Dynamical Systems on Networks? [18.92828441607381]
We show that embeddings induce a model that fits observations well but simultaneously has incorrect dynamical behaviours.
We propose a simple embedding-free alternative based on parametrising two additive vector-field components.
arXiv Detail & Related papers (2023-05-20T12:41:47Z)
- Brain-Inspired Spiking Neural Network for Online Unsupervised Time Series Prediction [13.521272923545409]
We present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN).
CLURSNN makes online predictions by reconstructing the underlying dynamical system using Random Delay Embedding.
We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
arXiv Detail & Related papers (2023-04-10T16:18:37Z)
- Interpretable Self-Aware Neural Networks for Robust Trajectory Prediction [50.79827516897913]
We introduce an interpretable paradigm for trajectory prediction that distributes the uncertainty among semantic concepts.
We validate our approach on real-world autonomous driving data, demonstrating superior performance over state-of-the-art baselines.
arXiv Detail & Related papers (2022-11-16T06:28:20Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Bayesian Neural Networks: An Introduction and Survey [22.018605089162204]
This article introduces Bayesian Neural Networks (BNNs) and the seminal research regarding their implementation.
Different approximate inference methods are compared, and used to highlight where future research can improve on current methods.
arXiv Detail & Related papers (2020-06-22T06:30:15Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
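The construction in the last entry above, networks built from linear first-order dynamical systems whose effective time constants depend on state and input, can be sketched roughly as follows. The explicit-Euler step, sizes, weights, and gating function are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Rough sketch of a liquid time-constant style cell: each unit is a leaky
# first-order system dx/dt = -x/tau + f(x, I) * (A - x), so the gate f
# modulates the unit's effective time constant. All sizes and weights are
# illustrative assumptions.
rng = np.random.default_rng(1)
n, n_in = 8, 2
tau, A, dt = 1.0, 1.0, 0.05
W = rng.normal(0, 0.5, (n, n))
U = rng.normal(0, 0.5, (n, n_in))
b = rng.normal(0, 0.1, n)

def step(x, i):
    """One explicit-Euler step of the cell."""
    f = 1.0 / (1.0 + np.exp(-(W @ x + U @ i + b)))  # gate in (0, 1)
    dx = -x / tau + f * (A - x)                     # leak plus gated pull toward A
    return x + dt * dx

# Drive the cell with a slow sinusoidal input; the state stays in [0, A],
# consistent with the bounded behavior claimed for this model family.
x = np.zeros(n)
for t in range(200):
    i = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    x = step(x, i)
print(x)
```

With the gate confined to (0, 1) and this leak term, each state component remains trapped between 0 and A, which is one way to see why such networks stay stable and bounded.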
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.