Learning ergodic averages in chaotic systems
- URL: http://arxiv.org/abs/2001.04027v2
- Date: Tue, 7 Apr 2020 11:50:08 GMT
- Title: Learning ergodic averages in chaotic systems
- Authors: Francisco Huhn, Luca Magri
- Abstract summary: We propose a machine learning method to predict the time average of a chaotic attractor.
The method is based on the hybrid echo state network (hESN).
- Score: 6.85316573653194
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a physics-informed machine learning method to predict the time
average of a chaotic attractor. The method is based on the hybrid echo state
network (hESN). We assume that the system is ergodic, so the time average is
equal to the ergodic average. Compared to conventional echo state networks
(ESN) (purely data-driven), the hESN uses additional information from an
incomplete, or imperfect, physical model. We evaluate the performance of the
hESN and compare it to that of an ESN. This approach is demonstrated on a
chaotic time-delayed thermoacoustic system, where the inclusion of a physical
model significantly improves the accuracy of the prediction, reducing the
relative error from 48% to 7%. This improvement is obtained at the low extra
cost of solving two ordinary differential equations. This framework shows the
potential of using machine learning techniques combined with prior physical
knowledge to improve the prediction of time-averaged quantities in chaotic
systems.
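A minimal, self-contained sketch of the idea is given below: an echo state network is trained by ridge regression for one-step-ahead prediction, run autonomously, and its output is time-averaged, which under the ergodicity assumption approximates the attractor (ergodic) average. This is a purely data-driven ESN in NumPy, not the authors' hESN: the Lorenz-63 system stands in for the time-delayed thermoacoustic model, and the reservoir size, spectral radius, and regularization are illustrative assumptions. In the hybrid variant, the output of an imperfect physical model would additionally be fed to the reservoir, at the extra cost noted in the abstract.

```python
# Minimal echo state network (ESN) sketch for predicting a long-time average of
# a chaotic system. Illustrative only: the paper uses a hybrid ESN (hESN) on a
# time-delayed thermoacoustic model; here Lorenz-63 stands in and all
# hyperparameters are assumptions, not the authors' settings.
import numpy as np

rng = np.random.default_rng(0)

def lorenz_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit-Euler step of the Lorenz-63 system (stand-in chaotic system)."""
    dx = np.array([sigma * (x[1] - x[0]),
                   x[0] * (rho - x[2]) - x[1],
                   x[0] * x[1] - beta * x[2]])
    return x + dt * dx

# Generate training data from the "true" system.
n_steps, washout = 20000, 1000
traj = np.empty((n_steps, 3))
x = np.array([1.0, 1.0, 1.0])
for i in range(n_steps):
    x = lorenz_step(x)
    traj[i] = x

# Reservoir: fixed random input and recurrent weights; only the readout is trained.
n_res, dim = 300, 3
W_in = rng.uniform(-0.5, 0.5, (n_res, dim))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.9

def advance(r, u):
    # A hybrid ESN (hESN) would also concatenate the output of an imperfect
    # physical model to the input u; omitted here for brevity.
    return np.tanh(W @ r + W_in @ u)

# Teacher-forced reservoir states.
R = np.empty((n_steps - 1, n_res))
r = np.zeros(n_res)
for i in range(n_steps - 1):
    r = advance(r, traj[i])
    R[i] = r

# Ridge regression for the readout (one-step-ahead prediction targets).
X, Y = R[washout:], traj[washout + 1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T

# Closed-loop (autonomous) prediction, then the ergodic/time average.
n_pred = 20000
u = traj[-1]
avg = np.zeros(dim)
for _ in range(n_pred):
    r = advance(r, u)
    u = W_out @ r
    avg += u
avg /= n_pred

print("ESN long-time average: ", avg)
print("true long-time average:", traj[washout:].mean(axis=0))
```

The same skeleton applies to any scalar observable of the state; only the quantity being accumulated in the autonomous loop changes.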
Related papers
- Neural Incremental Data Assimilation [8.817223931520381]
We introduce a deep learning approach where the physical system is modeled as a sequence of coarse-to-fine Gaussian prior distributions parametrized by a neural network.
This allows us to define an assimilation operator, which is trained in an end-to-end fashion to minimize the reconstruction error.
We illustrate our approach on chaotic dynamical physical systems with sparse observations, and compare it to traditional variational data assimilation methods.
arXiv Detail & Related papers (2024-06-21T11:42:55Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Deep Neural Networks Tend To Extrapolate Predictably [51.303814412294514]
Neural network predictions are often thought to be unpredictable and overconfident when faced with out-of-distribution (OOD) inputs.
We observe that neural network predictions often tend towards a constant value as input data becomes increasingly OOD.
We show how one can leverage our insights in practice to enable risk-sensitive decision-making in the presence of OOD inputs.
arXiv Detail & Related papers (2023-10-02T03:25:32Z)
- Unmatched uncertainty mitigation through neural network supported model predictive control [7.036452261968766]
We utilize a deep neural network (DNN) as an oracle in the underlying optimization problem of learning-based MPC (LBMPC).
We employ a dual-timescale adaptation mechanism, where the weights of the last layer of the neural network are updated in real time.
Results indicate that the proposed approach is implementable in real time and carries the theoretical guarantees of LBMPC.
arXiv Detail & Related papers (2023-04-22T04:49:48Z)
- A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive Coding Networks [65.34977803841007]
Predictive coding networks are neuroscience-inspired models with roots in both Bayesian statistics and neuroscience.
We show that simply changing the temporal scheduling of the update rule for the synaptic weights leads to an algorithm that is much more efficient and stable than the original one.
arXiv Detail & Related papers (2022-11-16T00:11:04Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Deep learning-enhanced ensemble-based data assimilation for high-dimensional nonlinear dynamical systems [0.0]
The ensemble Kalman filter (EnKF) is a data assimilation (DA) algorithm widely used in applications involving high-dimensional nonlinear dynamical systems.
In this work, we propose a hybrid ensemble Kalman filter (H-EnKF), which is applied to a two-layer quasi-geostrophic flow system as a test case.
arXiv Detail & Related papers (2022-06-09T23:34:49Z)
- Observation Error Covariance Specification in Dynamical Systems for Data Assimilation using Recurrent Neural Networks [0.5330240017302621]
We propose a data-driven approach based on long short-term memory (LSTM) recurrent neural networks (RNNs).
The proposed approach does not require any knowledge or assumption about prior error distribution.
We have compared the novel approach with two state-of-the-art covariance tuning algorithms, namely DI01 and D05.
arXiv Detail & Related papers (2021-11-11T20:23:00Z)
- TELESTO: A Graph Neural Network Model for Anomaly Classification in Cloud Services [77.454688257702]
Machine learning (ML) and artificial intelligence (AI) are applied to IT system operation and maintenance.
One direction aims at the recognition of re-occurring anomaly types to enable remediation automation.
We propose a method that is invariant to dimensionality changes of given data.
arXiv Detail & Related papers (2021-02-25T14:24:49Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Physics-Informed Echo State Networks [5.8010446129208155]
We propose a physics-informed Echo State Network (ESN) to predict the evolution of chaotic systems.
Compared to conventional ESNs, physics-informed ESNs are trained to solve supervised learning tasks while being constrained by the physical equations that govern the system.
The proposed framework shows the potential of using machine learning combined with prior physical knowledge to improve the time-accurate prediction of chaotic systems.
arXiv Detail & Related papers (2020-10-31T11:47:33Z)
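The physics-informed ESN entry above combines the usual data-fitting objective with a penalty on violations of the governing equations evaluated on the network's own predictions. The fragment below is a hedged sketch of that general idea rather than the cited paper's formulation: the Lorenz-63 right-hand side, the forward-difference residual, and the weight lam are assumptions introduced for illustration.

```python
# Sketch of a physics-informed penalty: the residual of an assumed governing
# equation, evaluated on the network's predictions, is added to the data misfit.
# All names and the Lorenz-63 right-hand side are illustrative assumptions.
import numpy as np

def lorenz_rhs(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side dx/dt = f(x) of the (assumed) governing equations."""
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def physics_residual(pred, dt):
    """Mean squared mismatch between the finite-difference time derivative of
    the predictions and the physical right-hand side."""
    dpred_dt = (pred[1:] - pred[:-1]) / dt                    # forward differences
    rhs = np.array([lorenz_rhs(x) for x in pred[:-1]])
    return np.mean((dpred_dt - rhs) ** 2)

def total_loss(pred, target, dt, lam=1.0):
    """Data misfit plus weighted physics residual (lam is an assumed weight)."""
    data_term = np.mean((pred - target) ** 2)
    return data_term + lam * physics_residual(pred, dt)

if __name__ == "__main__":
    # Tiny demo on fabricated prediction/target arrays (random numbers, no ESN).
    rng = np.random.default_rng(0)
    pred = rng.normal(size=(100, 3))
    target = pred + 0.1 * rng.normal(size=(100, 3))
    print("combined loss:", total_loss(pred, target, dt=0.01))
```

One natural use of such a term is to regularize predictions beyond the labelled training window, where no data are available.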
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.