Reservoir Computing with Diverse Timescales for Prediction of Multiscale Dynamics
- URL: http://arxiv.org/abs/2108.09446v1
- Date: Sat, 21 Aug 2021 06:52:21 GMT
- Title: Reservoir Computing with Diverse Timescales for Prediction of Multiscale Dynamics
- Authors: Gouhei Tanaka, Tadayoshi Matsumori, Hiroaki Yoshida, Kazuyuki Aihara
- Abstract summary: We propose a reservoir computing model with diverse timescales by using a recurrent network of heterogeneous leaky integrator neurons.
In prediction tasks with fast-slow chaotic dynamical systems, we demonstrate that the proposed model has a higher potential than the existing standard model.
Our analysis reveals that the timescales required for producing each component of target dynamics are appropriately and flexibly selected from the reservoir dynamics by model training.
- Score: 5.172455794487599
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning approaches have recently been leveraged as a substitute or
an aid for physical/mathematical modeling approaches to dynamical systems. To
develop an efficient machine learning method dedicated to modeling and
prediction of multiscale dynamics, we propose a reservoir computing model with
diverse timescales by using a recurrent network of heterogeneous leaky
integrator neurons. In prediction tasks with fast-slow chaotic dynamical systems, whose subsystem dynamics exhibit a large gap in timescales, we demonstrate that the proposed model has a higher potential than the existing standard model and matches the standard model's best performance even without optimization of the leak rate parameter. Our
analysis reveals that the timescales required for producing each component of
target dynamics are appropriately and flexibly selected from the reservoir
dynamics by model training.
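As a rough illustration of the core idea, the sketch below builds a leaky-integrator echo state network in which every reservoir neuron gets its own leak rate, so the reservoir itself spans diverse timescales. It is not the authors' implementation: the reservoir size, the log-uniform leak-rate range, the toy two-timescale input signal, and the ridge-regression readout are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_in = 300, 1                        # reservoir size, input dimension

# Recurrent and input weights with a standard echo-state initialization.
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # rescale spectral radius
W_in = rng.uniform(-0.5, 0.5, (N, n_in))

# Heterogeneous leak rates: one timescale per neuron instead of a single
# global leak rate (the "diverse timescales" ingredient).
leak = 10 ** rng.uniform(-2, 0, N)      # leak rates spread over two decades

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and return its states."""
    x = np.zeros(N)
    states = []
    for u in u_seq:
        pre = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        x = (1.0 - leak) * x + leak * pre          # per-neuron leaky integration
        states.append(x.copy())
    return np.array(states)

# Toy teacher signal mixing a fast and a slow oscillation.
t = np.arange(2000) * 0.02
u = np.sin(t) + 0.3 * np.sin(0.05 * t)
X = run_reservoir(u[:-1])
Y = u[1:]                               # one-step-ahead prediction target

# Ridge-regression readout (only the readout is trained).
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(N), X.T @ Y)
print("training MSE:", np.mean((X @ W_out - Y) ** 2))
```

Because only the linear readout is trained, each output component can draw on the reservoir units whose timescales suit it, which is the selection mechanism the abstract refers to.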
Related papers
- eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently learns a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z)
- CoDBench: A Critical Evaluation of Data-driven Models for Continuous Dynamical Systems [8.410938527671341]
We introduce CoDBench, an exhaustive benchmarking suite comprising 11 state-of-the-art data-driven models for solving differential equations.
Specifically, we evaluate 4 distinct categories of models, viz., feed-forward neural networks, deep operator regression models, frequency-based neural operators, and transformer architectures.
We conduct extensive experiments, assessing the operators' capabilities in learning, zero-shot super-resolution, data efficiency, robustness to noise, and computational efficiency.
arXiv Detail & Related papers (2023-10-02T21:27:54Z)
- Cheap and Deterministic Inference for Deep State-Space Models of Interacting Dynamical Systems [38.23826389188657]
We present a deep state-space model which employs graph neural networks in order to model the underlying interacting dynamical system.
The predictive distribution is multimodal and has the form of a Gaussian mixture model, where the moments of the Gaussian components can be computed via deterministic moment matching rules.
Our moment matching scheme can be exploited for sample-free inference, leading to more efficient and stable training compared to Monte Carlo alternatives.
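The deterministic moment-matching step itself can be written in closed form. The sketch below shows only that piece, collapsing a Gaussian mixture into a single Gaussian via the law of total covariance; the graph neural network transition model and the paper's full sample-free inference scheme are not reproduced, and all names here are illustrative.

```python
import numpy as np

def collapse_gaussian_mixture(weights, means, covs):
    """Moment-match a Gaussian mixture with a single Gaussian.

    weights: (K,)      mixture weights summing to one
    means:   (K, D)    component means
    covs:    (K, D, D) component covariances
    """
    weights = np.asarray(weights)
    mean = np.einsum("k,kd->d", weights, means)
    diffs = means - mean
    # Law of total covariance: average component covariance
    # plus the covariance of the component means.
    cov = (np.einsum("k,kij->ij", weights, covs)
           + np.einsum("k,ki,kj->ij", weights, diffs, diffs))
    return mean, cov

# Example with two 2-D components.
w = np.array([0.7, 0.3])
mu = np.array([[0.0, 0.0], [2.0, 1.0]])
sig = np.array([0.1 * np.eye(2), 0.5 * np.eye(2)])
m, S = collapse_gaussian_mixture(w, mu, sig)
print(m)
print(S)
```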
arXiv Detail & Related papers (2023-05-02T20:30:23Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
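A toy sketch of this workflow under broad assumptions: a small network is trained once as a differentiable surrogate of a (here trivial) parameterized simulator, and automatic differentiation through the frozen surrogate then recovers the unknown parameter from noisy "experimental" data. The one-parameter toy simulator, the network architecture, and the optimizer settings are illustrative, not the framework described in the paper.

```python
import torch

# Toy stand-in for a model-Hamiltonian simulation: observable y(J, q) = J * cos(q).
def simulate(J, q):
    return J * torch.cos(q)

q = torch.linspace(0.0, 3.0, 64)

# 1) Train a differentiable surrogate of the simulator once.
net = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(3000):
    J_batch = torch.rand(256, 1) * 2.0        # couplings sampled in [0, 2]
    q_batch = torch.rand(256, 1) * 3.0
    pred = net(torch.cat([J_batch, q_batch], dim=1))
    loss = torch.mean((pred - simulate(J_batch, q_batch)) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

# 2) Freeze the surrogate and recover the unknown parameter from noisy data
#    by differentiating the fit error with respect to that parameter.
for p in net.parameters():
    p.requires_grad_(False)
data = simulate(torch.tensor(1.3), q) + 0.01 * torch.randn_like(q)   # "experiment"
J_hat = torch.tensor(0.5, requires_grad=True)
fit_opt = torch.optim.Adam([J_hat], lr=1e-2)
for _ in range(500):
    inp = torch.stack([J_hat * torch.ones_like(q), q], dim=1)
    residual = net(inp).squeeze(-1) - data
    fit_loss = torch.mean(residual ** 2)
    fit_opt.zero_grad(); fit_loss.backward(); fit_opt.step()
print("recovered parameter:", float(J_hat))
```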
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Decomposed Linear Dynamical Systems (dLDS) for learning the latent components of neural dynamics [6.829711787905569]
We propose a new decomposed dynamical system model that represents complex non-stationary and nonlinear dynamics of time series data.
Our model is trained through a dictionary learning procedure, where we leverage recent results in tracking sparse vectors over time.
In both continuous-time and discrete-time instructional examples, we demonstrate that our model approximates the original system well.
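The generative picture behind such a decomposed model can be sketched directly, under general assumptions and without the dictionary-learning and sparse-tracking training procedure: the transition operator at each time is a sparse, time-varying combination of a few linear operators from a dictionary. The dictionary, coefficients, and dimensions below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
D, J, T = 3, 4, 200                      # state dimension, dictionary size, steps

def scaled(A, rho=0.98):
    """Rescale an operator to spectral radius rho so rollouts stay bounded."""
    return A * (rho / np.max(np.abs(np.linalg.eigvals(A))))

# Dictionary of linear operators A_j (learned from data in the actual model).
dictionary = [scaled(np.eye(D) + 0.2 * rng.normal(size=(D, D))) for _ in range(J)]

# Sparse, piecewise-constant coefficients c_j(t): few operators active at a time.
coeffs = np.zeros((T, J))
coeffs[:100, 0] = 1.0                    # first regime governed by operator 0
coeffs[100:, 1] = 0.8                    # second regime: mostly operator 1...
coeffs[100:, 2] = 0.2                    # ...plus a small contribution of operator 2

# Roll the decomposed dynamics forward: x_{t+1} = (sum_j c_j(t) A_j) x_t.
x = rng.normal(size=D)
trajectory = [x]
for t in range(T - 1):
    A_t = sum(c * A for c, A in zip(coeffs[t], dictionary))
    x = A_t @ x
    trajectory.append(x)
print(np.array(trajectory).shape)        # (200, 3) non-stationary latent trajectory
```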
arXiv Detail & Related papers (2022-06-07T02:25:38Z)
- Gradient-Based Trajectory Optimization With Learned Dynamics [80.41791191022139]
We use machine learning techniques to learn a differentiable dynamics model of the system from data.
We show that a neural network can model highly nonlinear behaviors accurately for large time horizons.
In our hardware experiments, we demonstrate that our learned model can represent complex dynamics for both the Spot robot and a radio-controlled (RC) car.
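A compact sketch of the general recipe (not the paper's robots, model class, or optimizer): fit a small network to one-step transitions of a toy point mass, then optimize an action sequence by backpropagating a terminal cost through rollouts of the learned model. All names and hyperparameters are illustrative.

```python
import torch

# Toy ground-truth system: 1-D point mass, state = (position, velocity).
def true_step(s, a):
    pos, vel = s[..., 0], s[..., 1]
    vel = vel + 0.1 * a
    pos = pos + 0.1 * vel
    return torch.stack([pos, vel], dim=-1)

# 1) Learn a differentiable dynamics model from random transitions.
model = torch.nn.Sequential(torch.nn.Linear(3, 64), torch.nn.Tanh(), torch.nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    s = torch.rand(256, 2) * 2 - 1
    a = torch.rand(256) * 2 - 1
    pred = model(torch.cat([s, a.unsqueeze(-1)], dim=-1))
    loss = torch.mean((pred - true_step(s, a)) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

# 2) Gradient-based trajectory optimization through the frozen learned model.
for p in model.parameters():
    p.requires_grad_(False)
actions = torch.zeros(30, requires_grad=True)        # action sequence over the horizon
traj_opt = torch.optim.Adam([actions], lr=0.05)
goal = torch.tensor([1.0, 0.0])
for _ in range(300):
    s = torch.tensor([0.0, 0.0])
    cost = 0.0
    for a in actions:                                 # roll out the learned model
        s = model(torch.cat([s, a.unsqueeze(0)]))
        cost = cost + 1e-3 * a ** 2                   # small action penalty
    cost = cost + torch.sum((s - goal) ** 2)          # terminal cost: reach the goal
    traj_opt.zero_grad(); cost.backward(); traj_opt.step()
print("predicted final state:", s.detach())
```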
arXiv Detail & Related papers (2022-04-09T22:07:34Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
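A bare-bones latent ODE sketch, shown only to make the ingredients concrete: a learned vector field drives a latent state, and a code with one dimension per system input conditions that vector field, so different input settings produce different latent trajectories. The Euler integrator, the untrained network, and the "one dimension per input" encoding are simplifying assumptions, not the paper's structured specification.

```python
import torch

latent_dim, n_inputs = 4, 2

# Learned vector field dz/dt = f(z, c), conditioned on an input code c.
f = torch.nn.Sequential(torch.nn.Linear(latent_dim + n_inputs, 32),
                        torch.nn.Tanh(),
                        torch.nn.Linear(32, latent_dim))

def integrate(z0, c, dt=0.05, steps=100):
    """Euler-integrate the latent ODE for a fixed input code c."""
    z, path = z0, [z0]
    for _ in range(steps):
        z = z + dt * f(torch.cat([z, c], dim=-1))
        path.append(z)
    return torch.stack(path)

z0 = torch.zeros(latent_dim)
# Two settings of the system inputs yield two different latent trajectories.
path_a = integrate(z0, torch.tensor([1.0, 0.0]))
path_b = integrate(z0, torch.tensor([0.0, 1.0]))
print(path_a.shape, path_b.shape)        # torch.Size([101, 4]) each
```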
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Model-Free Prediction of Chaotic Systems Using High Efficient Next-generation Reservoir Computing [4.284497690098487]
A new paradigm of reservoir computing is proposed for achieving model-free prediction of both low-dimensional and large chaotic systems.
By taking the Lorenz and Kuramoto-Sivashinsky equations as two classical examples of dynamical systems, numerical simulations are conducted.
The results show that our model outperforms the latest reservoir computing methods on prediction tasks.
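A minimal sketch in the spirit of next-generation reservoir computing (time-delay embedding plus quadratic polynomial features and a ridge-regression readout), applied here to a toy two-frequency signal rather than the Lorenz or Kuramoto-Sivashinsky systems used in the paper; the delay depth and regularization strength are illustrative.

```python
import numpy as np

def ngrc_features(u, k):
    """Constant term + the current and k-1 delayed values + their quadratic monomials."""
    T = len(u)
    lin = np.stack([u[k - 1 - d : T - d] for d in range(k)], axis=1)   # (T-k+1, k)
    quad = np.stack([lin[:, i] * lin[:, j]
                     for i in range(k) for j in range(i, k)], axis=1)
    const = np.ones((lin.shape[0], 1))
    return np.concatenate([const, lin, quad], axis=1)

# Toy scalar series and one-step-ahead targets.
t = np.arange(3000) * 0.05
u = np.sin(t) + 0.5 * np.sin(2.2 * t)
k = 4
X = ngrc_features(u[:-1], k)             # features from past values only
Y = u[k:]                                # the value one step ahead

reg = 1e-8
W = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ Y)
print("training MSE:", np.mean((X @ W - Y) ** 2))
```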
arXiv Detail & Related papers (2021-10-19T12:49:24Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consist of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Trajectory-wise Multiple Choice Learning for Dynamics Generalization in Reinforcement Learning [137.39196753245105]
We present a new model-based reinforcement learning algorithm that learns a multi-headed dynamics model for dynamics generalization.
We incorporate context learning, which encodes dynamics-specific information from past experiences into the context latent vector.
Our method exhibits superior zero-shot generalization performance across a variety of control tasks, compared to state-of-the-art RL methods.
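To make the multi-headed idea concrete in isolation (a hedged sketch only: the context encoder, the trajectory-wise head selection, and the model-based RL loop from the paper are omitted), each head predicts the next state and a winner-take-all style loss updates only the best head for each sample:

```python
import torch

n_heads, state_dim, action_dim = 4, 3, 1
heads = torch.nn.ModuleList([
    torch.nn.Sequential(torch.nn.Linear(state_dim + action_dim, 64),
                        torch.nn.ReLU(),
                        torch.nn.Linear(64, state_dim))
    for _ in range(n_heads)
])
opt = torch.optim.Adam(heads.parameters(), lr=1e-3)

def multiple_choice_loss(s, a, s_next):
    """Keep, per sample, only the error of the most accurate head."""
    x = torch.cat([s, a], dim=-1)
    errors = torch.stack([((h(x) - s_next) ** 2).mean(dim=-1) for h in heads])
    return errors.min(dim=0).values.mean()

# One illustrative update on random data standing in for collected transitions.
s = torch.randn(128, state_dim)
a = torch.randn(128, action_dim)
s_next = s + 0.1 * a                     # dummy transition target
loss = multiple_choice_loss(s, a, s_next)
opt.zero_grad(); loss.backward(); opt.step()
print(float(loss))
```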
arXiv Detail & Related papers (2020-10-26T03:20:42Z)
- Hybrid modeling: Applications in real-time diagnosis [64.5040763067757]
We outline a novel hybrid modeling approach that combines machine-learning-inspired models and physics-based models.
We are using such models for real-time diagnosis applications.
arXiv Detail & Related papers (2020-03-04T00:44:57Z)
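A generic hybrid-modeling sketch in this spirit, purely illustrative and not the paper's models or diagnosis setting: a physics-based one-step model with a deliberately wrong damping coefficient provides the baseline prediction, and a small ridge regression learns its residual from data, so the hybrid prediction is physics plus learned correction.

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 0.05

def true_step(x):            # "real" system: a damped oscillator
    pos, vel = x
    return np.array([pos + dt * vel, vel + dt * (-pos - 0.30 * vel)])

def physics_step(x):         # knowledge-based model with the wrong damping value
    pos, vel = x
    return np.array([pos + dt * vel, vel + dt * (-pos - 0.10 * vel)])

# Collect noisy observations and fit the residual of the physics model.
X, R = [], []
x = np.array([1.0, 0.0])
for _ in range(2000):
    x_next = true_step(x) + 1e-3 * rng.normal(size=2)
    X.append(x)
    R.append(x_next - physics_step(x))   # what the physics model misses
    x = x_next
X, R = np.array(X), np.array(R)
W = np.linalg.solve(X.T @ X + 1e-6 * np.eye(2), X.T @ R)   # ridge regression

def hybrid_step(x):
    """Physics-based prediction plus the learned data-driven correction."""
    return physics_step(x) + x @ W

x0 = np.array([0.5, 0.5])
print("physics-only error:", np.linalg.norm(physics_step(x0) - true_step(x0)))
print("hybrid model error:", np.linalg.norm(hybrid_step(x0) - true_step(x0)))
```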
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.