Next-Generation Reservoir Computing for Dynamical Inference
- URL: http://arxiv.org/abs/2509.11338v1
- Date: Sun, 14 Sep 2025 16:28:48 GMT
- Title: Next-Generation Reservoir Computing for Dynamical Inference
- Authors: Rok Cestnik, Erik A. Martens
- Abstract summary: We present a simple and scalable implementation of next-generation reservoir computing for modeling dynamical systems from time series data. Our approach uses a pseudorandom nonlinear projection of time-delay embedded input, allowing an arbitrary dimension of the feature space. We apply the method to benchmark tasks -- including attractor reconstruction and bifurcation diagram estimation -- using only partial and noisy observations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a simple and scalable implementation of next-generation reservoir computing for modeling dynamical systems from time series data. Our approach uses a pseudorandom nonlinear projection of time-delay embedded input, allowing an arbitrary dimension of the feature space, thus providing a flexible alternative to the polynomial-based projections used in previous next-generation reservoir computing variants. We apply the method to benchmark tasks -- including attractor reconstruction and bifurcation diagram estimation -- using only partial and noisy observations. We also include an exploratory example of estimating asymptotic oscillation phases. The models remain stable over long rollouts and generalize beyond training data. This framework enables the precise control of system state and is well suited for surrogate modeling and digital twin applications.
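The abstract describes the pipeline only at a high level. Below is a minimal sketch of the idea, assuming a tanh random-feature map for the pseudorandom nonlinear projection and a ridge-regression readout; the delay length, feature dimension, nonlinearity, and regularization strength are illustrative choices, not values from the paper.

```python
import numpy as np

def delay_embed(x, k):
    """Stack k consecutive observations into one time-delay embedded
    vector per step; x has shape (T, d), the result (T - k + 1, k * d)."""
    T = x.shape[0]
    return np.hstack([x[i:T - k + 1 + i] for i in range(k)])

def random_features(z, W, b):
    """Pseudorandom nonlinear projection of the embedded input; the number
    of rows of W sets the feature dimension and can be made arbitrarily large."""
    return np.tanh(z @ W.T + b)

rng = np.random.default_rng(0)

# Toy data: a noisy scalar observation (stand-in for partial measurements of a system).
T = 2000
x = np.sin(0.1 * np.arange(T))[:, None] + 0.01 * rng.standard_normal((T, 1))

k, D, ridge = 5, 300, 1e-6                  # delay length, feature dimension, regularization (assumed)
Z = delay_embed(x, k)                       # time-delay embedded inputs
W = rng.standard_normal((D, Z.shape[1])) / np.sqrt(Z.shape[1])
b = rng.standard_normal(D)

Phi = random_features(Z[:-1], W, b)         # features built from the window ending at time t
Y = x[k:]                                   # target: the observation at time t + 1
W_out = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(D), Phi.T @ Y)  # ridge-regression readout

# Closed-loop rollout: feed each prediction back into the delay window.
window = Z[-1].copy()
forecast = []
for _ in range(200):
    y = random_features(window[None, :], W, b) @ W_out   # one-step prediction
    forecast.append(y.ravel())
    window = np.concatenate([window[x.shape[1]:], y.ravel()])
forecast = np.array(forecast)
```

Because the projection is a random-feature map rather than a fixed polynomial basis, the feature dimension D can be chosen freely to trade accuracy against cost, which is the flexibility the abstract emphasizes.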
Related papers
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z) - Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z) - Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z) - Fast Estimation of Bayesian State Space Models Using Amortized
Simulation-Based Inference [0.0]
This paper presents a fast algorithm for estimating hidden states of Bayesian state space models.
After pretraining, finding the posterior distribution for any dataset takes from hundredths to tenths of a second.
arXiv Detail & Related papers (2022-10-13T16:37:05Z) - Learning and Inference in Sparse Coding Models with Langevin Dynamics [3.0600309122672726]
We describe a system capable of inference and learning in a probabilistic latent variable model.
We demonstrate this idea for a sparse coding model by deriving a continuous-time equation for inferring its latent variables via Langevin dynamics.
We show that Langevin dynamics lead to an efficient procedure for sampling from the posterior distribution in the 'L0 sparse' regime, where latent variables are encouraged to be set to zero as opposed to having a small L1 norm.
arXiv Detail & Related papers (2022-04-23T23:16:47Z) - `Next Generation' Reservoir Computing: an Empirical Data-Driven
Expression of Dynamical Equations in Time-Stepping Form [0.0]
Next generation reservoir computing based on nonlinear vector autoregression is applied to emulate simple dynamical system models.
It is also shown that the approach can be extended to produce high-order numerical schemes directly from data.
The impact of noise and temporal sparsity in the training set is examined to gauge the potential of this method for more realistic applications (a generic sketch of the underlying NVAR feature construction appears after this list).
arXiv Detail & Related papers (2022-01-13T20:13:33Z) - Time varying regression with hidden linear dynamics [74.9914602730208]
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system.
Counterintuitively, we show that when the underlying dynamics are stable the parameters of this model can be estimated from data by combining just two ordinary least squares estimates.
arXiv Detail & Related papers (2021-12-29T23:37:06Z) - Model-Free Prediction of Chaotic Systems Using High Efficient
Next-generation Reservoir Computing [4.284497690098487]
A new paradigm of reservoir computing is proposed for achieving model-free prediction of both low-dimensional and large chaotic systems.
Numerical simulations are conducted using the Lorenz and Kuramoto-Sivashinsky equations as two classical examples of dynamical systems.
The results show that our model outperforms the latest reservoir computing methods on prediction tasks.
arXiv Detail & Related papers (2021-10-19T12:49:24Z) - Deep Probabilistic Time Series Forecasting using Augmented Recurrent
Input for Dynamic Systems [12.319812075685956]
We combine advances in deep generative models and state space models (SSMs) to develop a novel, data-driven deep probabilistic sequence model.
Specifically, we follow the popular encoder-decoder generative structure to build a recurrent neural network (RNN) assisted variational sequence model.
To alleviate the inconsistency between training and prediction, we propose using a hybrid output as the input at the next time step, which brings training and prediction into alignment.
arXiv Detail & Related papers (2021-06-03T23:41:11Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
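For contrast with the pseudorandom projection sketched above, the `Next Generation' Reservoir Computing entry in the list builds its features by nonlinear vector autoregression. The following is a generic sketch of that polynomial feature construction (a constant term, the linear delay terms, and their quadratic monomials), based on the standard NVAR formulation rather than on that paper's code.

```python
import numpy as np
from itertools import combinations_with_replacement

def nvar_features(window):
    """Generic NVAR feature vector for one time-delay window:
    constant term, linear delay terms, and all quadratic monomials."""
    lin = np.asarray(window, dtype=float).ravel()
    quad = np.array([a * b for a, b in combinations_with_replacement(lin, 2)])
    return np.concatenate([[1.0], lin, quad])

# Example: a window of k = 3 delayed scalar observations.
print(nvar_features([0.1, -0.2, 0.4]).shape)  # (1 + 3 + 6,) = (10,)
```

Here the feature dimension is fixed by the delay length and polynomial order, which is precisely the constraint the pseudorandom projection above removes.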