Prediction performance of random reservoirs with different topology for nonlinear dynamical systems with different number of degrees of freedom
- URL: http://arxiv.org/abs/2511.22059v1
- Date: Thu, 27 Nov 2025 03:24:06 GMT
- Title: Prediction performance of random reservoirs with different topology for nonlinear dynamical systems with different number of degrees of freedom
- Authors: Shailendra K. Rathor, Lina Jaurigue, Martin Ziegler, Jörg Schumacher
- Abstract summary: Reservoir computing (RC) is a powerful framework for predicting nonlinear dynamical systems. This work investigates how the structure of the network influences the performance of RC in four systems of increasing complexity.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing (RC) is a powerful framework for predicting nonlinear dynamical systems, yet the role of reservoir topology, particularly symmetry in connectivity and weights, remains inadequately understood. This work investigates how the structure of the network influences the performance of RC in four systems of increasing complexity: the Mackey-Glass system with delayed feedback, two low-dimensional thermal convection models, and a three-dimensional shear flow model exhibiting transition to turbulence. Using five reservoir topologies in which connectivity patterns and edge weights are controlled independently, we evaluate both direct- and cross-prediction tasks. The results show that symmetric reservoir networks substantially improve prediction accuracy for the convection-based systems, especially when the input dimension is smaller than the number of degrees of freedom. In contrast, the shear-flow model displays almost no sensitivity to topological symmetry, owing to its strongly chaotic, high-dimensional dynamics. These findings reveal how structural properties of reservoir networks affect their ability to learn complex dynamics and provide guidance for designing more effective RC architectures.
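To make the RC setup described in the abstract concrete, the following is a minimal echo state network sketch in NumPy. The reservoir size, sparsity, spectral radius, the logistic map used as a stand-in target system, and the symmetrization step are all illustrative assumptions, not the paper's actual topologies or benchmark tasks.

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps = 200, 3000                      # reservoir size, trajectory length

# Target: scalar nonlinear time series (logistic map as a stand-in system)
x = np.empty(steps)
x[0] = 0.4
for t in range(steps - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

# Sparse random reservoir; symmetrizing W is one way to impose the weight
# symmetry the abstract refers to (an assumption about the construction).
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.05)
W = 0.5 * (W + W.T)                                  # symmetric variant
W *= 0.9 / np.max(np.abs(np.linalg.eigvalsh(W)))     # spectral radius 0.9
w_in = rng.uniform(-0.5, 0.5, size=N)

# Drive the reservoir with the input and collect the node states
R = np.zeros((steps, N))
r = np.zeros(N)
for t in range(steps):
    r = np.tanh(W @ r + w_in * x[t])
    R[t] = r

# Ridge-regress a linear readout for one-step-ahead prediction
washout, lam = 200, 1e-6
A, y = R[washout:-1], x[washout + 1:]
w_out = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
pred = A @ w_out
print("one-step NRMSE:", np.sqrt(np.mean((pred - y) ** 2)) / np.std(y))
```

Only the readout `w_out` is trained; comparing variants of this script with and without the symmetrization line is the kind of controlled topology comparison the paper performs at much larger scale.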
Related papers
- Learning solution operator of dynamical systems with diffusion maps kernel ridge regression [2.7802667650114485]
We show that a simple kernel ridge regression (KRR) framework provides a strong baseline for long-term prediction of complex dynamical systems. Across a broad range of systems, DM-KRR consistently outperforms state-of-the-art random feature, neural-network and operator-learning methods in both accuracy and data efficiency.
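The KRR baseline mentioned above can be sketched in a few lines. A plain RBF kernel stands in here for the diffusion-maps kernel of DM-KRR (an assumption for illustration), and the logistic map again serves as a toy dynamical system.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, sigma=0.3):
    # Gaussian (RBF) kernel matrix between row-stacked point sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# Training data: pairs (state, next state) sampled from the logistic map
x = np.empty(600)
x[0] = 0.3
for t in range(599):
    x[t + 1] = 3.8 * x[t] * (1.0 - x[t])
X, y = x[:-1, None], x[1:]

# Ridge-regularized kernel fit: (K + lam I) alpha = y
K = rbf(X, X)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(X)), y)

# Predict the next state for unseen points
x_test = np.array([[0.25], [0.6]])
y_hat = rbf(x_test, X) @ alpha   # close to the true map at these points
print(y_hat)
```

Long-term forecasts are then produced by feeding each prediction back in as the next input; the learned map plays the role of a surrogate solution operator.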
arXiv Detail & Related papers (2025-12-19T03:29:23Z) - On-line learning of dynamic systems: sparse regression meets Kalman filtering [41.99844472131922]
We extend sparsity-driven approaches to real-time learning by integrating a cornerstone algorithm from control theory, the Kalman filter (KF). The resulting Sindy Kalman filter (SKF) unifies both frameworks by treating unknown system parameters as state variables. We demonstrate SKF's effectiveness in the real-time identification of a sparse nonlinear aircraft model built from real flight data.
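The core "parameters as state variables" idea can be sketched as follows: for a model that is linear in its coefficients, a Kalman filter with a static parameter state reduces to recursive least squares, and SINDy-style thresholding then enforces sparsity. The feature library, noise levels, and threshold below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

true_theta = np.array([0.0, 1.5, 0.0, -0.8])      # sparse ground-truth coefficients

def phi(x):
    # Candidate feature library (polynomials up to cubic)
    return np.array([1.0, x, x**2, x**3])

theta = np.zeros(4)                # parameter estimate, treated as KF state
P = np.eye(4) * 10.0               # state covariance
r_noise = 0.01                     # assumed measurement variance

for _ in range(500):               # stream noisy measurements one at a time
    x = rng.uniform(-1.0, 1.0)
    y = phi(x) @ true_theta + rng.normal(0.0, 0.1)
    H = phi(x)[None, :]            # 1x4 measurement row (static state: no predict step)
    S = H @ P @ H.T + r_noise      # innovation variance
    K = (P @ H.T) / S              # Kalman gain
    theta = theta + (K * (y - H @ theta)).ravel()
    P = (np.eye(4) - K @ H) @ P

theta[np.abs(theta) < 0.1] = 0.0   # SINDy-style hard thresholding for sparsity
print(theta)                       # close to [0, 1.5, 0, -0.8]
```

Because each measurement is absorbed incrementally, the estimate is available at every time step, which is what makes the combination suitable for on-line identification.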
arXiv Detail & Related papers (2025-11-14T11:24:05Z) - Tensor Network Framework for Forecasting Nonlinear and Chaotic Dynamics [1.790605517028706]
We present a tensor network model (TNM) for forecasting nonlinear and chaotic dynamics. We show that the TNM accurately reconstructs short-term trajectories and faithfully captures the attractor geometry.
arXiv Detail & Related papers (2025-11-12T11:49:38Z) - KITINet: Kinetics Theory Inspired Network Architectures with PDE Simulation Approaches [43.872190335490515]
This paper introduces KITINet, a novel architecture that reinterprets feature propagation through the lens of non-equilibrium particle dynamics. At its core, we propose a residual module that models the feature update as the evolution of a particle system. This formulation mimics particle collisions and energy exchange, enabling adaptive feature refinement via physics-informed interactions.
arXiv Detail & Related papers (2025-05-23T13:58:29Z) - Denoising and Reconstruction of Nonlinear Dynamics using Truncated Reservoir Computing [0.0]
This paper presents a novel Reservoir Computing (RC) method for noise filtering and reconstructing unobserved nonlinear dynamics. The performance of the RC in terms of noise intensity, noise frequency content, and drastic shifts in dynamical parameters is studied. The framework yields competitive accuracy at low signal-to-noise ratios and high-frequency ranges.
arXiv Detail & Related papers (2025-04-17T21:47:13Z) - Exceptional Points and Stability in Nonlinear Models of Population Dynamics having $\mathcal{PT}$ symmetry [49.1574468325115]
We analyze models governed by the replicator equation of evolutionary game theory and related Lotka-Volterra systems of population dynamics. We study the emergence of exceptional points in two cases: (a) when the governing symmetry properties are tied to global properties of the models, and (b) when these symmetries emerge locally around stationary states.
arXiv Detail & Related papers (2024-11-19T02:15:59Z) - Generative Learning of the Solution of Parametric Partial Differential Equations Using Guided Diffusion Models and Virtual Observations [4.798951413107239]
We introduce a generative learning framework to model high-dimensional parametric systems.
We consider systems described by Partial Differential Equations (PDEs) discretized with structured or unstructured grids.
arXiv Detail & Related papers (2024-07-31T20:52:33Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way toward practical utilization of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - TANGO: Time-Reversal Latent GraphODE for Multi-Agent Dynamical Systems [43.39754726042369]
We propose a simple yet effective self-supervised regularization term as a soft constraint that aligns the forward and backward trajectories predicted by a continuous graph neural network-based ordinary differential equation (GraphODE).
It effectively imposes time-reversal symmetry to enable more accurate model predictions across a wider range of dynamical systems under classical mechanics.
Experimental results on a variety of physical systems demonstrate the effectiveness of our proposed method.
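The time-reversal soft constraint described above can be illustrated without any neural network: roll a system forward, roll it backward from the endpoint, and penalize the mismatch between the two trajectories. The toy oscillator and integrator here are assumptions standing in for the learned GraphODE.

```python
import numpy as np

def step(z, dt):
    # Explicit midpoint step for a unit harmonic oscillator (toy dynamics)
    q, p = z
    qm, pm = q + 0.5 * dt * p, p - 0.5 * dt * q
    return np.array([q + dt * pm, p - dt * qm])

def rollout(z0, dt, n):
    traj = [z0]
    for _ in range(n):
        traj.append(step(traj[-1], dt))
    return np.array(traj)

z0, dt, n = np.array([1.0, 0.0]), 0.05, 100
fwd = rollout(z0, dt, n)                 # forward in time
bwd = rollout(fwd[-1], -dt, n)[::-1]     # backward from the endpoint, re-reversed

# Soft constraint: mean squared deviation between the two trajectories.
# In training this term would be added to the prediction loss.
reg = np.mean((fwd - bwd) ** 2)
print("time-reversal penalty:", reg)
```

For dynamics that genuinely obey time-reversal symmetry this penalty is near zero, so minimizing it steers a learned model toward physically consistent trajectories.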
arXiv Detail & Related papers (2023-10-10T08:52:16Z) - Data-driven Nonlinear Parametric Model Order Reduction Framework using Deep Hierarchical Variational Autoencoder [5.521324490427243]
Data-driven parametric model order reduction (MOR) method using a deep artificial neural network is proposed.
LSH-VAE is capable of performing nonlinear MOR for parametric nonlinear dynamic systems with a significant number of degrees of freedom.
arXiv Detail & Related papers (2023-07-10T02:44:53Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - Optimal reservoir computers for forecasting systems of nonlinear dynamics [0.0]
We show that reservoirs of low connectivity perform better than or as well as those of high connectivity in forecasting noiseless Lorenz and coupled Wilson-Cowan systems.
We also show that, unexpectedly, computationally effective reservoirs of unconnected nodes (RUN) outperform reservoirs of linked network topologies in predicting these systems.
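The "reservoir of unconnected nodes" (RUN) finding above has a very simple realization: set the node-to-node coupling to zero, so each node responds to the input only through its own random weight and nonlinearity. The sizes, the logistic-map stand-in system, and the memoryless variant used here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, steps = 100, 2000

# Stand-in target system: logistic map time series
x = np.empty(steps)
x[0] = 0.4
for t in range(steps - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

# Unconnected nodes: no recurrent coupling matrix at all, just N random
# nonlinear features of the current input.
w_in = rng.uniform(-1.0, 1.0, N)
b = rng.uniform(-0.2, 0.2, N)
R = np.tanh(np.outer(x, w_in) + b)

# Ridge-regressed linear readout for one-step-ahead prediction
washout, lam = 100, 1e-6
A, y = R[washout:-1], x[washout + 1:]
w_out = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
nrmse = np.sqrt(np.mean((A @ w_out - y) ** 2)) / np.std(y)
print("RUN one-step NRMSE:", nrmse)
```

Because the target map depends only on the current state, random unconnected features suffice here; tasks requiring memory would instead need leaky nodes or delayed inputs.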
arXiv Detail & Related papers (2022-02-09T09:36:31Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z) - Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.