Dynamics-Informed Reservoir Computing with Visibility Graphs
- URL: http://arxiv.org/abs/2507.19046v1
- Date: Fri, 25 Jul 2025 08:07:17 GMT
- Title: Dynamics-Informed Reservoir Computing with Visibility Graphs
- Authors: Charlotte Geier, Merten Stender
- Abstract summary: Reservoir computing offers a computationally efficient alternative to traditional deep learning. Despite its advantages, the largely random reservoir graph architecture often results in suboptimal networks with poorly understood dynamics. We propose a novel Dynamics-Informed Reservoir Computing framework that systematically infers the reservoir network structure directly from the input training sequence.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Accurate prediction of complex and nonlinear time series remains a challenging problem across engineering and scientific disciplines. Reservoir computing (RC) offers a computationally efficient alternative to traditional deep learning by training only the read-out layer while employing a randomly structured and fixed reservoir network. Despite its advantages, the largely random reservoir graph architecture often results in suboptimal and oversized networks with poorly understood dynamics. Addressing this issue, we propose a novel Dynamics-Informed Reservoir Computing (DyRC) framework that systematically infers the reservoir network structure directly from the input training sequence. To this end, we employ the visibility graph (VG) technique, which converts time series data into networks by representing measurement points as nodes linked by mutual visibility. The reservoir network is constructed by directly adopting the VG network of a training data sequence, leveraging the parameter-free visibility graph approach to avoid expensive hyperparameter tuning. The resulting reservoir is directly informed by the specific dynamics of the prediction task under study. We assess the DyRC-VG method on prediction tasks involving the canonical nonlinear Duffing oscillator, evaluating prediction accuracy and consistency. Compared to an Erdős–Rényi graph of the same size, spectral radius, and comparable density, the DyRC-VG achieves higher prediction quality and more consistent performance over repeated implementations.
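The visibility graph construction at the core of DyRC-VG is simple enough to sketch directly. The snippet below implements the standard natural visibility criterion (two samples are linked when the straight line between them passes above every intermediate sample) and then rescales the adjacency matrix to a target spectral radius, as one typically does before using it as a fixed reservoir coupling matrix. The function names and the spectral radius of 0.9 are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def natural_visibility_graph(y):
    """Adjacency matrix of the natural visibility graph of a time series.

    Samples y[0..n-1] at integer times become nodes; nodes i and j are
    linked when the straight line between (i, y[i]) and (j, y[j]) passes
    above every intermediate sample (the natural visibility criterion).
    """
    n = len(y)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                A[i, j] = A[j, i] = 1
    return A

def to_reservoir(A, rho=0.9):
    """Rescale a VG adjacency matrix to spectral radius rho (assumed 0.9),
    yielding a fixed, data-informed reservoir coupling matrix."""
    eigs = np.abs(np.linalg.eigvals(A.astype(float)))
    return rho * A / eigs.max()
```

Note that the construction is parameter-free: the only tunable quantity enters afterwards, through the spectral-radius rescaling common to all reservoir computing setups.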
Related papers
- A Spectral Interpretation of Redundancy in a Graph Reservoir [51.40366905583043]
This work revisits the definition of the reservoir in the Multiresolution Reservoir Graph Neural Network (MRGNN). It proposes a variant based on a fairing algorithm originally introduced in the field of surface design in computer graphics. The core contribution of the paper lies in the theoretical analysis of the algorithm from a random-walks perspective.
arXiv Detail & Related papers (2025-07-17T10:02:57Z) - A tensor network approach for chaotic time series prediction [1.03590082373586]
This paper explores the application of a previously proposed tensor network model for predicting chaotic time series. It shows its advantages in terms of accuracy and computational efficiency compared to conventional echo state networks. Using a state-of-the-art tensor network approach, we bridge the gap between the tensor network and reservoir computing communities.
arXiv Detail & Related papers (2025-05-23T11:03:35Z) - Adaptive Homophily Clustering: Structure Homophily Graph Learning with Adaptive Filter for Hyperspectral Image [21.709368882043897]
Hyperspectral image (HSI) clustering has been a fundamental but challenging task with zero training labels. In this paper, a homophily structure graph learning with an adaptive filter clustering method (AHSGC) for HSI is proposed. Our AHSGC achieves high clustering accuracy, low computational complexity, and strong robustness.
arXiv Detail & Related papers (2025-01-03T01:54:16Z) - Graph Neural Networks and Differential Equations: A hybrid approach for data assimilation of fluid flows [0.0]
This study presents a novel hybrid approach that combines Graph Neural Networks (GNNs) with Reynolds-Averaged Navier Stokes (RANS) equations.
The results demonstrate significant improvements in the accuracy of the reconstructed mean flow compared to purely data-driven models.
arXiv Detail & Related papers (2024-11-14T14:31:52Z) - Flow reconstruction in time-varying geometries using graph neural networks [1.0485739694839669]
The model incorporates a feature propagation algorithm as a preprocessing step to handle extremely sparse inputs.
A binary indicator is introduced as a validity mask to distinguish between the original and propagated data points.
The model is trained on a unique data set of Direct Numerical Simulations (DNS) of a motored engine at a technically relevant operating condition.
arXiv Detail & Related papers (2024-11-13T16:49:56Z) - Deep Recurrent Stochastic Configuration Networks for Modelling Nonlinear Dynamic Systems [3.8719670789415925]
This paper proposes a novel deep reservoir computing framework, termed the deep recurrent stochastic configuration network (DeepRSCN).
DeepRSCNs are incrementally constructed, with all reservoir nodes directly linked to the final output.
Given a set of training samples, DeepRSCNs can quickly generate learning representations, which consist of random basis functions with cascaded input readout weights.
arXiv Detail & Related papers (2024-10-28T10:33:15Z) - EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called the Embedding-Induced Graph Refinement Clustering Network (EGRC-Net).
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - Emulating Spatio-Temporal Realizations of Three-Dimensional Isotropic Turbulence via Deep Sequence Learning Models [24.025975236316842]
We use a data-driven approach to model a three-dimensional turbulent flow using cutting-edge Deep Learning techniques.
The accuracy of the model is assessed using statistical and physics-based metrics.
arXiv Detail & Related papers (2021-12-07T03:33:39Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
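The "forced linear system" idea in this summary can be illustrated with a generic least-squares fit of x_{k+1} ≈ A x_k + B u_k from snapshot data, in the spirit of dynamic mode decomposition with control. The paper's actual method is a stochastically forced ensemble variant, so treat this as a minimal sketch of the underlying model class, not the authors' algorithm.

```python
import numpy as np

def fit_forced_linear(X, U):
    """Least-squares fit of x_{k+1} ≈ A x_k + B u_k from snapshots.

    X: (n, m) state snapshots; U: (q, m-1) inputs aligned with the
    m-1 observed transitions. Returns A (n, n) and B (n, q).
    """
    X0, X1 = X[:, :-1], X[:, 1:]
    Z = np.vstack([X0, U])        # stacked regressors [x_k; u_k]
    G = X1 @ np.linalg.pinv(Z)    # [A | B] via the pseudoinverse
    n = X.shape[0]
    return G[:, :n], G[:, n:]
```

Because the model is linear in its intrinsic dynamics, the eigenvalues of the fitted A are directly interpretable as decay rates and oscillation frequencies, which is the interpretability benefit the summary refers to.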
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
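The fused-step update behind such time-continuous models can be sketched for a single unit: a semi-implicit Euler step of dx/dt = -(1/τ + f)x + fA, where the gate f makes the effective time constant depend on the input. The gate parameterization and constants below are illustrative assumptions, not the exact network from the paper.

```python
import numpy as np

def ltc_step(x, I, dt, tau, w_in, w_rec, b, A):
    """One fused (semi-implicit) Euler step of a liquid time-constant unit.

    The sigmoid gate f in (0, 1) depends on the input I and state x, so
    the effective time constant 1/tau + f varies with the driving signal.
    Treating the decay term implicitly keeps the state bounded.
    """
    f = 1.0 / (1.0 + np.exp(-(w_in * I + w_rec * x + b)))  # sigmoid gate
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))
```

The boundedness claimed in the summary is visible in the update itself: the denominator always exceeds 1, so the state is pulled toward a value between 0 and the bias A regardless of the input magnitude.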
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Learning to Hash with Graph Neural Networks for Recommender Systems [103.82479899868191]
Graph representation learning has attracted much attention in supporting high quality candidate search at scale.
Despite its effectiveness in learning embedding vectors for objects in the user-item interaction network, the computational costs to infer users' preferences in continuous embedding space are tremendous.
We propose a simple yet effective discrete representation learning framework to jointly learn continuous and discrete codes.
arXiv Detail & Related papers (2020-03-04T06:59:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.