Reservoir Computing via Multi-Scale Random Fourier Features for Forecasting Fast-Slow Dynamical Systems
- URL: http://arxiv.org/abs/2511.14775v1
- Date: Tue, 04 Nov 2025 08:01:08 GMT
- Title: Reservoir Computing via Multi-Scale Random Fourier Features for Forecasting Fast-Slow Dynamical Systems
- Authors: S. K. Laha
- Abstract summary: We present a novel reservoir computing framework that combines delay embedding with random Fourier feature (RFF) mappings to capture such dynamics. Two formulations are investigated: a single-scale RFF reservoir, which employs a fixed kernel bandwidth, and a multi-scale RFF reservoir, which integrates multiple bandwidths to represent both fast and slow temporal dependencies.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Forecasting nonlinear time series with multi-scale temporal structures remains a central challenge in complex systems modeling. We present a novel reservoir computing framework that combines delay embedding with random Fourier feature (RFF) mappings to capture such dynamics. Two formulations are investigated: a single-scale RFF reservoir, which employs a fixed kernel bandwidth, and a multi-scale RFF reservoir, which integrates multiple bandwidths to represent both fast and slow temporal dependencies. The framework is applied to a diverse set of canonical systems: neuronal models such as the Rulkov map, Izhikevich model, Hindmarsh-Rose model, and Morris-Lecar model, which exhibit spiking, bursting, and chaotic behaviors arising from fast-slow interactions; and ecological models including predator-prey dynamics and the Ricker map with seasonal forcing, which display multi-scale oscillations and intermittency. Across all cases, the multi-scale RFF reservoir consistently outperforms its single-scale counterpart, achieving lower normalized root mean square error (NRMSE) and more robust long-horizon predictions. These results highlight the effectiveness of explicitly incorporating multi-scale feature mappings into reservoir computing architectures for modeling complex dynamical systems with intrinsic fast-slow interactions.
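To make the pipeline concrete, the following minimal numpy sketch puts together the ingredients named in the abstract: delay embedding, RFF maps drawn at several kernel bandwidths, a linear readout, and the NRMSE score. The toy signal, hyperparameters, and ridge-regression readout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def delay_embed(s, dim, tau):
    """Rows are delay vectors [s_t, s_{t+tau}, ..., s_{t+(dim-1)tau}]."""
    n = len(s) - (dim - 1) * tau
    return np.column_stack([s[i * tau : i * tau + n] for i in range(dim)])

def rff_features(X, n_feat, bandwidths, seed=0):
    """Random Fourier features approximating Gaussian kernels; one block per
    bandwidth, so small sigmas resolve fast structure and large sigmas slow."""
    rng = np.random.default_rng(seed)
    blocks = []
    for sigma in bandwidths:
        W = rng.normal(0.0, 1.0 / sigma, size=(X.shape[1], n_feat))
        b = rng.uniform(0.0, 2.0 * np.pi, size=n_feat)
        blocks.append(np.sqrt(2.0 / n_feat) * np.cos(X @ W + b))
    return np.hstack(blocks)  # multi-scale: concatenate all bandwidth blocks

def nrmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.std(y_true)

# Toy fast-slow signal: a fast oscillation riding on a slow one.
t = np.arange(0, 200, 0.05)
s = np.sin(t) + 0.5 * np.sin(0.05 * t)

dim, tau = 8, 2
X = delay_embed(s, dim, tau)[:-1]          # reservoir states
y = s[(dim - 1) * tau + 1:]                # one-step-ahead targets

Phi = rff_features(X, n_feat=200, bandwidths=[0.3, 1.0, 3.0])
split = int(0.8 * len(Phi))
lam = 1e-6                                 # ridge regularization strength
w = np.linalg.solve(Phi[:split].T @ Phi[:split] + lam * np.eye(Phi.shape[1]),
                    Phi[:split].T @ y[:split])
print("test NRMSE:", nrmse(y[split:], Phi[split:] @ w))
```

A single-scale variant is simply the special case `bandwidths=[sigma]`; for long-horizon prediction one would feed each output back into the delay vector and iterate autonomously.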
Related papers
- ACFormer: Mitigating Non-linearity with Auto Convolutional Encoder for Time Series Forecasting [6.27761817493579]
Time series forecasting (TSF) faces challenges in modeling complex intra-channel temporal dependencies and inter-channel correlations. We propose ACFormer, an architecture designed to reconcile the efficiency of linear projections with the non-linear feature-extraction power of convolutions.
arXiv Detail & Related papers (2026-01-28T13:47:54Z)
- A Novel Reservoir Computing Framework for Chaotic Time Series Prediction Using Time Delay Embedding and Random Fourier Features [0.0]
We introduce a novel reservoir computing framework that integrates time-delay embedding with Random Fourier Feature (RFF) mappings. The resulting RFF-RC model approximates nonlinear kernel transformations that uncover latent dynamical relations in the reconstructed phase space. Results show that the combination of delay embedding and RFF-based reservoirs reveals new dynamical structure.
arXiv Detail & Related papers (2025-11-04T07:59:08Z)
- Oscillatory State-Space Models [61.923849241099184]
We propose Linear Oscillatory State-Space models (LinOSS) for efficiently learning on long sequences. A stable discretization, integrated over time using fast associative parallel scans, yields the proposed state-space model. We show that LinOSS is universal, i.e., it can approximate any continuous and causal operator mapping between time-varying functions.
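As a rough illustration of such an oscillatory state-space recurrence, the sketch below advances a bank of forced harmonic oscillators with a stable semi-implicit (symplectic Euler) step. The frequency range, random weights, and sequential loop are assumptions; the paper's model is learned end to end and integrated with fast associative parallel scans.

```python
import numpy as np

# Bank of forced harmonic oscillators as a toy oscillatory state-space layer:
#   z'' = -omega^2 * z + B u(t),   y = C z.
# Semi-implicit (symplectic Euler) stepping is stable for dt * omega < 2.

def oscillatory_ssm(u, omega, B, C, dt=0.1):
    T = u.shape[0]
    z = np.zeros(omega.shape[0])                     # positions
    v = np.zeros(omega.shape[0])                     # velocities
    y = np.zeros((T, C.shape[0]))
    for k in range(T):
        v = v + dt * (-(omega ** 2) * z + B @ u[k])  # update velocity first
        z = z + dt * v                               # then position (semi-implicit)
        y[k] = C @ z
    return y

rng = np.random.default_rng(0)
T, d_in, m, d_out = 500, 1, 16, 1
u = rng.normal(size=(T, d_in))
omega = np.exp(rng.uniform(np.log(0.1), np.log(10.0), size=m))  # spread of time scales
B = rng.normal(size=(m, d_in)) / np.sqrt(d_in)
C = rng.normal(size=(d_out, m)) / np.sqrt(m)
y = oscillatory_ssm(u, omega, B, C)
```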
arXiv Detail & Related papers (2024-10-04T22:00:13Z)
- Adaptive Multi-Scale Decomposition Framework for Time Series Forecasting [26.141054975797868]
We propose a novel Adaptive Multi-Scale Decomposition (AMD) framework for time series forecasting. Our framework decomposes time series into distinct temporal patterns at multiple scales, leveraging the Multi-Scale Decomposable Mixing (MDM) block. Our approach effectively models both temporal and channel dependencies and utilizes autocorrelation to refine multi-scale data integration.
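The decomposition step can be illustrated with fixed moving-average filters that peel off progressively coarser trends, so the components sum back to the original series. The AMD framework learns the decomposition and the cross-scale mixing, so the fixed windows below are purely an assumption.

```python
import numpy as np

# Toy multi-scale decomposition: extract trends at coarse-to-fine windows;
# the residual at each stage carries the faster remaining structure.

def moving_average(s, w):
    padded = np.concatenate([np.full(w - 1, s[0]), s])  # left-pad, keep length
    return np.convolve(padded, np.ones(w) / w, mode="valid")

def decompose(s, windows=(64, 16, 4)):
    comps, resid = [], s.copy()
    for w in sorted(windows, reverse=True):             # coarsest scale first
        trend = moving_average(resid, w)
        comps.append(trend)
        resid = resid - trend
    comps.append(resid)                                 # finest-scale remainder
    return comps                                        # sum(comps) == s

t = np.arange(0, 100, 0.1)
s = np.sin(5 * t) + np.sin(0.2 * t)                     # fast + slow oscillation
comps = decompose(s)
assert np.allclose(sum(comps), s)                       # exact reconstruction
```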
arXiv Detail & Related papers (2024-06-06T05:27:33Z)
- Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting (LTSF).
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z)
- TMS: A Temporal Multi-scale Backbone Design for Speaker Embedding [60.292702363839716]
Current SOTA backbone networks for speaker embedding are designed to aggregate multi-scale features from an utterance with multi-branch network architectures for speaker representation.
We propose an effective temporal multi-scale (TMS) model where multi-scale branches could be efficiently designed in a speaker embedding network almost without increasing computational costs.
arXiv Detail & Related papers (2022-03-17T05:49:35Z)
- Predicting the temporal dynamics of turbulent channels through deep learning [0.0]
We aim to assess the capability of neural networks to reproduce the temporal evolution of a minimal turbulent channel flow.
Long short-term memory (LSTM) networks and a Koopman-based framework (KNF) are trained to predict the temporal dynamics of the minimal-channel-flow modes.
arXiv Detail & Related papers (2022-03-02T09:31:03Z)
- Predicting Physics in Mesh-reduced Space with Temporal Attention [15.054026802351146]
We propose a new method that captures long-term dependencies through a transformer-style temporal attention model.
Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks.
We believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
arXiv Detail & Related papers (2022-01-22T18:32:54Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
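The flow-map composition can be sketched on a linear fast-slow system: precompute maps at a coarse and a fine step size and chain them, so most of the horizon costs only a few coarse applications. The exact matrix-exponential maps below are stand-ins for the paper's trained neural time-steppers.

```python
import numpy as np
from scipy.linalg import expm

# Stiff linear test system with a fast oscillation and slow decay;
# exact flow maps over two step sizes play the role of learned steppers.
A = np.array([[0.0, 1.0],
              [-100.0, -0.1]])

dt_fine, dt_coarse = 0.01, 1.0
F_fine = expm(A * dt_fine)      # flow map over one fine step
F_coarse = expm(A * dt_coarse)  # flow map over one coarse step

def advance(x, horizon):
    """Cover the horizon with coarse steps, finish the remainder finely."""
    n_coarse, rem = divmod(horizon, dt_coarse)
    for _ in range(int(n_coarse)):
        x = F_coarse @ x
    for _ in range(int(round(rem / dt_fine))):
        x = F_fine @ x
    return x

x_end = advance(np.array([1.0, 0.0]), horizon=10.55)  # 10 coarse + 55 fine steps
```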
arXiv Detail & Related papers (2020-08-22T07:16:53Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
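A minimal sketch of such a cell, assuming a sigmoid gate and a plain Euler solver: each unit is a linear first-order system whose effective time constant is modulated by a bounded nonlinearity of state and input, which is what keeps trajectories bounded.

```python
import numpy as np

# Liquid time-constant (LTC) cell: dx/dt = -(1/tau + f) * x + f * A,
# where f is a bounded gate of the state x and input I. Since f stays in
# (0, 1), the effective time constant varies with the input and the state
# remains bounded. Sizes, weights, and the Euler step are illustrative.

def ltc_step(x, I, Wx, WI, b, tau=1.0, A=1.0, dt=0.05):
    f = 1.0 / (1.0 + np.exp(-(Wx @ x + WI @ I + b)))  # bounded gate in (0, 1)
    dx = -(1.0 / tau + f) * x + f * A                 # input-dependent dynamics
    return x + dt * dx

rng = np.random.default_rng(0)
n, d = 8, 2
Wx = rng.normal(size=(n, n)) / np.sqrt(n)
WI = rng.normal(size=(n, d)) / np.sqrt(d)
b = np.zeros(n)

x = np.zeros(n)
for k in range(200):                                  # drive with an oscillatory input
    I = np.array([np.sin(0.1 * k), np.cos(0.1 * k)])
    x = ltc_step(x, I, Wx, WI, b)
```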
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)