Echo State Networks as State-Space Models: A Systems Perspective
- URL: http://arxiv.org/abs/2509.04422v1
- Date: Thu, 04 Sep 2025 17:42:03 GMT
- Title: Echo State Networks as State-Space Models: A Systems Perspective
- Authors: Pradeep Singh, Balasubramanian Raman
- Abstract summary: We show that the echo-state property is an instance of input-to-state stability for a contractive nonlinear SSM. We also develop two complementary mappings that yield locally valid LTI SSMs with interpretable poles and memory horizons. This perspective yields frequency-domain characterizations of memory spectra and clarifies when ESNs emulate structured SSM kernels.
- Score: 10.710447183485284
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Echo State Networks (ESNs) are typically presented as efficient, readout-trained recurrent models, yet their dynamics and design are often guided by heuristics rather than first principles. We recast ESNs explicitly as state-space models (SSMs), providing a unified systems-theoretic account that links reservoir computing with classical identification and modern kernelized SSMs. First, we show that the echo-state property is an instance of input-to-state stability for a contractive nonlinear SSM and derive verifiable conditions in terms of leak, spectral scaling, and activation Lipschitz constants. Second, we develop two complementary mappings: (i) small-signal linearizations that yield locally valid LTI SSMs with interpretable poles and memory horizons; and (ii) lifted/Koopman random-feature expansions that render the ESN a linear SSM in an augmented state, enabling transfer-function and convolutional-kernel analyses. This perspective yields frequency-domain characterizations of memory spectra and clarifies when ESNs emulate structured SSM kernels. Third, we cast teacher forcing as state estimation and propose Kalman/EKF-assisted readout learning, together with EM for hyperparameters (leak, spectral radius, process/measurement noise) and a hybrid subspace procedure for spectral shaping under contraction constraints.
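The abstract's first two contributions are easy to make concrete. Below is a minimal NumPy sketch, under illustrative assumptions (the reservoir size, leak rate, 0.9 spectral scaling, and zero operating point are illustrative choices, not values from the paper): it checks a standard sufficient contraction condition for the echo-state property of a leaky-tanh ESN, then reads off poles and a local memory horizon from the small-signal linearization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (assumptions, not the paper's values).
n, m = 100, 1      # reservoir and input dimensions
leak = 0.3         # leak rate a in  x+ = (1 - a) x + a tanh(W x + W_in u)
L = 1.0            # Lipschitz constant of tanh

W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)    # scale the spectral norm of W to 0.9
W_in = rng.standard_normal((n, m))

def step(x, u):
    """One leaky-integrator ESN update: the nonlinear SSM state map."""
    return (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)

# Sufficient contraction condition in terms of leak, spectral scaling, and
# the activation Lipschitz constant: the state map is a contraction (so the
# echo-state property / ISS holds) whenever (1 - a) + a * L * ||W||_2 < 1.
gamma = (1 - leak) + leak * L * np.linalg.norm(W, 2)
print(f"contraction factor gamma = {gamma:.3f}  (< 1 => ESP holds)")

# Small-signal linearization at an operating point (x*, u*):
#   A = (1 - a) I + a diag(1 - tanh(z*)^2) W,   z* = W x* + W_in u*,
# a locally valid LTI state matrix whose poles set the local memory horizon.
x_star, u_star = np.zeros(n), np.zeros(m)
z_star = W @ x_star + W_in @ u_star
A = (1 - leak) * np.eye(n) + leak * np.diag(1 - np.tanh(z_star) ** 2) @ W

rho = np.abs(np.linalg.eigvals(A)).max()   # dominant pole magnitude
tau = -1.0 / np.log(rho)                   # slowest mode's decay time constant
print(f"dominant pole {rho:.3f}, local memory horizon ~ {tau:.1f} steps")
```

The same Jacobian `A` is also what an EKF would propagate in its covariance update, which is the sense in which the paper's third contribution recasts teacher forcing as state estimation.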
Related papers
- WaveSSM: Multiscale State-Space Models for Non-stationary Signal Attention [22.983737182781244]
State-space models (SSMs) have emerged as a powerful foundation for long-range sequence modeling. We introduce WaveSSM, a collection of SSMs constructed over wavelet frames. Our key observation is that wavelet frames yield localized support along the temporal dimension, useful for tasks requiring precise localization.
arXiv Detail & Related papers (2026-02-25T06:27:22Z) - Contraction, Criticality, and Capacity: A Dynamical-Systems Perspective on Echo-State Networks [13.857230672081489]
We present a unified, dynamical-systems treatment that weaves together functional analysis, random attractor theory, and recent neuroscientific findings. First, we prove that the Echo-State Property (wash-out of initial conditions) together with global Lipschitz dynamics necessarily yields the Fading-Memory Property. Second, employing a Stone-Weierstrass strategy, we give a streamlined proof that ESNs with nonlinear reservoirs and linear read-outs are dense in the Banach space of causal, time-invariant fading-memory filters. Third, we quantify computational resources via the memory-capacity spectrum.
arXiv Detail & Related papers (2025-07-24T14:41:18Z) - Learning to Dissipate Energy in Oscillatory State-Space Models [55.09730499143998]
State-space models (SSMs) are a class of networks for sequence learning. We show that D-LinOSS consistently outperforms previous LinOSS methods on long-range learning tasks.
arXiv Detail & Related papers (2025-05-17T23:15:17Z) - Understanding and Mitigating Bottlenecks of State Space Models through the Lens of Recency and Over-smoothing [56.66469232740998]
We show that Structured State Space Models (SSMs) are inherently limited by strong recency bias. This bias impairs the models' ability to recall distant information and introduces robustness issues. We propose to polarize two channels of the state transition matrices in SSMs, setting them to zero and one, respectively, simultaneously addressing recency bias and over-smoothing.
arXiv Detail & Related papers (2024-12-31T22:06:39Z) - Deep Learning-based Approaches for State Space Models: A Selective Review [15.295157876811066]
State-space models (SSMs) offer a powerful framework for dynamical system analysis. This paper provides a selective review of recent advancements in deep neural network-based approaches for SSMs.
arXiv Detail & Related papers (2024-12-15T15:04:35Z) - Provable Benefits of Complex Parameterizations for Structured State Space Models [51.90574950170374]
Structured state space models (SSMs) are linear dynamical systems adhering to a specified structure.
In contrast to typical neural network modules, whose parameterizations are real, SSMs often use complex parameterizations.
This paper takes a step towards explaining the benefits of complex parameterizations for SSMs by establishing formal gaps between real and complex diagonal SSMs.
arXiv Detail & Related papers (2024-10-17T22:35:50Z) - Cross-Scan Mamba with Masked Training for Robust Spectral Imaging [51.557804095896174]
We propose Cross-Scanning Mamba, named CS-Mamba, which employs a Spatial-Spectral SSM for global-local balanced context encoding. Experimental results show that CS-Mamba achieves state-of-the-art performance and that the masked training method better reconstructs smooth features, improving visual quality.
arXiv Detail & Related papers (2024-08-01T15:14:10Z) - HiPPO-Prophecy: State-Space Models can Provably Learn Dynamical Systems in Context [0.5416466085090772]
This work explores the in-context learning capabilities of State Space Models (SSMs). We introduce a novel weight construction for SSMs, enabling them to predict the next state of any dynamical system. We extend the HiPPO framework to demonstrate that continuous SSMs can approximate the derivative of any input signal.
arXiv Detail & Related papers (2024-07-12T15:56:11Z) - HOPE for a Robust Parameterization of Long-memory State Space Models [51.66430224089725]
State-space models (SSMs) that utilize linear, time-invariant (LTI) systems are known for their effectiveness in learning long sequences.
We develop a new parameterization scheme, called HOPE, for LTI systems that utilize Markov parameters within Hankel operators.
Our new parameterization endows the SSM with non-decaying memory within a fixed time window, which is empirically corroborated by a sequential CIFAR-10 task with padded noise.
arXiv Detail & Related papers (2024-05-22T20:20:14Z) - Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust with respect to input and parameter perturbations as compared to other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z) - Feedback-induced instabilities and dynamics in the Jaynes-Cummings model [62.997667081978825]
We investigate the coherence and steady-state properties of the Jaynes-Cummings model subjected to time-delayed coherent feedback.
The introduced feedback qualitatively modifies the dynamical response and steady-state quantum properties of the system.
arXiv Detail & Related papers (2020-06-20T10:07:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.