Embedding Theory of Reservoir Computing and Reducing Reservoir Network Using Time Delays
- URL: http://arxiv.org/abs/2303.09042v2
- Date: Tue, 9 May 2023 03:00:28 GMT
- Title: Embedding Theory of Reservoir Computing and Reducing Reservoir Network Using Time Delays
- Authors: Xing-Yue Duan, Xiong Ying, Si-Yang Leng, Jürgen Kurths, Wei Lin, Huan-Fei Ma
- Abstract summary: Reservoir computing (RC) is under explosive development due to its exceptional efficacy and high performance in the reconstruction and/or prediction of complex physical systems.
Here, we rigorously prove that RC is essentially a high-dimensional embedding of the original input nonlinear dynamical system.
We significantly reduce the network size of RC for reconstructing and predicting some representative physical systems; more surprisingly, a single-neuron reservoir with time delays is sometimes sufficient for those tasks.
- Score: 6.543793376734818
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reservoir computing (RC), a particular form of recurrent neural
network, is under explosive development due to its exceptional efficacy and
high performance in the reconstruction and/or prediction of complex physical
systems. However, the mechanism behind such effective applications of RC is
still unclear, awaiting deep and systematic exploration. Here, combining
delayed embedding theory with generalized embedding theory, we rigorously
prove that RC is essentially a high-dimensional embedding of the original
input nonlinear dynamical system. Using this embedding property, we unify
standard RC and time-delayed RC, in which we introduce time delays only into
the network's output layer, within a universal framework, and we further find
a trade-off relation between the time delays and the number of neurons in RC.
Based on this finding, we significantly reduce the network size of RC for
reconstructing and predicting some representative physical systems; more
surprisingly, a single-neuron reservoir with time delays is sometimes
sufficient for those tasks.
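To make the time-delayed readout concrete, here is a minimal numpy sketch (an illustration under assumed details such as reservoir size, spectral radius, and ridge penalty, not the authors' code): a tiny fixed random reservoir is driven by a scalar series, and the trained output layer regresses on time-delayed copies of the reservoir state, so that delays substitute for neurons. Shrinking n_res toward 1 while increasing n_delays mimics the trade-off the paper proves.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar input series (noisy sine); task: one-step-ahead prediction.
T = 2000
u = np.sin(0.1 * np.arange(T)) + 0.05 * rng.standard_normal(T)

n_res, n_delays = 5, 10                  # tiny reservoir + output-layer delays
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state condition
W_in = rng.standard_normal(n_res)

# Drive the fixed random reservoir and record its trajectory.
r = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    r = np.tanh(W @ r + W_in * u[t])
    states[t] = r

# Output layer sees delayed copies r_t, r_{t-1}, ..., r_{t-n_delays+1}.
feats = np.hstack([states[n_delays - 1 - d : T - 1 - d]
                   for d in range(n_delays)])
target = u[n_delays:]                    # u[t+1] aligned with each feature row

# Ridge-regression readout -- the only trained part of the model.
lam = 1e-6
Wout = np.linalg.solve(feats.T @ feats + lam * np.eye(feats.shape[1]),
                       feats.T @ target)
print("train RMSE:", np.sqrt(np.mean((feats @ Wout - target) ** 2)))
```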
Related papers
- Oscillations enhance time-series prediction in reservoir computing with feedback [3.3686252536891454]
Reservoir computing is a machine learning framework used for modeling the brain.
It is difficult to accurately reproduce the long-term target time series because the reservoir system becomes unstable.
This study proposes oscillation-driven reservoir computing (ODRC) with feedback.
arXiv Detail & Related papers (2024-06-05T02:30:29Z)
- TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF.
arXiv Detail & Related papers (2023-08-25T08:54:41Z)
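The two-compartment mechanism behind TC-LIF can be sketched as follows. This is a generic two-compartment leaky integrate-and-fire update with illustrative constants, not the paper's exact TC-LIF equations: a slowly leaking dendritic compartment feeds a somatic compartment, which lets input traces survive long delays.

```python
import numpy as np

def run_two_compartment_lif(inputs, beta_d=0.95, beta_s=0.8,
                            couple=0.5, v_th=1.0):
    """Generic two-compartment LIF (illustrative, not the exact TC-LIF)."""
    v_d = v_s = 0.0
    spikes = []
    for x in inputs:
        v_d = beta_d * v_d + x              # slow dendritic integration
        v_s = beta_s * v_s + couple * v_d   # soma driven by the dendrite
        s = float(v_s >= v_th)              # spike when threshold is crossed
        v_s -= s * v_th                     # soft reset preserves residue
        spikes.append(s)
    return np.array(spikes)

# A brief input pulse followed by silence: the slow dendrite keeps driving
# the soma, so spikes can appear well after the cue has ended.
stim = np.concatenate([0.3 * np.ones(5), np.zeros(20)])
print("spike train:", run_two_compartment_lif(stim).astype(int))
```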
- Universal Approximation of Linear Time-Invariant (LTI) Systems through RNNs: Power of Randomness in Reservoir Computing [19.995241682744567]
Reservoir computing (RC) is a special RNN where the recurrent weights are randomized and left untrained.
We show that RC can universally approximate a general linear time-invariant (LTI) system.
arXiv Detail & Related papers (2023-08-04T17:04:13Z)
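A quick numerical illustration of the universal-approximation claim (a sketch, not the paper's construction or proof, with all sizes and scalings chosen arbitrarily): a random, untrained reservoir with a trained least-squares readout reproducing the response of a one-pole IIR filter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth LTI system: y[t] = 0.8*y[t-1] + 0.5*u[t] (one-pole IIR filter).
T = 3000
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t]

n = 50
W = rng.standard_normal((n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state condition
W_in = rng.standard_normal(n)

r = np.zeros(n)
R = np.zeros((T, n))
for t in range(T):
    # Small input scaling keeps tanh near its linear regime.
    r = np.tanh(W @ r + 0.1 * W_in * u[t])
    R[t] = r

# Train only the linear readout, discarding a washout period.
wash = 200
Wout, *_ = np.linalg.lstsq(R[wash:], y[wash:], rcond=None)
err = np.sqrt(np.mean((R[wash:] @ Wout - y[wash:]) ** 2))
print("readout RMSE vs LTI target:", err)
```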
- Long Short-term Memory with Two-Compartment Spiking Neuron [64.02161577259426]
We propose a novel biologically inspired Long Short-Term Memory Leaky Integrate-and-Fire spiking neuron model, dubbed LSTM-LIF.
Our experimental results, on a diverse range of temporal classification tasks, demonstrate superior temporal classification capability, rapid training convergence, strong network generalizability, and high energy efficiency of the proposed LSTM-LIF model.
This work, therefore, opens up a myriad of opportunities for resolving challenging temporal processing tasks on emerging neuromorphic computing machines.
arXiv Detail & Related papers (2023-07-14T08:51:03Z)
- Gated Recurrent Neural Networks with Weighted Time-Delay Feedback [59.125047512495456]
We introduce a novel gated recurrent unit (GRU) with a weighted time-delay feedback mechanism.
We show that $\tau$-GRU can converge faster and generalize better than state-of-the-art recurrent units and gated recurrent architectures.
arXiv Detail & Related papers (2022-12-01T02:26:34Z)
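One plausible reading of the delay-feedback mechanism, sketched below with made-up parameter names and sizes (the paper's exact $\tau$-GRU update may differ): a standard GRU cell whose gates see the previous hidden state augmented by a weighted copy of the hidden state from tau steps earlier.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tau_gru_step(x, h_prev, h_delayed, p):
    """GRU step whose gates see h_{t-1} plus weighted delayed feedback."""
    h_fb = h_prev + p["w_tau"] * h_delayed            # weighted delay feedback
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_fb)         # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_fb)         # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_fb))
    return (1 - z) * h_fb + z * h_tilde

rng = np.random.default_rng(2)
d_in, d_h, tau = 3, 8, 4
p = {k: 0.1 * rng.standard_normal((d_h, d_in)) for k in ("Wz", "Wr", "Wh")}
p.update({k: 0.1 * rng.standard_normal((d_h, d_h)) for k in ("Uz", "Ur", "Uh")})
p["w_tau"] = 0.3                                      # delay-feedback weight

history = [np.zeros(d_h)] * (tau + 1)                 # hidden-state buffer
for _ in range(50):
    x = rng.standard_normal(d_in)
    h = tau_gru_step(x, history[-1], history[-1 - tau], p)
    history.append(h)
print("final hidden norm:", np.linalg.norm(history[-1]))
```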
- Unsupervised Monocular Depth Learning with Integrated Intrinsics and Spatio-Temporal Constraints [61.46323213702369]
This work presents an unsupervised learning framework that is able to predict at-scale depth maps and egomotion.
Our results demonstrate strong performance when compared to the current state-of-the-art on multiple sequences of the KITTI driving dataset.
arXiv Detail & Related papers (2020-11-02T22:26:58Z)
- Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well ordinary differential equation (ODE) networks comply with this time-reversal symmetry; networks trained with it are called time-reversal symmetric ODE networks (TRS-ODENs).
We show that, even for systems that do not possess full time-reversal symmetry, TRS-ODENs can achieve better predictive performance than baselines.
arXiv Detail & Related papers (2020-07-22T12:19:40Z)
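The symmetry loss can be sketched like this (illustrative only; a toy linear vector field stands in for the trained ODE network, and the exact TRS-ODEN loss may be defined differently): integrate forward, integrate again from the momentum-reversed endpoint, and penalize the mismatch between the two trajectories.

```python
import numpy as np

def f(state, W):
    return state @ W.T                     # toy linear vector field

def integrate(state, W, dt, steps):
    traj = [state]
    for _ in range(steps):                 # explicit Euler for brevity
        state = state + dt * f(state, W)
        traj.append(state)
    return np.array(traj)

def reverse(states):
    """Classical time reversal (q, p) -> (q, -p); works on trajectories."""
    q, p = np.split(states, 2, axis=-1)
    return np.concatenate([q, -p], axis=-1)

rng = np.random.default_rng(3)
W = 0.1 * rng.standard_normal((4, 4))
x0, dt, steps = rng.standard_normal(4), 0.01, 100

fwd = integrate(x0, W, dt, steps)                  # forward trajectory
bwd = integrate(reverse(fwd[-1]), W, dt, steps)    # from reversed endpoint

# TRS loss: the reversed backward pass should retrace the forward pass.
trs_loss = np.mean((reverse(bwd)[::-1] - fwd) ** 2)
print("time-reversal symmetry loss:", trs_loss)
```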
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of phase spaces.
Our approach is as competitive as, or better than, most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
- Model-Size Reduction for Reservoir Computing by Concatenating Internal States Through Time [2.6872737601772956]
Reservoir computing (RC) is a machine learning algorithm that can learn complex time series from data very rapidly.
To implement RC in edge computing, it is highly important to reduce the amount of computational resources that RC requires.
We propose methods that reduce the size of the reservoir by inputting the past or drifting states of the reservoir to the output layer at the current time step.
arXiv Detail & Related papers (2020-06-11T06:11:03Z)
- Sparsity in Reservoir Computing Neural Networks [3.55810827129032]
Reservoir Computing (RC) is a strategy for designing Recurrent Neural Networks characterized by strikingly efficient training.
In this paper, we empirically investigate the role of sparsity in RC network design from the perspective of the richness of the developed temporal representations.
arXiv Detail & Related papers (2020-06-04T15:38:17Z)
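For reference, a sparse random reservoir of the kind such studies vary is typically built as below (a generic recipe with arbitrary density and spectral radius, not the paper's specific experimental setup):

```python
import numpy as np

rng = np.random.default_rng(4)

n, density, rho = 300, 0.05, 0.9
W = rng.standard_normal((n, n))
W *= rng.random((n, n)) < density      # keep roughly 5% of the connections
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius
print("nonzeros:", np.count_nonzero(W),
      "| spectral radius:", round(np.max(np.abs(np.linalg.eigvals(W))), 3))
```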
- RCNet: Incorporating Structural Information into Deep RNN for MIMO-OFDM Symbol Detection with Limited Training [26.12840500767443]
We introduce the Time-Frequency RC to take advantage of the structural information inherent in OFDM signals.
We show that RCNet can offer a faster learning convergence and as much as 20% gain in bit error rate over a shallow RC structure.
arXiv Detail & Related papers (2020-03-15T21:06:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.