Time-shift selection for reservoir computing using a rank-revealing QR
algorithm
- URL: http://arxiv.org/abs/2211.17095v3
- Date: Tue, 25 Apr 2023 22:24:31 GMT
- Authors: Joseph D. Hart and Francesco Sorrentino and Thomas L. Carroll
- Abstract summary: We present a technique to choose the time-shifts by maximizing the rank of the reservoir matrix using a rank-revealing QR algorithm.
We find that our technique provides improved accuracy over random time-shift selection in essentially all cases.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing, a recurrent neural network paradigm in which only the
output layer is trained, has demonstrated remarkable performance on tasks such
as prediction and control of nonlinear systems. Recently, it was demonstrated
that adding time-shifts to the signals generated by a reservoir can provide
large improvements in performance accuracy. In this work, we present a
technique to choose the time-shifts by maximizing the rank of the reservoir
matrix using a rank-revealing QR algorithm. This technique, which is not task
dependent, does not require a model of the system, and therefore is directly
applicable to analog hardware reservoir computers. We demonstrate our
time-shift selection technique on two types of reservoir computer: one based on
an opto-electronic oscillator and one based on a traditional recurrent network
with a $\tanh$ activation function. We find that our technique provides
improved accuracy over random time-shift selection in essentially all cases.
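The selection procedure lends itself to a compact illustration. The sketch below is not the authors' code (the reservoir, input signal, and all sizes are invented for illustration): it drives a small $\tanh$ reservoir, stacks time-shifted copies of each node's signal as candidate columns, and uses SciPy's column-pivoted (rank-revealing) QR to pick the most linearly independent (node, shift) pairs.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)

# Drive a small tanh reservoir with a scalar input signal.
N, T = 20, 500                    # reservoir size, time steps (illustrative)
u = np.sin(0.1 * np.arange(T))    # illustrative input
W = rng.normal(scale=0.9 / np.sqrt(N), size=(N, N))  # recurrent weights
w_in = rng.normal(size=N)

r = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    r = np.tanh(W @ r + w_in * u[t])
    states[t] = r

# Build candidate time-shifted copies of each reservoir signal.
max_shift = 10
candidates, labels = [], []       # labels: (node index, shift)
for s in range(max_shift + 1):
    candidates.append(states[max_shift - s : T - s])  # align all shifts
    labels += [(i, s) for i in range(N)]
X = np.hstack(candidates)         # rows: time, cols: (node, shift) pairs

# Rank-revealing (column-pivoted) QR: the first k pivot columns are the
# k candidate signals that best span the column space.
_, _, piv = qr(X, mode='economic', pivoting=True)
k = N                             # keep as many signals as reservoir nodes
chosen = [labels[j] for j in piv[:k]]
print("selected (node, shift) pairs:", chosen[:5])
```

The pivot order ranks candidate columns by how much new column space each contributes, so truncating it at k signals is the rank-maximizing selection; no model of the driving system is needed, only the recorded reservoir signals.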
Related papers
- Reservoir computing with logistic map [0.0]
We demonstrate here a method to predict temporal and nontemporal tasks by constructing virtual nodes that constitute the reservoir in reservoir computing.
We predict three nonlinear systems, namely Lorenz, Rossler, and Hindmarsh-Rose, for temporal tasks and a seventh-order polynomial for nontemporal tasks with great accuracy.
Remarkably, the logistic map performs well and predicts close to the actual or target values.
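A hedged sketch of the virtual-node idea (the input-to-parameter mapping, node count, and seventh-order target below are illustrative assumptions, not the paper's exact scheme): each input sample modulates the logistic-map parameter, successive iterates act as virtual nodes, and only a linear readout is trained.

```python
import numpy as np

def virtual_nodes(u, n_nodes=20):
    """Iterate one logistic map; the iterates serve as virtual nodes."""
    r = 2.5 + 1.0 * u          # input modulates the map parameter (assumed)
    x = 0.5
    nodes = []
    for _ in range(n_nodes):
        x = r * x * (1.0 - x)
        nodes.append(x)
    return nodes

rng = np.random.default_rng(0)
u_train = rng.uniform(0.0, 1.0, size=300)
X = np.array([[1.0] + virtual_nodes(u) for u in u_train])
y = u_train ** 7               # a seventh-order polynomial (nontemporal task)

# Linear readout is the only trained component.
W = np.linalg.lstsq(X, y, rcond=None)[0]
rmse = np.sqrt(np.mean((X @ W - y) ** 2))
print(f"in-sample RMSE on u^7: {rmse:.1e}")
```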
arXiv Detail & Related papers (2024-01-17T09:22:15Z)
- Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
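A minimal sketch of the setting, with a single scalar parameter so the posterior covariance is a scalar (the paper's contribution is scaling this up with a low-rank plus diagonal covariance decomposition, which is not shown here; all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Online estimation of w in y = tanh(w * x) with an extended Kalman filter.
w_true = 1.2
w, P = 0.0, 1.0          # posterior mean and variance of w
Q, R = 1e-6, 0.01        # process and observation noise variances

for _ in range(500):
    x = rng.uniform(-2.0, 2.0)
    y = np.tanh(w_true * x) + rng.normal(scale=0.05)
    H = x / np.cosh(w * x) ** 2          # linearization: d/dw tanh(w x)
    P = P + Q                            # predict
    S = H * P * H + R                    # innovation variance
    K = P * H / S                        # Kalman gain
    w = w + K * (y - np.tanh(w * x))     # update the mean
    P = (1.0 - K * H) * P                # update the variance

print(f"estimated w = {w:.3f} (true 1.2)")
```

No step size is tuned anywhere: the gain K is set by the filter's own covariance, which is the deterministic, tuning-free property the summary highlights.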
arXiv Detail & Related papers (2023-05-31T03:48:49Z)
- Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
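A toy version of such an evolutionary search (the objective below is a stand-in quadratic, not the hydrodynamic model; in the paper the fitness would be task separability under candidate readout times and input mappings):

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(theta):
    """Stand-in objective with an illustrative optimum at (0.6, -0.3)."""
    target = np.array([0.6, -0.3])
    return -np.sum((theta - target) ** 2)

# (1 + lambda) evolution strategy over two reservoir parameters.
parent = np.zeros(2)
sigma, n_offspring = 0.1, 8
for _ in range(100):
    offspring = parent + sigma * rng.normal(size=(n_offspring, 2))
    candidates = np.vstack([parent, offspring])
    scores = np.array([fitness(c) for c in candidates])
    parent = candidates[np.argmax(scores)]   # elitist selection

print("best parameters:", np.round(parent, 2))
```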
arXiv Detail & Related papers (2023-04-20T19:15:02Z)
- A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive Coding Networks [65.34977803841007]
Predictive coding networks are neuroscience-inspired models with roots in both Bayesian statistics and neuroscience.
We show how simply changing the temporal scheduling of the update rule for the synaptic weights leads to an algorithm that is much more efficient and stable than the original one.
arXiv Detail & Related papers (2022-11-16T00:11:04Z)
- RSC: Accelerating Graph Neural Networks Training via Randomized Sparse Computations [56.59168541623729]
Training graph neural networks (GNNs) is time consuming because sparse graph-based operations are difficult to accelerate in hardware.
We explore trading off the computational precision to reduce the time complexity via sampling-based approximation.
We propose Randomized Sparse Computation, which for the first time demonstrates the potential of training GNNs with approximated operations.
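The general idea of trading precision for time via sampling can be sketched with the classic randomized matrix-multiplication estimator (a textbook stand-in, not RSC's actual sparse-GNN scheme; all matrices below are invented):

```python
import numpy as np

def sampled_matmul(A, B, k, p, rng):
    """Unbiased estimate of A @ B from k sampled inner-dimension indices,
    each rescaled by 1/(k * p) so the expectation equals the exact product."""
    idx = rng.choice(A.shape[1], size=k, replace=True, p=p)
    scale = 1.0 / (k * p[idx])
    return (A[:, idx] * scale) @ B[idx, :]

rng = np.random.default_rng(3)
n = 400
# Inner dimension with sharply decaying importance (a few heavy terms).
A = rng.normal(size=(100, n)) * (0.5 ** np.arange(n))
B = rng.normal(size=(n, 50))
exact = A @ B

uniform = np.full(n, 1.0 / n)
norms = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
weighted = norms / norms.sum()   # importance sampling by column/row norms

err = lambda M: np.linalg.norm(M - exact) / np.linalg.norm(exact)
e_u = err(sampled_matmul(A, B, 100, uniform, rng))
e_w = err(sampled_matmul(A, B, 100, weighted, rng))
print(f"uniform sampling error {e_u:.2f}, norm-weighted error {e_w:.2f}")
```

With a quarter of the inner-dimension work, norm-weighted sampling concentrates on the few heavy terms and gives a far smaller error than uniform sampling, which is the precision-for-time trade the summary describes.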
arXiv Detail & Related papers (2022-10-19T17:25:33Z)
- A Robust and Explainable Data-Driven Anomaly Detection Approach For Power Electronics [56.86150790999639]
We present two anomaly detection and classification approaches, namely the Matrix Profile algorithm and anomaly transformer.
The Matrix Profile algorithm is shown to be well suited as a generalizable approach for detecting real-time anomalies in streaming time-series data.
A series of custom filters is created and added to the detector to tune its sensitivity, recall, and detection accuracy.
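A naive O(n²m) matrix profile is easy to state (production implementations use much faster algorithms such as STOMP; the signal, window length, and planted anomaly below are invented):

```python
import numpy as np

def matrix_profile(x, m):
    """Naive z-normalized matrix profile: for each length-m subsequence,
    the distance to its nearest non-trivial neighbor."""
    n = len(x) - m + 1
    subs = np.stack([x[i:i + m] for i in range(n)])
    subs = (subs - subs.mean(axis=1, keepdims=True)) / subs.std(axis=1, keepdims=True)
    prof = np.full(n, np.inf)
    excl = m // 2                       # exclusion zone around trivial matches
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - excl): i + excl + 1] = np.inf
        prof[i] = d.min()
    return prof

rng = np.random.default_rng(2)
x = np.sin(0.2 * np.arange(600)) + 0.05 * rng.normal(size=600)
x[300:320] += 1.5                       # planted anomaly
prof = matrix_profile(x, m=32)
print("most anomalous subsequence starts at", int(np.argmax(prof)))
```

High matrix-profile values mark subsequences with no close match anywhere else in the series, which is why thresholding the profile serves as a streaming anomaly detector.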
arXiv Detail & Related papers (2022-09-23T06:09:35Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- Next Generation Reservoir Computing [0.0]
Reservoir computing is a best-in-class machine learning algorithm for processing information generated by dynamical systems.
It requires very small training data sets, uses linear optimization, and thus requires minimal computing resources.
Recent results demonstrate the equivalence of reservoir computing to nonlinear vector autoregression.
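That equivalence is concrete: a next-generation reservoir computer is a linear readout over time-delayed inputs and their polynomial combinations. A minimal sketch (the signal, delay count, and regularization below are illustrative):

```python
import numpy as np

# Scalar series to forecast one step ahead (illustrative two-tone signal).
t = np.arange(1200)
x = np.sin(0.07 * t) + 0.5 * np.sin(0.23 * t)

k = 4  # number of delay taps
# Linear features: [x_t, x_{t-1}, x_{t-2}, x_{t-3}]
lin = np.stack([x[k - 1 - d : len(x) - 1 - d] for d in range(k)], axis=1)
# Nonlinear features: all unique quadratic monomials of the linear ones.
quad = np.stack([lin[:, i] * lin[:, j]
                 for i in range(k) for j in range(i, k)], axis=1)
features = np.hstack([np.ones((len(lin), 1)), lin, quad])
target = x[k:]  # one-step-ahead values

# Ridge-regression readout: the only trained part, solved in closed form.
lam = 1e-6
W = np.linalg.solve(features.T @ features + lam * np.eye(features.shape[1]),
                    features.T @ target)
rmse = np.sqrt(np.mean((features @ W - target) ** 2))
print(f"one-step RMSE: {rmse:.2e}")
```

There is no recurrent network at all: the delay taps play the role of the reservoir memory, which is why the training set can be tiny and the optimization purely linear.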
arXiv Detail & Related papers (2021-06-14T18:12:10Z)
- Exploiting Multiple Timescales in Hierarchical Echo State Networks [0.0]
Echo state networks (ESNs) are a powerful form of reservoir computing that only require training of linear output weights.
Here we explore the timescales in hierarchical ESNs, where the reservoir is partitioned into two smaller reservoirs linked with distinct properties.
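A hedged sketch of the two-timescale idea (reservoir sizes, leak rates, and the test signal are invented, and the paper's architecture details differ): a fast leaky-integrator reservoir is driven by the input, a slow one is driven by the fast one's state, and a single linear readout is trained on both.

```python
import numpy as np

rng = np.random.default_rng(0)

def reservoir(inputs, n, leak, rng, rho=0.9):
    """Leaky-integrator echo state network; smaller leak = slower timescale."""
    W = rng.normal(size=(n, n))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius
    w_in = rng.normal(size=(n, inputs.shape[1]))
    r = np.zeros(n)
    states = np.empty((len(inputs), n))
    for t, u in enumerate(inputs):
        r = (1 - leak) * r + leak * np.tanh(W @ r + w_in @ u)
        states[t] = r
    return states

# A signal mixing fast and slow oscillations.
t = np.arange(1500)
x = np.sin(0.2 * t) + np.sin(0.011 * t)

# Two linked reservoirs partitioned by timescale.
fast = reservoir(x[:-1, None], n=50, leak=1.0, rng=rng)
slow = reservoir(fast, n=50, leak=0.05, rng=rng)
feats = np.hstack([fast, slow, np.ones((len(fast), 1))])

# Ridge readout for one-step-ahead prediction (only trained component).
y = x[1:]
lam = 1e-6
W_out = np.linalg.solve(feats.T @ feats + lam * np.eye(feats.shape[1]),
                        feats.T @ y)
rmse = np.sqrt(np.mean((feats @ W_out - y) ** 2))
print(f"one-step RMSE: {rmse:.3f}")
```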
arXiv Detail & Related papers (2021-01-11T22:33:17Z)
- P-CRITICAL: A Reservoir Autoregulation Plasticity Rule for Neuromorphic Hardware [4.416484585765027]
Backpropagation algorithms on recurrent artificial neural networks require an unfolding of accumulated states over time.
We propose a new local plasticity rule named P-CRITICAL designed for automatic reservoir tuning.
We observe an improved performance on tasks coming from various modalities without the need to tune parameters.
arXiv Detail & Related papers (2020-09-11T18:13:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.