Task Agnostic Metrics for Reservoir Computing
- URL: http://arxiv.org/abs/2108.01512v1
- Date: Tue, 3 Aug 2021 13:58:11 GMT
- Title: Task Agnostic Metrics for Reservoir Computing
- Authors: Jake Love, Jeroen Mulkers, George Bourianoff, Jonathan Leliaert and
Karin Everschor-Sitte
- Abstract summary: Physical reservoir computing is a computational paradigm that enables temporal pattern recognition in physical matter.
The chosen dynamical system must have three desirable properties: non-linearity, complexity, and fading memory.
We show that, in general, systems with lower damping reach higher values in all three performance metrics.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physical reservoir computing is a computational paradigm that enables
temporal pattern recognition to be performed directly in physical matter. By
exciting non-linear dynamical systems and linearly classifying their changes in
state, we can create highly energy-efficient devices capable of solving machine
learning tasks without the need to build a modular system consisting of
millions of neurons interconnected by synapses. The chosen dynamical system
must have three desirable properties: non-linearity, complexity, and fading
memory to act as an effective reservoir. We present task agnostic quantitative
measures for each of these three requirements and exemplify them for two
reservoirs: an echo state network and a simulated magnetic skyrmion-based
reservoir. We show that, in general, systems with lower damping reach higher
values in all three performance metrics, whilst for the input signal strength
there is a natural trade-off between the memory capacity and the non-linearity
of the reservoir's behaviour. In contrast to typical task-dependent reservoir
computing benchmarks, these metrics can be evaluated in parallel from a single
input signal, drastically speeding up the parameter search to design efficient
and high-performance reservoirs.
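As an illustration of the paradigm, and of how a task-agnostic measure can be read off from a single input signal, the sketch below drives a small echo state network with a random input and estimates its linear memory capacity from linear readouts. This is a minimal sketch under common ESN conventions; the reservoir size, spectral radius, washout length, input scaling and the squared-correlation definition of memory capacity are illustrative assumptions, not the parameters or exact metrics used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T, washout, max_delay = 200, 4000, 200, 40

    # Random input weights and recurrent matrix, rescaled to a spectral radius
    # below one so the network (heuristically) has fading memory.
    W_in = rng.uniform(-0.5, 0.5, size=N)
    W = rng.normal(size=(N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

    # Excite the non-linear reservoir with a single random input signal
    # and record its state trajectory.
    u = rng.uniform(-1.0, 1.0, size=T)
    states = np.zeros((T, N))
    x = np.zeros(N)
    for t in range(T):
        x = np.tanh(W @ x + W_in * u[t])
        states[t] = x

    # Task-agnostic linear memory capacity: for each delay k, fit a linear
    # readout that reconstructs u[t - k] from the current state, then sum the
    # squared correlations between prediction and target over all delays.
    X = states[washout:]
    capacity = 0.0
    for k in range(1, max_delay + 1):
        target = u[washout - k: T - k]
        w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
        pred = X @ w_out
        capacity += np.corrcoef(pred, target)[0, 1] ** 2

    print(f"estimated linear memory capacity: {capacity:.2f}")

Re-running the sketch with a larger input amplitude or a smaller spectral radius shows how the estimated memory capacity shifts, which is the kind of trade-off against input signal strength that the abstract describes.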
Related papers
- Brain-Inspired Reservoir Computing Using Memristors with Tunable
Dynamics and Short-Term Plasticity [0.0]
We show that reservoir layers constructed with a small number of distinct memristors exhibit significantly higher predictive and classification accuracies with a single data encoding.
In a neural activity classification task, a reservoir of just three distinct memristors experimentally attained an accuracy of 96.5%.
arXiv Detail & Related papers (2023-10-25T03:27:43Z)
- Heterogenous Memory Augmented Neural Networks [84.29338268789684]
We introduce a novel heterogeneous memory augmentation approach for neural networks.
By introducing learnable memory tokens with an attention mechanism, we can effectively boost performance without huge computational overhead.
We demonstrate our approach on various image and graph-based tasks under both in-distribution (ID) and out-of-distribution (OOD) conditions.
arXiv Detail & Related papers (2023-10-17T01:05:28Z)
- Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive systems not only into simple behaviour such as periodicity but also into more complex, arbitrary dynamics.
We first show that classical reservoir computing excels at this task.
We then compare these results, obtained with different amounts of training data, to an alternative setup in which next-generation reservoir computing is used instead.
While the two approaches deliver comparable performance for typical amounts of training data, next-generation RC significantly outperforms classical RC when only very limited data is available.
arXiv Detail & Related papers (2023-07-14T07:05:17Z)
- Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z)
- ETLP: Event-based Three-factor Local Plasticity for online learning with neuromorphic hardware [105.54048699217668]
We show that Event-Based Three-factor Local Plasticity (ETLP) achieves competitive accuracy with a clear advantage in computational complexity.
We also show that, when using local plasticity, threshold adaptation in spiking neurons and a recurrent topology are necessary to learn temporal patterns with a rich temporal structure.
arXiv Detail & Related papers (2023-01-19T19:45:42Z)
- Reservoir Computing Using Complex Systems [0.0]
Reservoir Computing is a machine learning framework for utilising physical systems for computation.
We show how a single node reservoir can be employed for computation and explore the available options to improve the computational capability of the physical reservoirs.
arXiv Detail & Related papers (2022-12-17T00:25:56Z)
- Learning unseen coexisting attractors [0.0]
Reservoir computing is a machine learning approach that can generate a surrogate model of a dynamical system.
Here, we study the challenging problem of learning a dynamical system that has both disparate time scales and multiple co-existing dynamical states (attractors).
We show that the next-generation reservoir computing approach uses $\sim 1.7\times$ less training data, requires a $10^3\times$ shorter warm-up time, and has a $\sim 100\times$ higher accuracy in predicting the co-existing attractor characteristics.
arXiv Detail & Related papers (2022-07-28T14:55:14Z)
- Master memory function for delay-based reservoir computers with single-variable dynamics [0.0]
We show that many delay-based reservoir computers can be characterized by a universal master memory function (MMF).
Once computed for two independent parameters, this function provides the linear memory capacity for any delay-based single-variable reservoir with small inputs.
arXiv Detail & Related papers (2021-08-28T13:17:24Z)
- Linear embedding of nonlinear dynamical systems and prospects for efficient quantum algorithms [74.17312533172291]
We describe a method for mapping any finite nonlinear dynamical system to an infinite linear dynamical system (embedding).
We then explore an approach for approximating the resulting infinite linear system with finite linear systems (truncation); a minimal sketch of this embed-and-truncate idea is given after this list.
arXiv Detail & Related papers (2020-12-12T00:01:10Z)
- The Computational Capacity of LRC, Memristive and Hybrid Reservoirs [1.657441317977376]
Reservoir computing is a machine learning paradigm that uses a high-dimensional dynamical system, or reservoir, to approximate and predict time series data.
We analyze the feasibility and optimal design of electronic reservoirs that include both linear elements (resistors, inductors, and capacitors) and nonlinear memory elements called memristors.
Our electronic reservoirs can match or exceed the performance of conventional "echo state network" reservoirs in a form that may be directly implemented in hardware.
arXiv Detail & Related papers (2020-08-31T21:24:45Z)
- Active Learning for Nonlinear System Identification with Guarantees [102.43355665393067]
We study a class of nonlinear dynamical systems whose state transitions depend linearly on a known feature embedding of state-action pairs.
We propose an active learning approach that achieves this by repeating three steps: trajectory planning, trajectory tracking, and re-estimation of the system from all available data.
We show that our method estimates nonlinear dynamical systems at a parametric rate, similar to the statistical rate of standard linear regression.
arXiv Detail & Related papers (2020-06-18T04:54:11Z)
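The "Linear embedding of nonlinear dynamical systems" entry above describes an embed-and-truncate approach; the sketch below illustrates that idea on an assumed toy system dx/dt = -x + x^2 (the example system, truncation order and integrator are illustrative choices, not taken from that paper). The monomials y_k = x^k obey the infinite linear system dy_k/dt = -k y_k + k y_{k+1}, which is truncated at order M and integrated alongside the original nonlinear dynamics for comparison.

    import numpy as np

    # Toy nonlinear system (assumed example): dx/dt = -x + x^2.
    # Embedding: y_k = x^k satisfies dy_k/dt = k*x^(k-1)*(-x + x^2)
    #                                        = -k*y_k + k*y_{k+1},
    # an infinite linear system that we truncate at order M.
    M, dt, steps, x0 = 8, 1e-3, 2000, 0.4

    # Truncated M x M generator: the coupling of y_M to y_{M+1} is dropped.
    A = np.zeros((M, M))
    for k in range(1, M + 1):
        A[k - 1, k - 1] = -k
        if k < M:
            A[k - 1, k] = k

    y = np.array([x0 ** k for k in range(1, M + 1)])  # initial monomials
    x_ref = x0
    for _ in range(steps):
        y = y + dt * (A @ y)                  # truncated linear dynamics
        x_ref += dt * (-x_ref + x_ref ** 2)   # direct nonlinear integration

    print(f"truncated linear estimate of x(T): {y[0]:.5f}")
    print(f"direct nonlinear integration:      {x_ref:.5f}")

For |x0| < 1 the neglected monomial y_{M+1} is small and decays, so the truncated linear system tracks the nonlinear reference closely; increasing M tightens the agreement.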