Universality conditions of unified classical and quantum reservoir computing
- URL: http://arxiv.org/abs/2401.15067v3
- Date: Mon, 20 May 2024 09:36:44 GMT
- Title: Universality conditions of unified classical and quantum reservoir computing
- Authors: Francesco Monzani, Enrico Prati
- Abstract summary: Reservoir computing is a versatile paradigm in computational neuroscience and machine learning.
We present a unified theoretical framework and propose a ready-made setting to secure universality.
- Score: 1.1510009152620668
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reservoir computing is a versatile paradigm in computational neuroscience and machine learning that exploits the non-linear dynamics of a dynamical system - the reservoir - to efficiently process time-dependent information. Since its introduction, it has exhibited remarkable capabilities in various applications. As is widely known, classes of reservoir computers serve as universal approximators of functionals with fading memory. The construction of such universal classes often appears context-specific, but in fact they follow the same principles. Here we present a unified theoretical framework and propose a ready-made setting to secure universality. We test the result in the emerging context of quantum reservoir computing. The analysis sheds light on a unified view of classical and quantum reservoir computing.
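As a concrete illustration of the paradigm summarized above, the following is a minimal echo-state-network sketch of a classical reservoir computer in plain NumPy. The reservoir size, the spectral-radius rescaling used to enforce the fading-memory (echo-state) property, and the toy target are illustrative assumptions, not the construction studied in the paper.

```python
# Minimal echo-state-network sketch of a classical reservoir computer.
# Sizes, rescaling and the toy fading-memory target are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 2000                                  # reservoir size, sequence length

W_in = rng.uniform(-0.5, 0.5, N)                  # input weights
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))         # rescale so the reservoir has fading memory

u = rng.uniform(0, 0.5, T)                        # input time series
y = np.array([0.3 * u[t] + 0.2 * u[t - 1] * u[t - 2] if t >= 2 else 0.0
              for t in range(T)])                 # toy fading-memory target functional

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])              # non-linear reservoir dynamics
    states[t] = x

# Linear readout trained by ridge regression on the collected reservoir states.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ y)
print("training MSE:", np.mean((states @ W_out - y) ** 2))
```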
Related papers
- Theoretical framework for quantum associative memories [0.8437187555622164]
Associative memory refers to the ability to relate a stored memory to an input, with the goal of restoring corrupted patterns.
We develop a comprehensive framework for a quantum associative memory based on open quantum system dynamics.
arXiv Detail & Related papers (2024-08-26T13:46:47Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
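The circuit-specific kernel of that paper is not reproduced here, but the generic error/cost trade-off of kernel-based regression can be sketched with random Fourier features, where using fewer features lowers the computational cost at the price of a larger prediction error. All parameters below are illustrative assumptions.

```python
# Kernel-ridge-regression sketch with random Fourier features (RBF approximation):
# fewer features -> cheaper model, larger error. Purely illustrative.
import numpy as np

rng = np.random.default_rng(1)
d = 8                                            # stand-in for the d tunable parameters
X = rng.uniform(-np.pi, np.pi, (200, d))
y = np.sin(X).sum(axis=1)                        # toy "linear property" to regress

def rff_fit_mse(n_features, lam=1e-3):
    """Ridge regression on n_features random Fourier features."""
    W = rng.normal(size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, n_features)
    Phi = np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_features), Phi.T @ y)
    return np.mean((Phi @ w - y) ** 2)

for n in (10, 100, 1000):                        # more features: higher cost, lower error
    print(n, rff_fit_mse(n))
```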
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Stochastic Reservoir Computers [0.0]
In stochastic reservoir computing, the number of distinct states of the entire reservoir computer can potentially scale exponentially with the size of the reservoir hardware.
While shot noise limits the performance of stochastic reservoir computing, we show significantly improved performance compared to a reservoir computer with similar hardware in cases where the effects of noise are small.
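The role of shot noise can be illustrated with a toy reservoir whose readout only sees empirical frequencies estimated from a finite number of samples per time step; increasing the shot count weakens the noise and improves the readout. The reservoir model and all sizes are illustrative assumptions.

```python
# Toy illustration of shot noise in a stochastic reservoir readout.
import numpy as np

rng = np.random.default_rng(2)
N, T = 50, 1000
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))

u = rng.uniform(0, 1, T)
y = np.concatenate(([0.0], u[:-1]))              # toy target: recall the previous input

def run(shots):
    p = np.full(N, 0.5)                          # node "on" probabilities
    feats = np.zeros((T, N))
    for t in range(T):
        p = 1.0 / (1.0 + np.exp(-(W @ (p - 0.5) + W_in * u[t])))
        feats[t] = rng.binomial(shots, p) / shots    # finite-shot estimate seen by the readout
    w = np.linalg.lstsq(feats, y, rcond=None)[0]
    return np.mean((feats @ w - y) ** 2)

for shots in (10, 100, 10000):                   # more shots: weaker shot noise, lower error
    print(shots, run(shots))
```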
arXiv Detail & Related papers (2024-05-20T21:26:00Z) - Reservoir Computing with Generalized Readout based on Generalized Synchronization [0.0]
Reservoir computing is a machine learning framework that exploits nonlinear dynamics.
We propose a novel reservoir computing framework with generalized readout, including a nonlinear combination of reservoir variables.
In a numerical study, we find that introducing the generalized readout leads to a significant improvement in accuracy and an unexpected enhancement in robustness.
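A minimal sketch of such a generalized readout is given below: the output layer acts on a non-linear (here quadratic) combination of reservoir variables instead of the raw state. The reservoir and the target task are illustrative assumptions, not the setup of the paper.

```python
# Linear readout vs. a generalized (quadratic) readout on the same reservoir states.
import numpy as np

rng = np.random.default_rng(3)
N, T = 60, 1500
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

u = rng.uniform(-1, 1, T)
y = np.concatenate(([0.0], u[:-1] ** 2))         # toy non-linear memory target

x, states = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

def ridge_mse(features, lam=1e-6):
    w = np.linalg.solve(features.T @ features + lam * np.eye(features.shape[1]),
                        features.T @ y)
    return np.mean((features @ w - y) ** 2)

linear = states                                  # standard linear readout
generalized = np.hstack([states, states ** 2])   # adds quadratic monomials of reservoir variables
print("linear readout MSE:     ", ridge_mse(linear))
print("generalized readout MSE:", ridge_mse(generalized))
```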
arXiv Detail & Related papers (2024-05-03T10:03:59Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
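For context, the classical-shadow primitive that this paradigm builds on can be sketched for a single qubit: random Pauli-basis measurements yield snapshots 3 U†|b⟩⟨b|U − I whose average reconstructs the state. The state, observable and shot count below are illustrative; ShadowNet's neural post-processing is not reproduced.

```python
# Single-qubit classical-shadow estimate of <Z> from random Pauli-basis measurements.
import numpy as np

rng = np.random.default_rng(4)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
S = np.array([[1, 0], [0, 1j]], dtype=complex)

rho = 0.5 * (I2 + 0.3 * X + 0.5 * Z)             # an arbitrary single-qubit state
O = Z                                            # observable of interest

# Rotations mapping the X, Y, Z measurement bases to the computational basis.
basis_rotations = [H, H @ S.conj().T, I2]

snapshots = []
for _ in range(5000):
    U = basis_rotations[rng.integers(3)]
    probs = np.real(np.diag(U @ rho @ U.conj().T))       # Born probabilities of outcomes 0/1
    b = rng.choice(2, p=probs / probs.sum())
    ket = np.zeros((2, 1), dtype=complex)
    ket[b, 0] = 1.0
    snapshots.append(3 * U.conj().T @ ket @ ket.conj().T @ U - I2)   # shadow snapshot

est = np.mean([np.real(np.trace(O @ s)) for s in snapshots])
print("shadow estimate of <Z>:", est, " exact:", np.real(np.trace(O @ rho)))
```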
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive systems not only into simple behavior such as periodicity but also into more complex, arbitrary dynamics.
We show first that classical reservoir computing excels at this task.
In a next step, we compare those results, obtained with different amounts of training data, to an alternative setup in which next-generation reservoir computing is used instead.
It turns out that, while delivering comparable performance for usual amounts of training data, next-generation RC significantly outperforms classical RC in situations where only very limited data are available.
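Next-generation reservoir computing replaces the recurrent reservoir with low-order polynomial features of time-delayed inputs, as in the hedged sketch below. The number of delay taps, the polynomial order and the toy task are illustrative assumptions.

```python
# Next-generation-RC-style feature construction: delay taps plus quadratic monomials.
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(5)
T, k = 2000, 3                                   # sequence length, number of delay taps
u = rng.uniform(-1, 1, T)
y = np.array([0.4 * u[t - 1] + 0.3 * u[t - 2] * u[t - 1] if t >= 2 else 0.0
              for t in range(T)])                # toy target with memory and non-linearity

# Delay embedding: [u_t, u_{t-1}, u_{t-2}] (first rows zeroed as warm-up)
lin = np.column_stack([np.roll(u, d) for d in range(k)])
lin[:k] = 0.0
# Quadratic monomials of the delayed inputs play the role of the non-linear reservoir.
quad = np.column_stack([lin[:, i] * lin[:, j]
                        for i, j in combinations_with_replacement(range(k), 2)])
features = np.hstack([np.ones((T, 1)), lin, quad])

lam = 1e-6
w = np.linalg.solve(features.T @ features + lam * np.eye(features.shape[1]),
                    features.T @ y)
print("NG-RC training MSE:", np.mean((features @ w - y) ** 2))
```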
arXiv Detail & Related papers (2023-07-14T07:05:17Z) - Learnability with Time-Sharing Computational Resource Concerns [65.268245109828]
We present a theoretical framework that takes into account the influence of computational resources in learning theory.
This framework can be naturally applied to stream learning where the incoming data streams can be potentially endless.
It may also provide a theoretical perspective for the design of intelligent supercomputing operating systems.
arXiv Detail & Related papers (2023-05-03T15:54:23Z) - The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating these non-linear aspects of deep machine learning.
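The underlying path-kernel idea, averaging the tangent (gradient) kernel over the whole training trajectory rather than evaluating it at a single parameter vector, can be shown with a small classical toy; the model, data and optimizer below are illustrative assumptions and not the quantum circuit construction of the paper.

```python
# Classical toy of a path kernel: average of gradient kernels along a training trajectory.
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(30, 2))
y = np.sign(X[:, 0] * X[:, 1])

def model(theta, x):
    return np.tanh(x @ theta[:2]) * theta[2]

def grad(theta, x):
    s = x @ theta[:2]
    return np.array([x[0] * theta[2] / np.cosh(s) ** 2,
                     x[1] * theta[2] / np.cosh(s) ** 2,
                     np.tanh(s)])

theta = rng.normal(size=3)
checkpoints = [theta.copy()]
lr = 0.05
for _ in range(50):                              # plain gradient descent on squared loss
    g = sum((model(theta, x) - t) * grad(theta, x) for x, t in zip(X, y))
    theta -= lr * g / len(X)
    checkpoints.append(theta.copy())

def tangent_kernel(th):
    G = np.array([grad(th, x) for x in X])
    return G @ G.T

K_path = np.mean([tangent_kernel(th) for th in checkpoints], axis=0)  # path kernel
K_ntk = tangent_kernel(checkpoints[-1])                               # end-point tangent kernel
print(K_path.shape, K_ntk.shape)
```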
arXiv Detail & Related papers (2022-12-22T16:06:24Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
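A state-vector sketch of single-qubit data re-uploading is given below: the same input is injected repeatedly through parameterized rotations and the class is read from the |0> population. It is written in plain NumPy rather than the Qiskit SDK used in the paper; the layer count and parameters are illustrative assumptions.

```python
# Single-qubit data re-uploading classifier, simulated with 2x2 state vectors.
import numpy as np

def ry(a):
    return np.array([[np.cos(a / 2), -np.sin(a / 2)],
                     [np.sin(a / 2),  np.cos(a / 2)]])

def rz(a):
    return np.array([[np.exp(-1j * a / 2), 0],
                     [0, np.exp(1j * a / 2)]])

def classify(x, params):
    """params has shape (L, 2): per layer, trainable offsets for RY and RZ."""
    psi = np.array([1.0 + 0j, 0.0])              # start in |0>
    for w_y, w_z in params:
        psi = rz(w_z + x) @ ry(w_y + x) @ psi    # re-upload the same datum x in every layer
    return abs(psi[0]) ** 2                      # probability of measuring |0>

rng = np.random.default_rng(7)
params = rng.uniform(-np.pi, np.pi, (4, 2))      # 4 re-uploading layers
for x in (-1.0, 0.0, 1.0):
    print(x, classify(x, params))
```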
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Natural quantum reservoir computing for temporal information processing [4.785845498722406]
Reservoir computing is a temporal information processing system that exploits artificial or physical dissipative dynamics.
This paper proposes the use of real superconducting quantum computing devices as the reservoir, where the dissipative property is provided by the natural noise acting on the quantum bits.
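A density-matrix toy of this idea is sketched below: an input-dependent rotation drives a single qubit, an amplitude-damping channel stands in for the hardware noise that supplies dissipation, and Pauli expectations serve as readout features. All rates and the drive are illustrative assumptions, not a model of a real superconducting device.

```python
# Dissipative single-qubit reservoir: input-dependent drive + amplitude damping.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def rx(a):
    return np.cos(a / 2) * I2 - 1j * np.sin(a / 2) * X

def step(rho, u, g=0.2):
    U = rx(0.5 + u)                                               # input-dependent drive
    rho = U @ rho @ U.conj().T
    K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)   # amplitude damping as a
    K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)       # stand-in for hardware noise
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

rng = np.random.default_rng(8)
u_seq = rng.uniform(0, 1, 200)
rho = np.array([[1, 0], [0, 0]], dtype=complex)                   # start in |0><0|
feats = []
for u in u_seq:
    rho = step(rho, u)
    feats.append([np.real(np.trace(P @ rho)) for P in (X, Y, Z)])  # readout features
print(np.array(feats)[:3])
```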
arXiv Detail & Related papers (2021-07-13T01:58:57Z) - Gaussian states of continuous-variable quantum systems provide universal and versatile reservoir computing [0.0]
We consider reservoir computing, an efficient framework for online time series processing.
We find that encoding the input time series into Gaussian states is both a source and a means to tune the nonlinearity of the overall input-output map.
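The tuning of nonlinearity through the encoding can be illustrated with a two-mode Gaussian toy: the input is re-encoded into mode 1 at every step, a fixed beamsplitter-like symplectic map mixes the modes, and the readout sees the moments of mode 2. Encoding in the mean gives features linear in the input, while encoding in the variance makes them non-linear. Everything below is an illustrative assumption, not the construction of the paper.

```python
# Two-mode Gaussian reservoir toy: mean encoding (linear) vs. variance encoding (non-linear).
import numpy as np

theta = 0.6
c, s = np.cos(theta), np.sin(theta)
S = np.array([[c, 0, s, 0],                      # passive symplectic map mixing the modes
              [0, c, 0, s],
              [-s, 0, c, 0],
              [0, -s, 0, c]])

def run(u_seq, encode):
    mean, cov = np.zeros(4), np.eye(4)
    feats = []
    for u in u_seq:
        # re-encode the input into mode 1 (entries 0,1 of the mean / covariance)
        if encode == "mean":
            mean[0:2], cov[0:2, 0:2] = [u, 0.0], np.eye(2)
        else:                                     # "variance": input enters through the spread
            mean[0:2], cov[0:2, 0:2] = [0.0, 0.0], np.diag([np.exp(u), np.exp(-u)])
        cov[0:2, 2:4] = 0.0
        cov[2:4, 0:2] = 0.0
        mean, cov = S @ mean, S @ cov @ S.T
        feats.append([mean[2], cov[2, 2]])        # moments of mode 2 seen by the readout
    return np.array(feats)

u_seq = np.linspace(-1, 1, 5)
print(run(u_seq, "mean"))
print(run(u_seq, "variance"))
```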
arXiv Detail & Related papers (2020-06-08T18:00:03Z)