QuaRK: A Quantum Reservoir Kernel for Time Series Learning
- URL: http://arxiv.org/abs/2602.13531v1
- Date: Sat, 14 Feb 2026 00:04:52 GMT
- Title: QuaRK: A Quantum Reservoir Kernel for Time Series Learning
- Authors: Abdallah Aaraba, Soumaya Cherkaoui, Ola Ahmad, Shengrui Wang
- Abstract summary: QuaRK is an end-to-end framework that couples a hardware-realistic quantum reservoir featurizer with a kernel-based readout scheme. We provide learning-theoretic guarantees for dependent temporal data, linking design and resource choices to finite-sample performance.
- Score: 11.195918728130088
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum reservoir computing offers a promising route for time series learning: sequential data are modelled via rich quantum dynamics, while the only training required happens in a lightweight classical readout. However, studies featuring efficient, implementable quantum reservoir architectures together with model learning guarantees remain scarce in the literature. To close this gap, we introduce QuaRK, an end-to-end framework that couples a hardware-realistic quantum reservoir featurizer with a kernel-based readout scheme. Given a sequence of sample points, the reservoir ingests the points one after another and yields a compact feature vector of efficiently measured k-local observables obtained via classical shadow tomography; a classical kernel-based readout then learns the target mapping with explicit regularization and fast optimization. The resulting pipeline exposes clear computational knobs -- circuit width and depth as well as the measurement budget -- while preserving the flexibility of kernel methods to model nonlinear temporal functionals and scaling to high-dimensional data. We further provide learning-theoretic generalization guarantees for dependent temporal data, linking design and resource choices to finite-sample performance, thereby offering principled guidance for building reliable temporal learners. Empirical experiments validate QuaRK and illustrate the predicted interpolation and generalization behaviours on synthetic beta-mixing time series tasks.
Related papers
- Minimal Quantum Reservoirs with Hamiltonian Encoding [72.27323884094953]
We investigate a minimal architecture for quantum reservoir computing based on Hamiltonian encoding. This approach circumvents many of the experimental overheads typically associated with quantum machine learning.
arXiv Detail & Related papers (2025-05-28T16:50:05Z)
- Unraveling Quantum Environments: Transformer-Assisted Learning in Lindblad Dynamics [0.0]
We introduce a Transformer-based machine learning framework to infer time-dependent dissipation rates in quantum systems. We demonstrate the effectiveness of our approach on a hierarchy of open quantum models of increasing complexity. Our results suggest that modern machine learning tools can serve as scalable and data-driven alternatives for identifying unknown environments in open quantum systems.
arXiv Detail & Related papers (2025-05-11T10:18:19Z)
- Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization [74.3339999119713]
We develop a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies. Our method first scales and decomposes the input time series, then thresholds and quantizes the wavelet coefficients, and finally pre-trains an autoregressive model to forecast coefficients for the forecast horizon.
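The described pipeline (scale, decompose, threshold, quantize) can be sketched with the simplest wavelet, the Haar transform; the threshold value, bin edges, and function names here are illustrative assumptions, not the paper's actual tokenizer.

```python
import numpy as np

def haar_decompose(x, levels=3):
    """Multi-level Haar wavelet decomposition (simplest stand-in for the
    paper's wavelets). Requires len(x) divisible by 2**levels."""
    coeffs = []
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        coeffs.append((even - odd) / np.sqrt(2))  # detail coefficients
        approx = (even + odd) / np.sqrt(2)        # approximation for next level
    coeffs.append(approx)
    return coeffs

def tokenize(x, levels=3, thresh=0.1, n_bins=16):
    """Scale, decompose, threshold, and quantize -- mirroring the described
    pipeline. Threshold and bin edges are illustrative choices."""
    x = np.asarray(x, dtype=float)
    x = x / (np.max(np.abs(x)) + 1e-12)                  # scale to [-1, 1]
    flat = np.concatenate(haar_decompose(x, levels))
    flat = np.where(np.abs(flat) < thresh, 0.0, flat)    # hard-threshold small coefficients
    edges = np.linspace(-2.0, 2.0, n_bins - 1)           # uniform quantization bins
    return np.digitize(flat, edges)                      # integer token ids
```

The resulting integer sequence is what an autoregressive model would then be pre-trained to forecast.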
arXiv Detail & Related papers (2024-12-06T18:22:59Z)
- Memory-Augmented Hybrid Quantum Reservoir Computing [0.0]
We present a hybrid quantum-classical approach that implements memory through classical post-processing of quantum measurements.
We tested our model on two physical platforms: a fully connected Ising model and a Rydberg atom array.
arXiv Detail & Related papers (2024-09-15T22:44:09Z)
- Self-STORM: Deep Unrolled Self-Supervised Learning for Super-Resolution Microscopy [55.2480439325792]
We introduce deep unrolled self-supervised learning, which alleviates the need for such data by training a sequence-specific, model-based autoencoder.
Our proposed method exceeds the performance of its supervised counterparts.
arXiv Detail & Related papers (2024-03-25T17:40:32Z)
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the Qiskit quantum computing SDK.
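Single-qubit data re-uploading can be sketched with plain 2x2 unitaries; the alternating Ry parametrization below is one common illustrative choice, not necessarily this paper's exact circuit.

```python
import numpy as np

def ry(a):
    """Single-qubit rotation about the Y axis."""
    return np.array([[np.cos(a / 2), -np.sin(a / 2)],
                     [np.sin(a / 2),  np.cos(a / 2)]], dtype=complex)

Z = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)

def reuploading_expectation(x, thetas):
    """<Z> after alternating data and trainable rotations on |0>:
    psi = Ry(theta_L) Ry(x) ... Ry(theta_1) Ry(x) |0>.
    Re-uploading x in every layer is what makes the model nonlinear in x."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for th in thetas:
        psi = ry(th) @ (ry(x) @ psi)
    return float(np.real(psi.conj() @ (Z @ psi)))
```

With all trainable angles at zero the circuit collapses to Ry(L*x), so the expectation reduces to cos(L*x), which gives a quick sanity check on the implementation.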
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Quantum evolution kernel: Machine learning on graphs with programmable arrays of qubits [0.0]
We introduce a procedure for measuring the similarity between graph-structured data, based on the time-evolution of a quantum system.
By encoding the topology of the input graph in the Hamiltonian of the system, the evolution produces measurement samples that retain key features of the data.
We show numerically that this scheme performs well compared to standard graph kernels on typical benchmark datasets.
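A toy classical simulation of this idea fits in a few lines; every design choice here (transverse-field Ising encoding, the |+...+> initial state, Bhattacharyya overlap of bitstring distributions) is an assumption for illustration rather than the paper's actual protocol.

```python
import numpy as np
from itertools import combinations

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
I2 = np.eye(2)

def kron_list(ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

def graph_hamiltonian(adj, field=1.0):
    """H = sum_{(i,j) in E} Z_i Z_j + field * sum_i X_i: the graph's
    topology is encoded in the ZZ couplings."""
    n = len(adj)
    H = np.zeros((2**n, 2**n))
    for i, j in combinations(range(n), 2):
        if adj[i][j]:
            H += kron_list([Z if k in (i, j) else I2 for k in range(n)])
    for i in range(n):
        H += field * kron_list([X if k == i else I2 for k in range(n)])
    return H

def measurement_distribution(adj, t=1.0):
    """Computational-basis probabilities after evolving |+...+> under H."""
    n = len(adj)
    vals, vecs = np.linalg.eigh(graph_hamiltonian(adj))
    psi0 = np.full(2**n, 2 ** (-n / 2))
    psi_t = vecs @ (np.exp(-1j * t * vals) * (vecs.conj().T @ psi0))
    return np.abs(psi_t) ** 2

def evolution_kernel(adj_a, adj_b, t=1.0):
    """Graph similarity as the Bhattacharyya overlap of the two measurement
    distributions (graphs must have the same number of nodes here)."""
    pa, pb = measurement_distribution(adj_a, t), measurement_distribution(adj_b, t)
    return float(np.sum(np.sqrt(pa * pb)))
```

The kernel equals 1 for identical graphs and decreases as the evolved measurement statistics diverge.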
arXiv Detail & Related papers (2021-07-07T14:25:18Z)
- A Temporal Kernel Approach for Deep Learning with Continuous-time Information [18.204325860752768]
Sequential deep learning models such as RNNs, causal CNNs, and attention mechanisms do not readily consume continuous-time information.
Discretizing the temporal data, as we show, causes inconsistency even for simple continuous-time processes.
We provide a principled way to characterize continuous-time systems using deep learning tools.
arXiv Detail & Related papers (2021-03-28T20:13:53Z)
- Physics Informed Deep Kernel Learning [24.033468062984458]
Physics Informed Deep Kernel Learning (PI-DKL) exploits physics knowledge represented by differential equations with latent sources.
For efficient and effective inference, we marginalize out the latent variables and derive a collapsed model evidence lower bound (ELBO).
arXiv Detail & Related papers (2020-06-08T22:43:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.