Quantum Neural Network Architectures for Multivariate Time-Series Forecasting
- URL: http://arxiv.org/abs/2510.21168v1
- Date: Fri, 24 Oct 2025 05:44:41 GMT
- Title: Quantum Neural Network Architectures for Multivariate Time-Series Forecasting
- Authors: Sandra Ranilla-Cortina, Diego A. Aranda, Jorge Ballesteros, Jesus Bonilla, Nerea Monrio, Elías F. Combarro, Jose Ranilla
- Abstract summary: We introduce strategies that extend variational quantum circuit models toward the multivariate setting. We also introduce a novel quantum transformer architecture that integrates a quantum self-attention mechanism. We show that quantum-based models may achieve competitive or superior forecasting accuracy with fewer trainable parameters.
- Score: 0.08126281861908966
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we address the challenge of multivariate time-series forecasting using quantum machine learning techniques. We introduce adaptation strategies that extend variational quantum circuit models, traditionally limited to univariate data, toward the multivariate setting, exploring both purely quantum and hybrid quantum-classical formulations. First, we extend and benchmark several VQC-based and hybrid architectures to systematically evaluate their capacity to model cross-variable dependencies. Second, building upon these foundations, we introduce the iQTransformer, a novel quantum transformer architecture that integrates a quantum self-attention mechanism within the iTransformer framework, enabling a quantum-native representation of inter-variable relationships. Third, we provide a comprehensive empirical evaluation on both synthetic and real-world datasets, showing that quantum-based models may achieve competitive or superior forecasting accuracy with fewer trainable parameters and faster convergence than state-of-the-art classical and quantum baselines in some cases. These contributions highlight the potential of quantum-enhanced architectures as efficient and scalable tools for advancing multivariate time-series forecasting.
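The abstract describes a quantum self-attention mechanism in which inter-variable relationships are scored by a parameterized quantum circuit, but gives no implementation details. As a rough, hypothetical illustration of the general idea only (a small PQC producing pairwise attention scores between variables, simulated classically; this is not the authors' iQTransformer, and all function names and circuit choices here are assumptions), a minimal NumPy sketch might look like:

```python
import numpy as np

# Single-qubit RY rotation matrix.
def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
ZZ = np.kron(np.diag([1, -1]), np.diag([1, -1]))  # observable Z (x) Z

def attention_score(xi, xj, params):
    """Hypothetical quantum attention score for one variable pair:
    angle-encode xi and xj on two qubits, entangle them, apply a
    trainable rotation layer, and read out <Z (x) Z>."""
    state = np.zeros(4)
    state[0] = 1.0                                          # start in |00>
    state = np.kron(ry(xi), ry(xj)) @ state                 # data encoding
    state = CNOT @ state                                    # entangle qubits
    state = np.kron(ry(params[0]), ry(params[1])) @ state   # trainable layer
    return float(state @ ZZ @ state)                        # value in [-1, 1]

# Toy example: 3 variables, score matrix row-normalized with softmax.
rng = np.random.default_rng(0)
x = rng.uniform(0, np.pi, 3)          # one encoded value per variable
params = rng.uniform(0, np.pi, 2)     # shared trainable angles
scores = np.array([[attention_score(xi, xj, params) for xj in x] for xi in x])
attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
print(attn.shape)  # (3, 3): one attention weight per variable pair
```

In a trainable model the angles in `params` would be optimized alongside the classical layers; here they are fixed at random values purely to show the data flow from variable encodings to a normalized attention matrix.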
Related papers
- Quantum Temporal Fusion Transformer [3.530759252061682]
We introduce the Quantum Temporal Fusion Transformer (QTFT), a quantum-enhanced hybrid quantum-classical architecture. QTFT is successfully trained on the forecasting datasets and is capable of accurately predicting future values. Results indicate the prospect of using quantum computing to boost deep learning architectures in complex machine learning tasks.
arXiv Detail & Related papers (2025-08-06T03:21:20Z)
- VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [60.996803677584424]
Variational Quantum Circuits (VQCs) offer a novel pathway for quantum machine learning. Their practical application is hindered by inherent limitations such as constrained linear expressivity, optimization challenges, and acute sensitivity to quantum hardware noise. This work introduces VQC-MLPNet, a scalable and robust hybrid quantum-classical architecture designed to overcome these obstacles.
arXiv Detail & Related papers (2025-06-12T01:38:15Z)
- Differentiable Quantum Architecture Search in Quantum-Enhanced Neural Network Parameter Generation [4.358861563008207]
Quantum neural networks (QNNs) have shown promise both empirically and theoretically. Hardware imperfections and limited access to quantum devices pose practical challenges. We propose an automated solution using differentiable optimization.
arXiv Detail & Related papers (2025-05-13T19:01:08Z)
- Quantum Adaptive Self-Attention for Quantum Transformer Models [0.0]
We propose Quantum Adaptive Self-Attention (QASA), a novel hybrid architecture that enhances classical Transformer models with a quantum attention mechanism. QASA replaces dot-product attention with a parameterized quantum circuit (PQC) that adaptively captures inter-token relationships in the quantum Hilbert space. Experiments on synthetic time-series tasks demonstrate that QASA achieves faster convergence and superior generalization compared to both standard Transformers and reduced classical variants.
arXiv Detail & Related papers (2025-04-05T02:52:37Z)
- A Survey of Quantum Transformers: Architectures, Challenges and Outlooks [82.4736481748099]
Quantum Transformers integrate the representational power of classical Transformers with the computational advantages of quantum computing. Since 2022, research in this area has rapidly expanded, giving rise to diverse technical paradigms and early applications. This paper presents the first comprehensive, systematic, and in-depth survey of quantum Transformer models.
arXiv Detail & Related papers (2025-04-04T05:40:18Z)
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [62.46800898243033]
Recent progress in quantum learning theory prompts a question: can linear properties of a large-qubit circuit be efficiently learned from measurement data generated by varying classical inputs? We prove that a sample complexity scaling linearly in $d$ is required to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in $d$. We propose a kernel-based method leveraging classical shadows and truncated trigonometric expansions, enabling a controllable trade-off between prediction accuracy and computational overhead.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z)
- Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Łojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Synergy Between Quantum Circuits and Tensor Networks: Short-cutting the Race to Practical Quantum Advantage [43.3054117987806]
We introduce a scalable procedure for harnessing classical computing resources to provide pre-optimized initializations for quantum circuits.
We show this method significantly improves the trainability and performance of PQCs on a variety of problems.
By demonstrating a means of boosting limited quantum resources using classical computers, our approach illustrates the promise of this synergy between quantum and quantum-inspired models in quantum computing.
arXiv Detail & Related papers (2022-08-29T15:24:03Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
- Experimental Quantum Generative Adversarial Networks for Image Generation [93.06926114985761]
We experimentally achieve the learning and generation of real-world hand-written digit images on a superconducting quantum processor.
Our work provides guidance for developing advanced quantum generative models on near-term quantum devices.
arXiv Detail & Related papers (2020-10-13T06:57:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.