Quantum Kernel-Based Long Short-term Memory for Climate Time-Series Forecasting
- URL: http://arxiv.org/abs/2412.08851v1
- Date: Thu, 12 Dec 2024 01:16:52 GMT
- Title: Quantum Kernel-Based Long Short-term Memory for Climate Time-Series Forecasting
- Authors: Yu-Chao Hsu, Nan-Yow Chen, Tai-Yu Li, Po-Heng Lee, Kuan-Cheng Chen
- Abstract summary: We present the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network, which integrates quantum kernel methods into classical LSTM architectures. QK-LSTM captures intricate nonlinear dependencies and temporal dynamics with fewer trainable parameters.
- Score: 0.24739484546803336
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network, which integrates quantum kernel methods into classical LSTM architectures to enhance predictive accuracy and computational efficiency in climate time-series forecasting tasks, such as Air Quality Index (AQI) prediction. By embedding classical inputs into high-dimensional quantum feature spaces, QK-LSTM captures intricate nonlinear dependencies and temporal dynamics with fewer trainable parameters. Leveraging quantum kernel methods allows for efficient computation of inner products in quantum spaces, addressing the computational challenges faced by classical models and variational quantum circuit-based models. Designed for the Noisy Intermediate-Scale Quantum (NISQ) era, QK-LSTM supports scalable hybrid quantum-classical implementations. Experimental results demonstrate that QK-LSTM outperforms classical LSTM networks in AQI forecasting, showcasing its potential for environmental monitoring and resource-constrained scenarios, while highlighting the broader applicability of quantum-enhanced machine learning frameworks in tackling large-scale, high-dimensional climate datasets.
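As a rough illustration of the mechanism the abstract describes (embedding classical inputs into a quantum feature space and evaluating inner products there), here is a minimal PennyLane sketch of a fidelity-style quantum kernel. The qubit count and the use of AngleEmbedding are illustrative assumptions, not the paper's exact QK-LSTM circuit.

```python
# Minimal sketch of a fidelity-style quantum kernel, assuming a simple
# angle-encoding feature map. Not the paper's exact QK-LSTM circuit.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    # Embed x1, then apply the inverse embedding of x2: the probability of
    # observing |0...0> equals |<phi(x2)|phi(x1)>|^2, the quantum kernel value.
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(x1, x2):
    return kernel_circuit(x1, x2)[0]  # P(|0...0>)

x_a = np.array([0.1, 0.5, 0.2, 0.9])
x_b = np.array([0.3, 0.1, 0.7, 0.4])
print(quantum_kernel(x_a, x_a))  # 1.0: identical inputs give kernel value 1
print(quantum_kernel(x_a, x_b))  # in (0, 1): similarity in feature space
```

In a QK-LSTM, kernel values of this kind would stand in for the inner products inside the LSTM gate computations.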
Related papers
- Toward Practical Quantum Machine Learning: A Novel Hybrid Quantum LSTM for Fraud Detection [0.1398098625978622]
We present a novel hybrid quantum-classical neural network architecture for fraud detection.
By leveraging quantum phenomena such as superposition and entanglement, our model enhances the feature representation of sequential transaction data.
Results demonstrate competitive improvements in accuracy, precision, recall, and F1 score relative to a conventional LSTM baseline.
arXiv Detail & Related papers (2025-04-30T19:09:12Z)
- Federated Quantum-Train Long Short-Term Memory for Gravitational Wave Signal [3.360429911727189]
We present Federated QT-LSTM, a novel framework that combines the Quantum-Train (QT) methodology with Long Short-Term Memory (LSTM) networks in a federated learning setup.
By leveraging quantum neural networks (QNNs) to generate classical LSTM model parameters during training, the framework effectively addresses challenges in model compression, scalability, and computational efficiency.
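A toy sketch of the Quantum-Train idea summarized above, in which a small QNN's measurement probabilities are mapped to the weights of a classical model, so that only the QNN's few rotation angles are trained. The circuit, the sizes, and the fixed affine mapping are illustrative assumptions, not the paper's exact design.

```python
# Toy sketch: a QNN's 2**n measurement probabilities generate classical
# weights; only the n rotation angles are trainable. Sizes and the mapping
# are assumptions for illustration.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 3                 # a 3-qubit QNN yields 2**3 = 8 probabilities
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn(angles):
    for w in range(n_qubits):
        qml.RY(angles[w], wires=w)
    for w in range(n_qubits - 1):
        qml.CNOT(wires=[w, w + 1])
    return qml.probs(wires=range(n_qubits))

def generate_classical_weights(angles):
    # Rescale the probabilities (which sum to 1) into roughly zero-centered
    # classical weights; a fixed affine map stands in for a learned mapping.
    return 2.0 * qnn(angles) - 0.25

angles = np.array([0.4, 1.1, 2.0], requires_grad=True)
print(generate_classical_weights(angles))  # 8 weights from only 3 angles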
arXiv Detail & Related papers (2025-03-20T11:34:13Z)
- Toward Large-Scale Distributed Quantum Long Short-Term Memory with Modular Quantum Computers [5.673361333697935]
We introduce a Distributed Quantum Long Short-Term Memory (QLSTM) framework to address scalability challenges on Noisy Intermediate-Scale Quantum (NISQ) devices.
QLSTM captures long-range temporal dependencies, while a distributed architecture partitions the underlying Variational Quantum Circuits into smaller, manageable subcircuits.
We demonstrate that the distributed QLSTM achieves stable convergence and improved training dynamics compared to classical approaches.
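The following is a minimal, hedged sketch of the partitioning idea: rather than one large VQC, the feature vector is split across two smaller subcircuits whose measurement results are combined classically. Real distributed QLSTM partitions circuits across modular quantum processors; the sizes and the combination rule here are assumptions.

```python
# Sketch: two small subcircuits stand in for one large VQC; their outputs
# are concatenated classically. Sizes and templates are assumptions.
import pennylane as qml
from pennylane import numpy as np

n_sub = 2                                    # qubits per subcircuit
dev = qml.device("default.qubit", wires=n_sub)

@qml.qnode(dev)
def subcircuit(v, weights):
    qml.AngleEmbedding(v, wires=range(n_sub))
    qml.BasicEntanglerLayers(weights, wires=range(n_sub))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_sub)]

def partitioned_vqc(v, weights_a, weights_b):
    # Each half of the input runs on its own small device/module.
    out_a = subcircuit(v[:n_sub], weights_a)
    out_b = subcircuit(v[n_sub:], weights_b)
    return np.concatenate([np.array(out_a), np.array(out_b)])

shape = qml.BasicEntanglerLayers.shape(n_layers=1, n_wires=n_sub)
w_a = np.random.uniform(0, np.pi, size=shape)
w_b = np.random.uniform(0, np.pi, size=shape)
print(partitioned_vqc(np.array([0.1, 0.2, 0.3, 0.4]), w_a, w_b))
```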
arXiv Detail & Related papers (2025-03-18T10:07:34Z)
- Programming Variational Quantum Circuits with Quantum-Train Agent [3.360429911727189]
The Quantum-Train Quantum Fast Weight Programmer (QT-QFWP) framework is proposed, which facilitates the efficient and scalable programming of variational quantum circuits (VQCs). This approach offers a significant advantage over conventional hybrid quantum-classical models by optimizing both quantum and classical parameter management. QT-QFWP outperforms related models in both efficiency and predictive accuracy, providing a pathway toward more practical and cost-effective quantum machine learning applications.
arXiv Detail & Related papers (2024-12-02T06:26:09Z)
- Quantum Kernel-Based Long Short-term Memory [0.30723404270319693]
We introduce the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network to capture complex, non-linear patterns in sequential data.
This quantum-enhanced architecture demonstrates efficient convergence, robust loss minimization, and model compactness.
Benchmark comparisons reveal that QK-LSTM achieves performance on par with classical LSTM models, yet with fewer parameters.
arXiv Detail & Related papers (2024-11-20T11:39:30Z)
- LatentQGAN: A Hybrid QGAN with Classical Convolutional Autoencoder [7.945302052915863]
A potential application of quantum machine learning is to harness the power of quantum computers for generating classical data.
We propose LatentQGAN, a novel quantum model that uses a hybrid quantum-classical GAN coupled with an autoencoder.
arXiv Detail & Related papers (2024-09-22T23:18:06Z)
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G − d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- Machine-Learning Insights on Entanglement-trainability Correlation of Parameterized Quantum Circuits [17.975555487972166]
Variational quantum algorithms (VQAs) have emerged as the leading strategy to obtain quantum advantage on the current noisy intermediate-scale devices.
Their entanglement-trainability correlation, as the major reason for the barren plateau (BP) phenomenon, poses a challenge to their applications.
In this Letter, we suggest a gate-to-tensor (GTT) encoding method for parameterized quantum circuits (PQCs).
Two long short-term memory networks (L-G networks) are trained to predict both entanglement and trainability.
arXiv Detail & Related papers (2024-06-04T06:28:05Z)
- A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z)
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
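For context, a hedged sketch of the federated-averaging step that such a QFL framework typically builds on: each client trains its own copy of the (Q)LSTM parameters locally, and a server averages them. The plain-FedAvg aggregation shown is an assumption; the paper's exact protocol may differ.

```python
# Sketch of plain federated averaging over per-client parameter dicts.
# The aggregation rule is an assumption, not the paper's exact protocol.
import numpy as np

def fedavg(client_params):
    """Average a list of parameter dicts (one per client) element-wise."""
    keys = client_params[0].keys()
    return {k: np.mean([p[k] for p in client_params], axis=0) for k in keys}

# Three clients, each holding its own copy of (hypothetical) VQC angles.
clients = [{"vqc_angles": np.random.uniform(0, np.pi, (2, 4))} for _ in range(3)]
global_params = fedavg(clients)   # broadcast back to clients for the next round
print(global_params["vqc_angles"].shape)  # (2, 4)
```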
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
- QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
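A hedged, classical-post-processing sketch of the kernel-as-attention idea: pairwise quantum kernel values between sequence elements act as unnormalized attention scores. QKSAN's actual construction applies the Deferred Measurement Principle and conditional measurements inside the circuit; this simplified variant is only an illustration.

```python
# Sketch: quantum kernel values as attention scores, normalized classically.
# QKSAN itself works inside the circuit; this is only an illustration.
import pennylane as qml
import numpy as np

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(x1, x2):
    return float(kernel_circuit(x1, x2)[0])   # P(|00>) = fidelity kernel

def kernel_self_attention(seq):
    # scores[i, j]: quantum-kernel similarity between sequence elements i, j
    n = len(seq)
    scores = np.array([[quantum_kernel(seq[i], seq[j]) for j in range(n)]
                       for i in range(n)])
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return weights @ seq                        # values = the sequence itself

seq = np.array([[0.1, 0.9], [0.4, 0.4], [0.8, 0.1]])
print(kernel_self_attention(seq))  # each output row mixes the input rows
```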
arXiv Detail & Related papers (2023-08-25T15:08:19Z)
- Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- A Quantum Kernel Learning Approach to Acoustic Modeling for Spoken Command Recognition [69.97260364850001]
We propose a quantum kernel learning (QKL) framework to address the inherent data sparsity issues.
We project acoustic features based on classical-to-quantum feature encoding.
arXiv Detail & Related papers (2022-11-02T16:46:23Z)
- Synergy Between Quantum Circuits and Tensor Networks: Short-cutting the Race to Practical Quantum Advantage [43.3054117987806]
We introduce a scalable procedure for harnessing classical computing resources to provide pre-optimized initializations for quantum circuits.
We show this method significantly improves the trainability and performance of PQCs on a variety of problems.
By demonstrating a means of boosting limited quantum resources using classical computers, our approach illustrates the promise of this synergy between quantum and quantum-inspired models in quantum computing.
arXiv Detail & Related papers (2022-08-29T15:24:03Z)
- Error mitigation and quantum-assisted simulation in the error corrected regime [77.34726150561087]
A standard approach to quantum computing is based on the idea of promoting a classically simulable and fault-tolerant set of operations to universality by the addition of noisy magic resources.
We show how the addition of noisy magic resources allows one to boost classical quasiprobability simulations of a quantum circuit.
arXiv Detail & Related papers (2021-03-12T20:58:41Z)
- Quantum Long Short-Term Memory [3.675884635364471]
Long short-term memory (LSTM) is a recurrent neural network (RNN) architecture for modeling sequences and temporal dependencies.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
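A compact sketch of the hybrid QLSTM idea: each LSTM gate's affine transformation is replaced by a small variational quantum circuit. The circuit template, qubit count, and classical read-in/read-out are illustrative assumptions rather than the paper's exact cell.

```python
# Sketch of a QLSTM cell: VQCs replace the gates' affine maps. The template
# and sizes are assumptions, not the paper's exact architecture.
import pennylane as qml
from pennylane import numpy as np

n_inputs, n_hidden = 2, 2
n_qubits = n_inputs + n_hidden      # embed [h_{t-1}, x_t], one value per qubit
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def vqc(v, weights):
    qml.AngleEmbedding(v, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    # read out one expectation value per hidden unit
    return [qml.expval(qml.PauliZ(w)) for w in range(n_hidden)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def qlstm_cell(x_t, h_prev, c_prev, params):
    v = np.concatenate([h_prev, x_t])             # classical read-in
    f = sigmoid(np.array(vqc(v, params["f"])))    # forget gate from a VQC
    i = sigmoid(np.array(vqc(v, params["i"])))    # input gate
    g = np.tanh(np.array(vqc(v, params["g"])))    # candidate cell state
    o = sigmoid(np.array(vqc(v, params["o"])))    # output gate
    c_t = f * c_prev + i * g
    h_t = o * np.tanh(c_t)
    return h_t, c_t

shape = qml.BasicEntanglerLayers.shape(n_layers=2, n_wires=n_qubits)
params = {k: np.random.uniform(0, np.pi, size=shape) for k in "figo"}
h_t, c_t = qlstm_cell(np.array([0.3, 0.8]),
                      np.zeros(n_hidden), np.zeros(n_hidden), params)
print(h_t, c_t)
```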
arXiv Detail & Related papers (2020-09-03T16:41:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.