Quantum-Train Long Short-Term Memory: Application on Flood Prediction Problem
- URL: http://arxiv.org/abs/2407.08617v1
- Date: Thu, 11 Jul 2024 15:56:00 GMT
- Title: Quantum-Train Long Short-Term Memory: Application on Flood Prediction Problem
- Authors: Chu-Hsuan Abraham Lin, Chen-Yu Liu, Kuan-Cheng Chen
- Abstract summary: This study applies the Quantum-Train (QT) technique to a forecasting Long Short-Term Memory (LSTM) model trained by Quantum Machine Learning (QML).
The QT technique, originally successful in the A Matter of Taste challenge at QHack 2024, leverages QML to reduce the number of trainable parameters to a polylogarithmic function of the number of parameters in a classical neural network (NN).
Our approach directly processes classical data without the need for quantum embedding and operates independently of quantum computing resources post-training.
- Score: 0.8192907805418583
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Flood prediction is a critical challenge in the context of climate change, with significant implications for ecosystem preservation, human safety, and infrastructure protection. In this study, we tackle this problem by applying the Quantum-Train (QT) technique to a forecasting Long Short-Term Memory (LSTM) model trained by Quantum Machine Learning (QML) with significant parameter reduction. The QT technique, originally successful in the A Matter of Taste challenge at QHack 2024, leverages QML to reduce the number of trainable parameters to a polylogarithmic function of the number of parameters in a classical neural network (NN). This innovative framework maps classical NN weights to a Hilbert space, altering quantum state probability distributions to adjust NN parameters. Our approach directly processes classical data without the need for quantum embedding and operates independently of quantum computing resources post-training, making it highly practical and accessible for real-world flood prediction applications. This model aims to improve the efficiency of flood forecasts, ultimately contributing to better disaster preparedness and response.
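A minimal sketch of this weight-generation idea, assuming PennyLane and NumPy: an n-qubit ansatz is measured to obtain 2^n basis-state probabilities, and a small mapping model turns each probability into one classical weight, so only the circuit angles and mapping parameters (polylogarithmic in the weight count) are trained. The ansatz, the affine mapping stand-in, and all names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import pennylane as qml

n_qubits = 4                      # generates m = 2**n_qubits classical weights
m_weights = 2 ** n_qubits
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn_probs(theta):
    """Hardware-efficient ansatz; returns all 2**n basis-state probabilities."""
    for layer in range(theta.shape[0]):
        for w in range(n_qubits):
            qml.RY(theta[layer, w], wires=w)
        for w in range(n_qubits - 1):
            qml.CNOT(wires=[w, w + 1])
    return qml.probs(wires=range(n_qubits))

def map_to_weights(probs, beta):
    """Toy affine mapping model from (probability, basis index) to a real
    weight; QT trains a learned mapping model in its place."""
    idx = np.arange(m_weights) / m_weights      # normalized basis-state index
    return beta[0] * probs + beta[1] * idx + beta[2]

theta = 0.1 * np.random.randn(2, n_qubits)      # trainable circuit angles
beta = np.array([4.0, -1.0, 0.0])               # trainable mapping parameters
weights = map_to_weights(qnn_probs(theta), beta)
print(weights.shape)                            # (16,) -> fills a small LSTM layer
```

Training would then differentiate the downstream LSTM loss through this map back to the circuit angles and mapping parameters, so the classical network never stores 2^n independently trained weights.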
Related papers
- Quantum-Train with Tensor Network Mapping Model and Distributed Circuit Ansatz [0.8192907805418583]
Quantum-Train (QT) is a hybrid quantum-classical machine learning framework.
It maps quantum state measurements to classical neural network weights.
The traditional QT framework employs a multi-layer perceptron (MLP) for this task, but it struggles with scalability and interpretability; a toy MLP mapping model is sketched after this entry.
We introduce a distributed circuit ansatz designed for large-scale quantum machine learning with multiple small quantum processing unit nodes.
arXiv Detail & Related papers (2024-09-11T03:51:34Z)
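As a complement to the affine stand-in sketched earlier, the MLP mapping model this entry attributes to the traditional QT framework might look as follows: each basis-state bitstring and its measured probability are fed to a tiny MLP that emits one classical weight. Layer sizes, the feature construction, and all names are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

n_qubits = 4
m_weights = 2 ** n_qubits

def mlp_mapping(features, W1, b1, W2, b2):
    """Tiny two-layer MLP: (bitstring, probability) -> scalar weight."""
    h = np.tanh(features @ W1 + b1)
    return (h @ W2 + b2).ravel()

rng = np.random.default_rng(3)
probs = rng.dirichlet(np.ones(m_weights))       # stand-in measured probabilities
bits = ((np.arange(m_weights)[:, None] >> np.arange(n_qubits)) & 1).astype(float)
features = np.concatenate([bits, probs[:, None]], axis=1)   # shape (16, 5)

W1 = rng.standard_normal((n_qubits + 1, 8)); b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)
classical_weights = mlp_mapping(features, W1, b1, W2, b2)   # 16 NN weights
print(classical_weights.shape)
```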
- Learning to Program Variational Quantum Circuits with Fast Weights [3.6881738506505988]
This paper introduces the Quantum Fast Weight Programmers (QFWP) as a solution to the temporal or sequential learning challenge.
The proposed QFWP model learns temporal dependencies without requiring quantum recurrent neural networks; a toy classical analogue is sketched after this entry.
Numerical simulations conducted in this study showcase the efficacy of the proposed QFWP model in both time-series prediction and reinforcement learning (RL) tasks.
arXiv Detail & Related papers (2024-02-27T18:53:18Z)
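The fast-weight idea behind QFWP can be illustrated classically: a "slow" programmer network rewrites the weights of a "fast" network at every time step, so temporal context accumulates in the fast weights rather than in a recurrent hidden state. The fixed random linear slow map below stands in for the paper's variational quantum circuit; everything here is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 3, 1

# Slow programmer: maps the current input to an outer-product weight update.
W_slow = 0.1 * rng.standard_normal((d_out + d_in, d_in))

def step(x, W_fast):
    """One sequence step: update the fast weights, then predict with them."""
    key_val = W_slow @ x                       # slow net proposes an update
    val, key = key_val[:d_out], key_val[d_out:]
    W_fast = W_fast + np.outer(val, key)       # additive fast-weight update
    y = W_fast @ x                             # fast net makes the prediction
    return y, W_fast

W_fast = np.zeros((d_out, d_in))
for t in range(5):                             # toy sequence of 5 steps
    x_t = rng.standard_normal(d_in)
    y_t, W_fast = step(x_t, W_fast)
    print(t, y_t[0])
```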
- Training Classical Neural Networks by Quantum Machine Learning [9.002305736350833]
This work proposes a training scheme for classical neural networks (NNs) that utilizes the exponentially large Hilbert space of a quantum system.
Unlike existing quantum machine learning (QML) methods, the results obtained from quantum computers using our approach can be directly used on classical computers.
arXiv Detail & Related papers (2024-02-26T10:16:21Z)
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
We propose a novel QFL framework, the first to integrate quantum long short-term memory (QLSTM) models with temporal data; a toy federated-averaging round is sketched after this entry.
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
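A toy federated-averaging round sketches the collaboration pattern FedQLSTM builds on: each client refines a shared parameter vector on its own local temporal data, and the server averages the results. The local-update stub and all names below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def local_update(global_params, client_data, lr=0.01):
    """Stand-in for a client's local (Q)LSTM training; one noisy
    gradient-like step keeps the example self-contained."""
    fake_grad = np.tanh(global_params - client_data.mean())
    return global_params - lr * fake_grad

def fedavg_round(global_params, clients):
    """One FedAvg round: broadcast, local updates, uniform averaging."""
    updates = [local_update(global_params, data) for data in clients]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(1)
params = rng.standard_normal(8)                          # flattened model parameters
clients = [rng.standard_normal(100) for _ in range(4)]   # per-client time series
for _ in range(3):                                       # a few communication rounds
    params = fedavg_round(params, clients)
print(params.round(3))
```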
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Transition Role of Entangled Data in Quantum Machine Learning [51.6526011493678]
Entanglement serves as the resource to empower quantum computing.
Recent progress has highlighted its positive impact on learning quantum dynamics.
We establish a quantum no-free-lunch (NFL) theorem for learning quantum dynamics using entangled data.
arXiv Detail & Related papers (2023-06-06T08:06:43Z)
- Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of exploiting quantum advantage to speed up imitation learning (IL).
We develop two QIL algorithms: quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experiment results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z)
- Synergy Between Quantum Circuits and Tensor Networks: Short-cutting the Race to Practical Quantum Advantage [43.3054117987806]
We introduce a scalable procedure for harnessing classical computing resources to provide pre-optimized initializations for quantum circuits.
We show this method significantly improves the trainability and performance of parametrized quantum circuits (PQCs) on a variety of problems.
By demonstrating a means of boosting limited quantum resources using classical computers, our approach illustrates the promise of this synergy between quantum and quantum-inspired models in quantum computing.
arXiv Detail & Related papers (2022-08-29T15:24:03Z)
- The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
- Quantum Optimization for Training Quantum Neural Networks [16.780058676633914]
We devise a framework for leveraging quantum optimisation algorithms to find optimal parameters of QNNs for certain tasks.
We coherently encode the cost function of QNNs onto relative phases of a superposition state in the Hilbert space of the network parameters.
arXiv Detail & Related papers (2021-03-31T13:06:30Z)
- FLIP: A flexible initializer for arbitrarily-sized parametrized quantum circuits [105.54048699217668]
We propose FLIP, a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs, and instead of relying on a generic set of initial parameters, it is tailored to learn the structure of successful parameters; a toy stand-in is sketched after this entry.
We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground state energies of 1D Fermi-Hubbard models.
arXiv Detail & Related papers (2021-03-15T17:38:33Z)
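To make the initializer idea concrete, here is a hedged stand-in for what FLIP learns: gather parameters that solved related PQC instances, fit a simple distribution over them, and sample tailored initializations for new circuits. FLIP's actual learned model is more expressive; the diagonal Gaussian and all names below are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Pretend we already optimized 20 related PQC instances; rows are final angles.
successful_params = 0.3 * rng.standard_normal((20, 12)) + np.pi / 4

mu = successful_params.mean(axis=0)             # learned central tendency
sigma = successful_params.std(axis=0)           # learned per-angle spread

def flip_like_init(n_circuits):
    """Sample tailored initializations for new, related circuits."""
    return mu + sigma * rng.standard_normal((n_circuits, successful_params.shape[1]))

print(flip_like_init(3).shape)                  # (3, 12) initial angle sets
```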
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.