Training Hybrid Deep Quantum Neural Network for Reinforced Learning Efficiently
- URL: http://arxiv.org/abs/2503.09119v2
- Date: Thu, 13 Mar 2025 14:32:13 GMT
- Title: Training Hybrid Deep Quantum Neural Network for Reinforced Learning Efficiently
- Authors: Jie Luo, Xueyin Chen
- Abstract summary: We present a scalable quantum machine learning architecture that overcomes challenges with efficient backpropagation. Our method highlights that hDQNNs could exhibit potentially improved generalizability compared to purely classical models.
- Score: 2.7812018782449073
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum computing offers a new paradigm for computation, exploiting an exponentially growing Hilbert space for data representation and operation. Computing results are obtained from measurement-based stochastic sampling over nontrivial distributions produced by the quantum computing process, with complex correlations arising from quantum entanglement. Quantum machine learning (QML) has emerged as a promising method, potentially surpassing classical counterparts in efficiency and accuracy. While large-scale fault-tolerant quantum (LSFTQ) machines are not yet available, recent works on hybrid quantum machine learning models, compatible with noisy intermediate-scale quantum (NISQ) computers, have hinted at improved generalizability. Such hybrid deep quantum neural networks (hDQNNs) integrate GPU/CPU-based deep neural networks (DNNs) with forward parameterized quantum circuits (PQCs) that can be straightforwardly executed on quantum processors. However, backpropagating through forward PQCs remains computationally inefficient when simulated on classical hardware, and current quantum hardware constraints impede batched backpropagation even with dedicated PQCs for calculating gradients of forward PQCs, limiting scalability for modern machine learning tasks. Here, we present a scalable quantum machine learning architecture that overcomes these challenges with efficient backpropagation, enabling scalable hDQNN training with PQCs on physical quantum computers or classical PQC simulators. Applied to reinforcement learning benchmarks, our method highlights that hDQNNs could exhibit potentially improved generalizability compared to purely classical models. These findings offer a pathway toward near-term hybrid quantum-classical computing systems for large-scale machine learning and underscore the potential of hybrid quantum-classical architectures in advancing artificial intelligence.
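A minimal sketch of the kind of hybrid model described above, assuming a PennyLane + PyTorch stack: a classical network maps observations to rotation angles for a small forward PQC, whose expectation values feed a classical head, so gradients flow end to end through both the DNN and the PQC. Layer sizes, qubit count, and the RL-style readout are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)  # swap in a hardware backend if available

@qml.qnode(dev, interface="torch")  # on hardware, gradients would use the parameter-shift rule
def pqc(inputs, weights):
    # Encode classical features as rotation angles, then apply entangling layers.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # two entangling layers

model = torch.nn.Sequential(
    torch.nn.Linear(8, n_qubits),            # classical front end: observation -> angles
    torch.nn.Tanh(),
    qml.qnn.TorchLayer(pqc, weight_shapes),   # forward PQC as a differentiable layer
    torch.nn.Linear(n_qubits, 2),             # classical head, e.g. action logits
)

obs = torch.randn(16, 8)                      # a toy batch of RL observations
logits = model(obs)                           # classical + quantum forward pass
logits.sum().backward()                       # gradients reach both DNN and PQC weights
```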
Related papers
- Q-Fusion: Diffusing Quantum Circuits [2.348041867134616]
We propose a diffusion-based algorithm leveraging the LayerDAG framework to generate new quantum circuits.
Our results demonstrate that the proposed model consistently generates 100% valid quantum circuit outputs.
arXiv Detail & Related papers (2025-04-29T14:10:10Z) - Quantum parallel information exchange (QPIE) hybrid network with transfer learning [18.43273756128771]
Quantum machine learning (QML) has emerged as an innovative framework with the potential to uncover complex patterns.
We introduce the quantum parallel information exchange (QPIE) hybrid network, a new non-sequential hybrid classical-quantum model architecture.
We develop a dynamic gradient selection method that applies the parameter-shift rule on quantum processing units.
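The parameter-shift rule mentioned above is the standard route to PQC gradients on quantum hardware: each gradient component comes from two additional shifted circuit evaluations rather than from backpropagating through the quantum state. A toy check, with the hardware expectation value replaced by its analytic form for a single RX rotation:

```python
import numpy as np

def expval_z(theta):
    """<Z> after RX(theta) on |0>; stands in for an expectation value measured on hardware."""
    return np.cos(theta)

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    # Exact for gates generated by a Pauli operator: two evaluations per parameter.
    return (f(theta + shift) - f(theta - shift)) / 2.0

theta = 0.37
print(parameter_shift_grad(expval_z, theta))  # ~ -sin(0.37)
print(-np.sin(theta))                         # analytic derivative for comparison
```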
arXiv Detail & Related papers (2025-04-05T17:25:26Z) - Quantum-Train-Based Distributed Multi-Agent Reinforcement Learning [5.673361333697935]
We introduce Quantum-Train-Based Distributed Multi-Agent Reinforcement Learning (Dist-QTRL).
arXiv Detail & Related papers (2024-12-12T00:51:41Z) - Leveraging Pre-Trained Neural Networks to Enhance Machine Learning with Variational Quantum Circuits [48.33631905972908]
We introduce an innovative approach that utilizes pre-trained neural networks to enhance Variational Quantum Circuits (VQC).
This technique effectively separates approximation error from qubit count and removes the need for restrictive conditions.
Our results extend to applications such as human genome analysis, demonstrating the broad applicability of our approach.
arXiv Detail & Related papers (2024-11-13T12:03:39Z) - Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective [7.7063925534143705]
We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with machine learning algorithms.
QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model.
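A heavily simplified sketch of that idea, under the assumption that a handful of quantum parameters define 2**n measurement probabilities which a classical mapping model turns into the weights of a classical layer; the product-state stand-in for the quantum neural network and the linear mapper below are illustrative, not the paper's construction.

```python
import torch

n_qubits = 4                                   # 2**4 = 16 basis-state probabilities

def pqc_probabilities(theta):
    # Stand-in for an n-qubit QNN: independent RY(theta_i) rotations on |0>
    # give a real product state, so n angles already parameterize 2**n probabilities.
    state = torch.ones(1)
    for t in theta:
        state = torch.kron(state, torch.stack([torch.cos(t / 2), torch.sin(t / 2)]))
    return state ** 2

theta = torch.randn(n_qubits, requires_grad=True)         # quantum parameters
mapper = torch.nn.Linear(2 ** n_qubits, 16)               # classical mapping model

weights = mapper(pqc_probabilities(theta)).reshape(4, 4)  # generated classical-layer weights
x = torch.randn(8, 4)
loss = (x @ weights).pow(2).mean()                        # use the generated layer in a task loss
loss.backward()                                           # gradients reach theta and the mapper
```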
arXiv Detail & Related papers (2024-05-18T14:35:57Z) - Multi-GPU-Enabled Hybrid Quantum-Classical Workflow in Quantum-HPC Middleware: Applications in Quantum Simulations [1.9922905420195367]
This study introduces an innovative distribution-aware Quantum-Classical-Quantum architecture.
It integrates cutting-edge quantum software frameworks with high-performance classical computing resources.
It addresses challenges in quantum simulation for materials and condensed matter physics.
arXiv Detail & Related papers (2024-03-09T07:38:45Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Splitting and Parallelizing of Quantum Convolutional Neural Networks for Learning Translationally Symmetric Data [0.0]
We propose a novel architecture called split-parallelizing QCNN (sp-QCNN).
By splitting the quantum circuit based on translational symmetry, the sp-QCNN can substantially parallelize the conventional QCNN without increasing the number of qubits.
We show that the sp-QCNN can achieve comparable classification accuracy to the conventional QCNN while considerably reducing the measurement resources required.
arXiv Detail & Related papers (2023-06-12T18:00:08Z) - Quafu-RL: The Cloud Quantum Computers based Quantum Reinforcement Learning [0.0]
In this work, we take the first step towards executing benchmark quantum reinforcement learning problems on real devices equipped with at most 136 qubits on the BAQIS Quafu quantum computing cloud.
The experimental results demonstrate that the Reinforcement Learning (RL) agents are capable of achieving slightly relaxed goals during both the training and inference stages.
arXiv Detail & Related papers (2023-05-29T09:13:50Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of utilizing quantum advantage to speed up IL.
We develop two QIL algorithms, quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [48.491303218786044]
TeD-Q is an open-source software framework for quantum machine learning. It seamlessly integrates classical machine learning libraries with quantum simulators. It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - Synergy Between Quantum Circuits and Tensor Networks: Short-cutting the Race to Practical Quantum Advantage [43.3054117987806]
We introduce a scalable procedure for harnessing classical computing resources to provide pre-optimized initializations for quantum circuits.
We show this method significantly improves the trainability and performance of PQCs on a variety of problems.
By demonstrating a means of boosting limited quantum resources using classical computers, our approach illustrates the promise of this synergy between quantum and quantum-inspired models in quantum computing.
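One hedged way to read "pre-optimized initializations" in code: a cheap classical surrogate of the circuit cost is optimized first and its solution warm-starts the quantum optimization, replacing a random initialization. Both cost functions below are toy stand-ins, not the paper's tensor-network construction.

```python
import numpy as np
from scipy.optimize import minimize

def classical_surrogate_cost(params):
    # e.g. a tensor-network or simulator estimate of the circuit cost (toy quadratic here)
    return float(np.sum((params - 0.5) ** 2))

def quantum_cost(params):
    # placeholder for the expectation value measured on the quantum device
    return float(np.sum((params - 0.6) ** 2))

random_init = np.random.default_rng(1).normal(size=8)
warm_start = minimize(classical_surrogate_cost, random_init).x   # classical pre-optimization
result = minimize(quantum_cost, warm_start, method="COBYLA")     # quantum fine-tuning
print(result.x)
```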
arXiv Detail & Related papers (2022-08-29T15:24:03Z) - Recent Advances for Quantum Neural Networks in Generative Learning [98.88205308106778]
Quantum generative learning models (QGLMs) may surpass their classical counterparts.
We review the current progress of QGLMs from the perspective of machine learning.
We discuss the potential applications of QGLMs in both conventional machine learning tasks and quantum physics.
arXiv Detail & Related papers (2022-06-07T07:32:57Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
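A minimal sketch of that decentralized sharing, assuming each client trains a local copy of the same parameterized circuit and only parameter vectors are exchanged and averaged (federated averaging applied to circuit parameters); the local update below is a generic placeholder, not the paper's quantum-data training procedure.

```python
import numpy as np

n_clients, n_params, rounds = 4, 12, 3
rng = np.random.default_rng(0)
global_params = rng.normal(size=n_params)            # shared PQC parameters

def local_update(params):
    # Placeholder for one client's local PQC training on its own quantum data.
    return params - 0.1 * rng.normal(size=params.shape)

for _ in range(rounds):
    local = [local_update(global_params.copy()) for _ in range(n_clients)]
    global_params = np.mean(local, axis=0)           # average circuit parameters across clients

print(global_params)
```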
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - Tensor Network Quantum Virtual Machine for Simulating Quantum Circuits at Exascale [57.84751206630535]
We present a modernized version of the Tensor Network Quantum Virtual Machine (TNQVM) which serves as a quantum circuit simulation backend in the eXtreme-scale ACCelerator (XACC) framework.
The new version is based on the general-purpose, scalable tensor network processing library ExaTN, and provides multiple quantum circuit simulators.
By combining the portable XACC quantum programming frontend with the scalable ExaTN backend, we introduce an end-to-end virtual development environment that can scale from laptops to future exascale platforms.
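A toy illustration of the tensor-contraction principle behind such simulators (not the TNQVM or ExaTN API): the state is kept as a rank-n tensor and each gate is applied by contracting a small gate tensor into it.

```python
import numpy as np

n = 3
state = np.zeros((2,) * n, dtype=complex)
state[(0,) * n] = 1.0                              # |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard
CNOT = np.eye(4).reshape(2, 2, 2, 2)               # indices: (c_out, t_out, c_in, t_in)
CNOT[1, :, 1, :] = np.array([[0, 1], [1, 0]])      # control = 1 flips the target

def apply_1q(state, gate, q):
    state = np.tensordot(gate, state, axes=([1], [q]))        # contract gate into qubit q
    return np.moveaxis(state, 0, q)

def apply_2q(state, gate, q0, q1):
    state = np.tensordot(gate, state, axes=([2, 3], [q0, q1]))
    return np.moveaxis(state, [0, 1], [q0, q1])

state = apply_1q(state, H, 0)                      # Hadamard on qubit 0
state = apply_2q(state, CNOT, 0, 1)                # CNOT(0 -> 1)
print(np.abs(state.reshape(-1)) ** 2)              # weight 0.5 each on |000> and |110>
```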
arXiv Detail & Related papers (2021-04-21T13:26:42Z) - On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.