Quantum parallel information exchange (QPIE) hybrid network with transfer learning
- URL: http://arxiv.org/abs/2504.04235v1
- Date: Sat, 05 Apr 2025 17:25:26 GMT
- Title: Quantum parallel information exchange (QPIE) hybrid network with transfer learning
- Authors: Ziqing Guo, Alex Khan, Victor S. Sheng, Shabnam Jabeen, Ziwen Pan,
- Abstract summary: Quantum machine learning (QML) has emerged as an innovative framework with the potential to uncover complex patterns. We introduce the quantum parallel information exchange (QPIE) hybrid network, a new non-sequential hybrid classical-quantum model architecture. We develop a dynamic gradient selection method that applies the parameter-shift rule on quantum processing units.
- Score: 18.43273756128771
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum machine learning (QML) has emerged as an innovative framework with the potential to uncover complex patterns by leveraging quantum systems' ability to simulate and exploit high-dimensional latent spaces, particularly in learning tasks. Quantum neural network (QNN) frameworks are inherently sensitive to the precision of gradient calculations and to the computational limitations of current quantum hardware: unitary rotations introduce overhead from complex-number computations, and quantum gate operation speed remains a bottleneck for practical implementations. In this study, we introduce the quantum parallel information exchange (QPIE) hybrid network, a new non-sequential hybrid classical-quantum model architecture that leverages quantum transfer learning by feeding pre-trained parameters from classical neural networks into quantum circuits. This enables efficient pattern recognition and time-series prediction by utilizing non-Clifford parameterized quantum gates, thereby enhancing both learning efficiency and representational capacity. Additionally, we develop a dynamic gradient selection method that applies the parameter-shift rule on quantum processing units (QPUs) and adjoint differentiation on GPUs. Our results demonstrate higher accuracy on ad-hoc benchmarks, approximately 88% faster convergence on time-series data with added stochasticity within 100 steps, and a less biased eigenvalue spectrum of the Fisher information matrix on CPU/GPU and IonQ QPU simulators.
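To make the transfer-learning idea above concrete, here is a minimal, hedged sketch (not the authors' code) of feeding pre-trained classical weights into a parameterized quantum circuit built from non-Clifford rotation gates, written with PennyLane. The random "pre-trained" weights, the tanh weight-to-angle rescaling, and the chain-of-CNOTs ansatz are all illustrative assumptions.

```python
# Hedged sketch (not the paper's code): feed "pre-trained" classical weights
# into a parameterized quantum circuit as initial rotation angles.
import numpy as np
import pennylane as qml

n_qubits = 4
rng = np.random.default_rng(0)

# Stand-in for weights taken from a pre-trained classical network.
W_classical = rng.normal(size=(n_qubits, n_qubits))

# Assumed mapping: squash one weight column into rotation angles in (-pi, pi).
angles = np.pi * np.tanh(W_classical[:, 0])

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qpie_like_circuit(inputs, thetas):
    # Encode classical features, then apply non-Clifford parameterized rotations.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    for w in range(n_qubits):
        qml.RY(thetas[w], wires=w)        # arbitrary-angle RY/RZ are non-Clifford
        qml.RZ(thetas[w] / 2.0, wires=w)
    for w in range(n_qubits - 1):
        qml.CNOT(wires=[w, w + 1])
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

x = rng.normal(size=n_qubits)
print(qpie_like_circuit(x, angles))   # expectation values seeded by classical weights
```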
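The dynamic gradient selection mentioned in the abstract (parameter-shift rule on QPUs, adjoint differentiation on CPU/GPU simulators) can be approximated by choosing the differentiation method per backend. The sketch below uses PennyLane's `diff_method` argument; the backend-detection heuristic and device names are assumptions, and the paper's actual selection logic may differ.

```python
# Hedged sketch: pick a differentiation method based on the execution backend.
# Parameter-shift works on real QPUs; adjoint differentiation requires a
# state-vector simulator (e.g. the "default.*" CPU/GPU simulator devices).
import pennylane as qml
from pennylane import numpy as pnp

def make_qnode(device_name, wires=2):
    dev = qml.device(device_name, wires=wires)
    # Assumed heuristic: simulators get adjoint, everything else parameter-shift.
    diff = "adjoint" if device_name.startswith("default.") else "parameter-shift"

    @qml.qnode(dev, diff_method=diff)
    def circuit(theta):
        qml.RY(theta[0], wires=0)
        qml.RZ(theta[1], wires=0)
        qml.CNOT(wires=[0, 1])
        return qml.expval(qml.PauliZ(1))

    return circuit

theta = pnp.array([0.3, -0.7], requires_grad=True)
circuit = make_qnode("default.qubit")   # simulator -> adjoint differentiation
print(qml.grad(circuit)(theta))         # gradient with respect to both angles
```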
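The Fisher-information diagnostic at the end of the abstract can be reproduced in spirit by building the empirical Fisher information matrix from per-sample parameter gradients and inspecting its eigenvalue spectrum. The sketch below uses random inputs and a generic ansatz purely for illustration; the paper's estimator, model, and data are not reproduced here.

```python
# Hedged sketch: empirical Fisher information matrix of a small variational
# circuit, estimated from outer products of per-sample parameter gradients.
import numpy as np
import pennylane as qml
from pennylane import numpy as pnp

n_qubits, n_layers = 3, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, diff_method="parameter-shift")
def model(x, weights):
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
weights = pnp.array(np.random.default_rng(1).normal(size=shape), requires_grad=True)

grad_fn = qml.grad(model, argnum=1)     # gradients w.r.t. the circuit weights only
samples = np.random.default_rng(2).normal(size=(32, n_qubits))

# Empirical FIM: average of flattened per-sample gradient outer products.
grads = np.stack([np.asarray(grad_fn(pnp.array(x), weights)).ravel() for x in samples])
fim = grads.T @ grads / len(samples)

print(np.linalg.eigvalsh(fim))   # eigenvalue spectrum; a flatter spectrum is less biased
```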
Related papers
- Training Hybrid Deep Quantum Neural Network for Reinforced Learning Efficiently [2.7812018782449073]
We present a scalable quantum machine learning architecture that overcomes challenges with efficient backpropagation. Our method highlights that hDQNNs could exhibit potentially improved generalizability compared to purely classical models.
arXiv Detail & Related papers (2025-03-12T07:12:02Z) - Quantum autoencoders for image classification [0.0]
Quantum autoencoders (QAEs) leverage classical optimization solely for parameter tuning. QAEs can serve as efficient classification models with fewer parameters and highlight the potential of utilizing quantum circuits for complete end-to-end learning.
arXiv Detail & Related papers (2025-02-21T07:13:38Z) - Quantum-Train-Based Distributed Multi-Agent Reinforcement Learning [5.673361333697935]
We introduce Quantum-Train-Based Distributed Multi-Agent Reinforcement Learning (Dist-QTRL).
arXiv Detail & Related papers (2024-12-12T00:51:41Z) - Quantum Pointwise Convolution: A Flexible and Scalable Approach for Neural Network Enhancement [0.0]
We propose a novel architecture, which incorporates pointwise convolution within a quantum neural network framework. By using quantum circuits, we map data to a higher-dimensional space, capturing more complex feature relationships. In experiments, we applied the quantum pointwise convolution layer to classification tasks on the FashionMNIST and CIFAR10 datasets.
arXiv Detail & Related papers (2024-12-02T08:03:59Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of utilizing quantum advantage to speed up IL.
We develop two QIL algorithms, quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [48.491303218786044]
TeD-Q is an open-source software framework for quantum machine learning. It seamlessly integrates classical machine learning libraries with quantum simulators. It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn the local message passing among nodes with a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparse constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even superior to, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Synergy Between Quantum Circuits and Tensor Networks: Short-cutting the Race to Practical Quantum Advantage [43.3054117987806]
We introduce a scalable procedure for harnessing classical computing resources to provide pre-optimized initializations for quantum circuits.
We show this method significantly improves the trainability and performance of parameterized quantum circuits (PQCs) on a variety of problems.
By demonstrating a means of boosting limited quantum resources using classical computers, our approach illustrates the promise of this synergy between quantum and quantum-inspired models in quantum computing.
arXiv Detail & Related papers (2022-08-29T15:24:03Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - Tensor Network Quantum Virtual Machine for Simulating Quantum Circuits at Exascale [57.84751206630535]
We present a modernized version of the Tensor Network Quantum Virtual Machine (TNQVM), which serves as a quantum circuit simulation backend in the eXtreme-scale ACCelerator (XACC) framework.
The new version is based on the general-purpose, scalable tensor network processing library ExaTN and provides multiple quantum circuit simulators.
By combining the portable XACC quantum programming frontend with the scalable ExaTN backend, we introduce an end-to-end virtual development environment that can scale from laptops to future exascale platforms.
arXiv Detail & Related papers (2021-04-21T13:26:42Z) - Variational learning for quantum artificial neural networks [0.0]
We first review a series of recent works describing the implementation of artificial neurons and feed-forward neural networks on quantum processors.
We then present an original realization of efficient individual quantum nodes based on variational unsampling protocols.
While keeping full compatibility with the overall memory-efficient feed-forward architecture, our constructions effectively reduce the quantum circuit depth required to determine the activation probability of single neurons.
arXiv Detail & Related papers (2021-03-03T16:10:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.