Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective
- URL: http://arxiv.org/abs/2405.11304v2
- Date: Mon, 10 Jun 2024 16:22:38 GMT
- Title: Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective
- Authors: Chen-Yu Liu, En-Jui Kuo, Chu-Hsuan Abraham Lin, Jason Gemsun Young, Yeong-Jar Chang, Min-Hsiu Hsieh, Hsi-Sheng Goan,
- Abstract summary: We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with classical machine learning algorithms.
QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model.
- Score: 7.7063925534143705
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with classical machine learning algorithms to address significant challenges in data encoding, model compression, and inference hardware requirements. Even with a slight decrease in accuracy, QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model, which significantly reduces the parameter count from $M$ to $O(\text{polylog}(M))$ during training. Our experiments demonstrate QT's effectiveness in classification tasks, offering insights into its potential to revolutionize machine learning by leveraging quantum computational advantages. This approach not only improves model efficiency but also reduces generalization errors, showcasing QT's potential across various machine learning applications.
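The parameter-compression idea at the core of QT can be sketched as follows: an $n$-qubit quantum neural network with $n \approx \lceil \log_2 M \rceil$ yields $2^n \geq M$ measurement probabilities, and a small classical mapping model turns the $i$-th basis state and its probability into the $i$-th weight of the target classical network, so only $O(\text{polylog}(M))$ parameters are trained. The Python sketch below is illustrative only: the toy statevector simulator, the RY/CNOT ansatz, and the linear-plus-tanh mapping model are assumptions standing in for the paper's actual QNN and mapping model.

```python
# Minimal sketch of the Quantum-Train (QT) parameter-generation idea.
# The simulator, ansatz, and mapping model below are illustrative stand-ins,
# not the authors' implementation.
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    """Apply a 1-qubit gate to `qubit` of an n-qubit statevector (qubit 0 = MSB)."""
    ops = [np.eye(2)] * n
    ops[qubit] = gate
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def apply_cnot(state, control, target, n):
    """Apply a CNOT by permuting basis-state amplitudes."""
    new = state.copy()
    for idx in range(2 ** n):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control] == 1:
            bits[target] ^= 1
            new_idx = sum(b << (n - 1 - q) for q, b in enumerate(bits))
            new[new_idx] = state[idx]
    return new

def qnn_probabilities(angles, n):
    """Simulate a shallow RY + CNOT-ring ansatz; return 2**n measurement probabilities."""
    state = np.zeros(2 ** n)
    state[0] = 1.0
    for layer in angles:                      # angles has shape (layers, n)
        for q in range(n):
            state = apply_single(state, ry(layer[q]), q, n)
        for q in range(n):
            state = apply_cnot(state, q, (q + 1) % n, n)
    return np.abs(state) ** 2

def generate_weights(angles, mapping_w, M, n):
    """Map the i-th basis state's bits and probability to the i-th classical weight."""
    probs = qnn_probabilities(angles, n)
    weights = []
    for i in range(M):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        features = np.array(bits + [probs[i]])          # (n + 1)-dim input
        weights.append(np.tanh(features @ mapping_w))   # tiny linear mapping model
    return np.array(weights)

M = 100                                  # weight count of the target classical network
n = int(np.ceil(np.log2(M)))             # qubits needed: O(log M)
rng = np.random.default_rng(0)
angles = rng.uniform(0, np.pi, size=(2, n))   # trainable QNN angles: O(layers * n)
mapping_w = rng.normal(size=n + 1)            # trainable mapping-model parameters

w = generate_weights(angles, mapping_w, M, n)
print(f"qubits={n}, trainable params={angles.size + mapping_w.size}, generated weights={w.size}")
```

In this toy setup the trainable degrees of freedom are 14 rotation angles plus 8 mapping weights, versus the 100 weights they generate; the paper's point is that this gap widens to $M$ versus $O(\text{polylog}(M))$ as the target model grows.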
Related papers
- Quantum-Train with Tensor Network Mapping Model and Distributed Circuit Ansatz [0.8192907805418583]
Quantum-Train (QT) is a hybrid quantum-classical machine learning framework.
It maps quantum state measurements to classical neural network weights.
The traditional QT framework employs a multi-layer perceptron (MLP) for this task, but it struggles with scalability and interpretability.
We introduce a distributed circuit ansatz designed for large-scale quantum machine learning with multiple small quantum processing unit nodes.
arXiv Detail & Related papers (2024-09-11T03:51:34Z) - Federated Quantum-Train with Batched Parameter Generation [3.697453416360906]
We introduce the Federated Quantum-Train (QT) framework, which integrates the QT model into federated learning.
Our approach significantly reduces qubit usage from 19 to as low as 8 qubits while reducing generalization error.
arXiv Detail & Related papers (2024-09-04T14:39:11Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - QTRL: Toward Practical Quantum Reinforcement Learning via Quantum-Train [18.138290778243075]
We apply the Quantum-Train method to reinforcement learning tasks, termed QTRL, to train the classical policy network model.
The training result of QTRL is a classical model, meaning the inference stage requires only a classical computer.
arXiv Detail & Related papers (2024-07-08T16:41:03Z) - Quantum Mixed-State Self-Attention Network [3.1280831148667105]
This paper introduces a novel Quantum Mixed-State Self-Attention Network (QMSAN), which integrates the principles of quantum computing with classical machine learning algorithms.
The QMSAN model employs a quantum attention mechanism based on mixed states, enabling efficient direct estimation of similarity between queries and keys within the quantum domain.
Our study investigates the model's robustness in different quantum noise environments, showing that QMSAN possesses commendable robustness to low noise.
arXiv Detail & Related papers (2024-03-05T11:29:05Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)