When BERT Meets Quantum Temporal Convolution Learning for Text
Classification in Heterogeneous Computing
- URL: http://arxiv.org/abs/2203.03550v1
- Date: Thu, 17 Feb 2022 09:55:21 GMT
- Title: When BERT Meets Quantum Temporal Convolution Learning for Text
Classification in Heterogeneous Computing
- Authors: Chao-Han Huck Yang, Jun Qi, Samuel Yen-Chi Chen, Yu Tsao, Pin-Yu Chen
- Abstract summary: This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that the proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets.
- Score: 75.75419308975746
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The rapid development of quantum computing has demonstrated unique
characteristics of quantum advantage, such as richer feature representations
and stronger protection of model parameters. This work proposes a vertical
federated learning architecture based on variational quantum circuits to
demonstrate the competitive performance of a quantum-enhanced pre-trained BERT
model for text classification. In particular, our proposed hybrid
classical-quantum model consists of a novel random quantum temporal convolution
(QTC) learning framework that replaces some layers in the BERT-based decoder.
Our experiments on intent classification show that the proposed BERT-QTC model
attains competitive results on the Snips and ATIS spoken language datasets.
In particular, BERT-QTC improves on the existing quantum circuit-based language
model by 1.57% and 1.52% relative on the two text classification datasets.
Furthermore, BERT-QTC can be feasibly deployed on both existing commercially
accessible quantum computation hardware and CPU-based interfaces, ensuring
data isolation.
Related papers
- Federated Quantum-Train with Batched Parameter Generation [3.697453416360906]
We introduce the Federated Quantum-Train (QT) framework, which integrates the QT model into federated learning.
Our approach reduces qubit usage from 19 to as low as 8 while also lowering generalization error.
arXiv Detail & Related papers (2024-09-04T14:39:11Z)
- A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z)
- Disentangling Quantum and Classical Contributions in Hybrid Quantum Machine Learning Architectures [4.646930308096446]
Hybrid transfer learning solutions have been developed, merging pre-trained classical models with quantum circuits.
It remains unclear how much each component -- classical and quantum -- contributes to the model's results.
We propose a novel hybrid architecture: instead of utilizing a pre-trained network for compression, we employ an autoencoder to derive a compressed version of the input data.
arXiv Detail & Related papers (2023-11-09T18:13:50Z)
- A Novel Spatial-Temporal Variational Quantum Circuit to Enable Deep Learning on NISQ Devices [12.873184000122542]
This paper proposes a novel spatial-temporal design, namely ST-VQC, to integrate non-linearity in quantum learning.
ST-VQC can achieve over 30% accuracy improvement compared with existing VQCs on actual quantum computers.
arXiv Detail & Related papers (2023-07-19T06:17:16Z)
- Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) with a hope to utilize quantum advantage to speed up IL.
We develop two QIL algorithms, quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL)
Experiment results demonstrate that both Q-BC and Q-GAIL can achieve comparable performance compared to classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Adapting Pre-trained Language Models for Quantum Natural Language Processing [33.86835690434712]
On quantum simulation experiments, the pre-trained representation brings 50% to 60% increases to the capacity of end-to-end quantum models.
arXiv Detail & Related papers (2023-02-24T14:59:02Z)
- TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z)
- Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
- Hybrid quantum-classical classifier based on tensor network and variational quantum circuit [0.0]
We introduce a hybrid model combining the quantum-inspired tensor networks (TN) and the variational quantum circuits (VQC) to perform supervised learning tasks.
We show that a matrix product state based TN with low bond dimensions performs better than PCA as a feature extractor to compress data for the input of VQCs on binary classification of the MNIST dataset.
arXiv Detail & Related papers (2020-11-30T09:43:59Z)
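To give a flavor of the TN compression step described in the last entry, the sketch below decomposes a flattened input vector into a matrix product state (MPS) with a small bond dimension via sequential truncated SVDs, the generic mechanism behind MPS feature extraction. This is not the paper's trainable model; the input size (256), site count (8), and bond dimension chi are illustrative assumptions.

```python
import numpy as np

def to_mps(vec, n_sites, chi):
    """Decompose a length-2**n_sites vector into an MPS with bond dim <= chi
    via a left-to-right sweep of truncated SVDs."""
    tensors, bond = [], 1
    rest = vec.reshape(bond * 2, -1)
    for _ in range(n_sites - 1):
        u, s, vt = np.linalg.svd(rest, full_matrices=False)
        k = min(chi, len(s))                       # truncate the bond index
        tensors.append(u[:, :k].reshape(bond, 2, k))
        bond = k
        rest = (np.diag(s[:k]) @ vt[:k]).reshape(bond * 2, -1)
    tensors.append(rest.reshape(bond, 2, 1))
    return tensors

def from_mps(tensors):
    """Contract the MPS back into a dense vector (exact if no truncation)."""
    res = tensors[0]
    for t in tensors[1:]:
        res = np.tensordot(res, t, axes=([-1], [0]))
    return res.reshape(-1)

rng = np.random.default_rng(1)
vec = rng.normal(size=256)             # a 256-dim "flattened image"
mps = to_mps(vec, n_sites=8, chi=2)    # lossy, low-bond-dimension features
n_params = sum(t.size for t in mps)
print(n_params)  # → 56, versus 256 raw values
```

With chi=2 the eight MPS tensors hold only 56 numbers, illustrating why a low-bond-dimension TN can serve as a compact feature extractor ahead of a VQC; raising chi recovers the input exactly.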
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.