Quantum Graph Transformer for NLP Sentiment Classification
- URL: http://arxiv.org/abs/2506.07937v1
- Date: Mon, 09 Jun 2025 16:55:41 GMT
- Title: Quantum Graph Transformer for NLP Sentiment Classification
- Authors: Shamminuj Aktar, Andreas Bärtschi, Abdel-Hameed A. Badawy, Stephan Eidenbenz
- Abstract summary: We present the Quantum Graph Transformer (QGT), a hybrid graph-based architecture that integrates a quantum self-attention mechanism. QGT consistently achieves accuracy higher than or comparable to that of existing quantum natural language processing (QNLP) models. These results highlight the potential of graph-based QNLP techniques for advancing efficient and scalable language understanding.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum machine learning is a promising direction for building more efficient and expressive models, particularly in domains where understanding complex, structured data is critical. We present the Quantum Graph Transformer (QGT), a hybrid graph-based architecture that integrates a quantum self-attention mechanism into the message-passing framework for structured language modeling. The attention mechanism is implemented using parameterized quantum circuits (PQCs), which enable the model to capture rich contextual relationships while significantly reducing the number of trainable parameters compared to classical attention mechanisms. We evaluate QGT on five sentiment classification benchmarks. Experimental results show that QGT consistently achieves accuracy higher than or comparable to that of existing quantum natural language processing (QNLP) models, including both attention-based and non-attention-based approaches. When compared with an equivalent classical graph transformer, QGT yields an average accuracy improvement of 5.42% on real-world datasets and 4.76% on synthetic datasets. Additionally, QGT demonstrates improved sample efficiency, requiring nearly 50% fewer labeled samples to reach comparable performance on the Yelp dataset. These results highlight the potential of graph-based QNLP techniques for advancing efficient and scalable language understanding.
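As a rough, hedged illustration of the core idea (this is not the authors' QGT implementation; the qubit count, encoding scheme, circuit depth, and readout below are all assumptions), the sketch computes a scalar attention score for a token pair with a parameterized quantum circuit in PennyLane:

```python
# Minimal PQC attention-score sketch (illustrative assumptions throughout).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4  # assumed embedding dimension: one feature per qubit
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def attention_score(x_i, x_j, weights):
    """Angle-encode a token pair, entangle, and read out a scalar score."""
    for w in range(n_qubits):
        qml.RY(x_i[w], wires=w)   # encode token i
        qml.RZ(x_j[w], wires=w)   # encode token j on the same qubits
    # Trainable entangling layers stand in for the classical QK^T score.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))  # score in [-1, 1]

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.uniform(0, 2 * np.pi, size=shape, requires_grad=True)
x_i = np.random.uniform(0, np.pi, n_qubits)  # toy token embeddings
x_j = np.random.uniform(0, np.pi, n_qubits)
print(attention_score(x_i, x_j, weights))    # only 24 trainable parameters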
Related papers
- An Efficient Quantum Classifier Based on Hamiltonian Representations
Quantum machine learning (QML) is a discipline that seeks to transfer the advantages of quantum computing to data-driven tasks. We propose an efficient approach that circumvents the costs associated with data encoding by mapping inputs to a finite set of Pauli strings. We evaluate our approach on text and image classification tasks, against well-established classical and quantum models.
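A hedged toy version of that idea (not the paper's construction; the Pauli set, circuit, and sizes below are invented for illustration): the input enters only as coefficients of a fixed observable, so no per-feature encoding gates are needed.

```python
import pennylane as qml
from pennylane import numpy as np

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

# A fixed set of Pauli strings; input features become their coefficients.
paulis = [qml.PauliZ(0), qml.PauliZ(1), qml.PauliX(0) @ qml.PauliX(1)]

@qml.qnode(dev)
def hamiltonian_score(x, theta):
    # Trainable state preparation (no data-encoding gates at all).
    qml.RY(theta[0], wires=0)
    qml.RY(theta[1], wires=1)
    qml.CNOT(wires=[0, 1])
    # The input enters only through the observable's coefficients.
    return qml.expval(qml.Hamiltonian(x, paulis))

x = np.array([0.8, -0.3, 0.5])          # 3 input features -> 3 coefficients
theta = np.array([0.1, 0.2], requires_grad=True)
print(hamiltonian_score(x, theta))      # sign gives the predicted class
```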
arXiv Detail & Related papers (2025-04-13T11:49:53Z)
- Learning Efficient Positional Encodings with Graph Neural Networks
We introduce PEARL, a novel framework of learnable PEs for graphs. Our analysis demonstrates that PEARL approximates equivariant functions of eigenvectors with linear complexity, while rigorously establishing its stability and high expressive power.
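A hedged NumPy sketch of the general recipe (not PEARL itself; the elementwise map here is a fixed random stand-in for the network PEARL would learn): build per-node encodings from Laplacian eigenvectors in a way that is invariant to their arbitrary sign.

```python
import numpy as np

def laplacian_pe(adj: np.ndarray, k: int, seed: int = 0) -> np.ndarray:
    """Per-node encodings from the first k nontrivial Laplacian eigenvectors."""
    lap = np.diag(adj.sum(axis=1)) - adj
    _, vecs = np.linalg.eigh(lap)           # columns sorted by eigenvalue
    v = vecs[:, 1:k + 1]                    # drop the constant eigenvector
    rng = np.random.default_rng(seed)
    w, b = rng.normal(size=(k, k)), rng.normal(size=k)
    phi = lambda z: np.tanh(z @ w + b)      # stand-in for a learned network
    return phi(v) + phi(-v)                 # invariant to eigenvector sign flips

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(laplacian_pe(adj, k=2))               # one 2-dim encoding per node
```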
arXiv Detail & Related papers (2025-02-03T07:28:53Z)
- Quantum-Enhanced Attention Mechanism in NLP: A Hybrid Classical-Quantum Approach
Transformer-based models have achieved remarkable results in natural language processing (NLP) tasks such as text classification and machine translation. This research proposes a hybrid quantum-classical transformer model that integrates a quantum-enhanced attention mechanism to address the limitations of these models.
arXiv Detail & Related papers (2025-01-26T18:29:06Z)
- GQWformer: A Quantum-based Transformer for Graph Representation Learning
We propose a novel approach that integrates graph inductive bias into self-attention mechanisms by leveraging quantum technology for structural encoding. We introduce the Graph Quantum Walk Transformer (GQWformer), a GNN framework that utilizes quantum walks on attributed graphs to generate node quantum states. These quantum states encapsulate rich structural attributes and serve as inductive biases for the transformer, thereby enabling the generation of more meaningful attention scores.
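For intuition, a hedged sketch of the underlying primitive (not GQWformer; the walk generator and evolution time are assumptions): a continuous-time quantum walk U(t) = exp(-itA) on the adjacency matrix, whose transition probabilities could serve as structure-aware attention biases.

```python
import numpy as np
from scipy.linalg import expm

def ctqw_states(adj: np.ndarray, t: float) -> np.ndarray:
    """Row i holds the walk amplitudes of a walker started at node i."""
    return expm(-1j * t * adj)       # unitary walk operator U(t)

adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)

amps = ctqw_states(adj, t=1.0)
bias = np.abs(amps) ** 2             # transition probabilities as attention bias
print(bias.round(3))                 # each row sums to 1 (unitarity)
```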
arXiv Detail & Related papers (2024-12-03T09:03:04Z)
- Quantum Rationale-Aware Graph Contrastive Learning for Jet Discrimination
In high-energy physics, particle jet tagging plays a pivotal role in distinguishing quark from gluon jets. Existing contrastive learning frameworks struggle to leverage rationale-aware augmentations effectively. We show that integrating a quantum rationale generator within our proposed Quantum Rationale-aware Graph Contrastive Learning framework significantly enhances jet discrimination performance.
arXiv Detail & Related papers (2024-11-03T17:36:05Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
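A hedged sketch of the context-sampling step (not the authors' code; the graph format and walk length are assumptions): each node's context is the sequence of nodes visited by a uniform random walk started there.

```python
import random

def sample_context(neighbors: dict[int, list[int]], start: int, length: int) -> list[int]:
    """Uniform random walk of `length` steps starting at `start`."""
    walk = [start]
    for _ in range(length):
        nbrs = neighbors[walk[-1]]
        if not nbrs:                  # dangling node: stop early
            break
        walk.append(random.choice(nbrs))
    return walk

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
random.seed(7)
print(sample_context(graph, start=0, length=4))   # context sequence for node 0
```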
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Data Re-Uploading Strategies for Improved Quantum Classifier Performance
Re-uploading classical information into quantum states multiple times can enhance the accuracy of quantum classifiers.
We demonstrate our approach on two classification patterns: a linear classification pattern (LCP) and a non-linear classification pattern (NLCP).
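The mechanism is easy to see in a single-qubit sketch (illustrative, not the paper's exact circuits): the same input is encoded repeatedly, with a trainable rotation between successive uploads.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def reupload_classifier(x, thetas):
    for theta in thetas:              # one re-upload per trainable layer
        qml.RY(x, wires=0)            # re-encode the same input
        qml.Rot(*theta, wires=0)      # trainable processing layer
    return qml.expval(qml.PauliZ(0))

n_layers = 3
thetas = np.random.uniform(0, 2 * np.pi, size=(n_layers, 3), requires_grad=True)
print(reupload_classifier(0.4, thetas))   # sign gives the predicted class
```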
arXiv Detail & Related papers (2024-05-15T14:28:00Z)
- Graph Neural Networks for Parameterized Quantum Circuits Expressibility Estimation
This paper introduces a novel approach for expressibility estimation of quantum circuits using Graph Neural Networks (GNNs).
We demonstrate the predictive power of our GNN model with a dataset consisting of 25,000 samples from the noiseless IBM QASM Simulator and 12,000 samples from three distinct noisy quantum backends.
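A hedged minimal analogue (not the paper's model; the node features, wiring, and sizes are invented): one message-passing layer over a gate graph, mean-pooled to a single expressibility estimate.

```python
import torch

class TinyCircuitGNN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden: int):
        super().__init__()
        self.msg = torch.nn.Linear(in_dim, hidden)
        self.out = torch.nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = torch.relu(adj @ self.msg(x))    # aggregate neighbor messages
        return self.out(h.mean(dim=0))       # mean-pool nodes -> scalar

x = torch.randn(5, 8)                        # 5 gate nodes, 8 features each
adj = (torch.rand(5, 5) > 0.5).float()       # toy wiring between gates
print(TinyCircuitGNN(8, 16)(x, adj))         # predicted expressibility score
```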
arXiv Detail & Related papers (2024-05-13T18:26:55Z)
- Predicting Expressibility of Parameterized Quantum Circuits using Graph Neural Network
We propose a novel method based on Graph Neural Networks (GNNs) for predicting the expressibility of Parameterized Quantum Circuits (PQCs).
By leveraging the graph-based representation of PQCs, our GNN-based model captures intricate relationships between circuit parameters and their resulting expressibility.
Experimental evaluation on a dataset of four thousand random PQCs and on IBM Qiskit's hardware-efficient ansatz sets demonstrates the superior performance of our approach.
arXiv Detail & Related papers (2023-09-13T14:08:01Z)
- Weight Re-Mapping for Variational Quantum Algorithms
We introduce the concept of weight re-mapping for variational quantum circuits (VQCs).
We employ seven distinct weight re-mapping functions to assess their impact on eight classification datasets.
Our results indicate that weight re-mapping can enhance the convergence speed of the VQC.
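A hedged example of one such map (the paper compares seven re-mapping functions; this tanh-based squashing is just one plausible instance, not necessarily theirs): unbounded optimizer weights are compressed into an interval of length 2π before being used as rotation angles.

```python
import numpy as np

def remap_tanh(w: np.ndarray) -> np.ndarray:
    """Map R -> (-pi, pi), an interval of length 2*pi."""
    return np.pi * np.tanh(w)

w = np.array([-5.0, -0.5, 0.0, 0.5, 5.0])   # raw optimizer weights
print(remap_tanh(w))                         # angles fed to the circuit
```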
arXiv Detail & Related papers (2023-06-09T09:42:21Z)
- Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- Improving Convergence for Quantum Variational Classifiers using Weight Re-Mapping
In recent years, quantum machine learning has seen a substantial increase in the use of variational quantum circuits (VQCs).
We introduce weight re-mapping for VQCs, to unambiguously map the weights to an interval of length $2\pi$.
We demonstrate that weight re-mapping increased test accuracy for the Wine dataset by $10\%$ over using unmodified weights.
arXiv Detail & Related papers (2022-12-22T13:23:19Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken-language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)