Quantum NLP models on Natural Language Inference
- URL: http://arxiv.org/abs/2510.15972v1
- Date: Sun, 12 Oct 2025 17:27:26 GMT
- Title: Quantum NLP models on Natural Language Inference
- Authors: Ling Sun, Peter Sullivan, Michael Martin, Yun Zhou
- Abstract summary: Quantum natural language processing (QNLP) offers a novel approach to semantic modeling. This paper investigates the application of QNLP models to the task of Natural Language Inference (NLI). We construct parameterized quantum circuits for sentence pairs and train them for both semantic relatedness and inference classification.
- Score: 3.119991939664782
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum natural language processing (QNLP) offers a novel approach to semantic modeling by embedding compositional structure directly into quantum circuits. This paper investigates the application of QNLP models to the task of Natural Language Inference (NLI), comparing quantum, hybrid, and classical transformer-based models under a constrained few-shot setting. Using the lambeq library and the DisCoCat framework, we construct parameterized quantum circuits for sentence pairs and train them for both semantic relatedness and inference classification. To assess efficiency, we introduce a novel information-theoretic metric, Information Gain per Parameter (IGPP), which quantifies learning dynamics independently of model size. Our results demonstrate that quantum models achieve performance comparable to classical baselines while operating with dramatically fewer parameters. The quantum-based models outperform randomly initialized transformers in inference and achieve lower test error on relatedness tasks. Moreover, quantum models exhibit significantly higher per-parameter learning efficiency (up to five orders of magnitude more than classical counterparts), highlighting the promise of QNLP in low-resource, structure-sensitive settings. To address circuit-level isolation and promote parameter sharing, we also propose a novel cluster-based architecture that improves generalization by tying gate parameters to learned word clusters rather than individual tokens.
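The abstract names the IGPP metric but does not give its exact definition. A minimal illustrative sketch of one plausible formulation, assuming "information gain" is the reduction in cross-entropy loss (converted from nats to bits) over training, normalised by the number of trainable parameters:

```python
import math

def igpp(loss_before: float, loss_after: float, n_params: int) -> float:
    """Illustrative Information Gain per Parameter (IGPP).

    Assumption (not specified in the abstract): information gain is the
    drop in cross-entropy loss over training, expressed in bits, divided
    by the model's trainable parameter count.
    """
    gain_bits = (loss_before - loss_after) / math.log(2)  # nats -> bits
    return gain_bits / n_params

# Hypothetical numbers only: a small quantum circuit (~30 parameters)
# vs. a transformer baseline (~10^6 parameters).
quantum_eff = igpp(loss_before=0.69, loss_after=0.45, n_params=30)
classical_eff = igpp(loss_before=0.69, loss_after=0.30, n_params=1_000_000)
print(quantum_eff / classical_eff)  # per-parameter efficiency ratio
```

Under this toy formulation, the tiny circuit's per-parameter efficiency exceeds the transformer's by roughly four orders of magnitude, consistent in spirit with the "up to five orders of magnitude" claim above.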
Related papers
- Quantum LEGO Learning: A Modular Design Principle for Hybrid Artificial Intelligence [63.39968536637762]
We introduce Quantum LEGO Learning, a learning framework that treats classical and quantum components as reusable, composable learning blocks. Within this framework, a pre-trained classical neural network serves as a frozen feature block, while a VQC acts as a trainable adaptive module. We develop a block-wise generalization theory that decomposes learning error into approximation and estimation components.
arXiv Detail & Related papers (2026-01-29T14:29:21Z) - Practical Hybrid Quantum Language Models with Observable Readout on Real Hardware [0.764671395172401]
We present Quantum Recurrent Neural Networks (QRNNs) and Quantum Convolutional Neural Networks (QCNNs) as hybrid quantum language models. Our architecture combines hardware-optimized parametric quantum circuits with a lightweight classical projection layer. Experiments on IBM Quantum processors reveal the critical trade-offs between circuit depth and trainability.
arXiv Detail & Related papers (2025-12-14T14:22:44Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture where a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Efficient Generation of Parameterised Quantum Circuits from Large Texts [0.3298092151372303]
DisCoCirc is capable of directly encoding entire documents as parameterised quantum circuits (PQCs). This paper introduces an efficient methodology for converting large-scale texts into quantum circuits using tree-like representations of pregroup diagrams.
arXiv Detail & Related papers (2025-05-19T14:57:53Z) - Quantum Natural Language Processing: A Comprehensive Review of Models, Methods, and Applications [0.05592394503914488]
We propose to categorise QNLP models based on quantum computing principles, architecture, and computational approaches. This paper attempts to provide a survey of how quantum meets language by mapping the state of the art in this area.
arXiv Detail & Related papers (2025-04-14T06:09:26Z) - Comparative study of the ansätze in quantum language models [12.109572897953413]
Quantum natural language processing (QNLP) methods and frameworks exist for text classification and generation. We evaluate the performance of quantum natural language processing models based on these ansätze at different levels in text classification tasks.
arXiv Detail & Related papers (2025-02-28T05:49:38Z) - Quantum Neural Networks in Practice: A Comparative Study with Classical Models from Standard Data Sets to Industrial Images [0.5892638927736115]
We compare the performance of randomized classical and quantum neural networks (NNs) as well as classical and quantum-classical hybrid convolutional neural networks (CNNs) for the task of binary image classification. We evaluate these approaches on three data sets of increasing complexity. Cross-dataset performance analysis revealed limited transferability of quantum models between different classification tasks.
arXiv Detail & Related papers (2024-11-28T17:13:45Z) - Multi-Scale Feature Fusion Quantum Depthwise Convolutional Neural Networks for Text Classification [3.0079490585515343]
We propose a novel quantum neural network (QNN) model based on quantum convolution.
We develop the quantum depthwise convolution that significantly reduces the number of parameters and lowers computational complexity.
We also introduce the multi-scale feature fusion mechanism to enhance model performance by integrating word-level and sentence-level features.
arXiv Detail & Related papers (2024-05-22T10:19:34Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build over a proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive experimental results on the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Quantum Machine Learning with SQUID [64.53556573827525]
We present the Scaled QUantum IDentifier (SQUID), an open-source framework for exploring hybrid Quantum-Classical algorithms for classification problems.
We provide examples of using SQUID in a standard binary classification problem from the popular MNIST dataset.
arXiv Detail & Related papers (2021-04-30T21:34:11Z) - FLIP: A flexible initializer for arbitrarily-sized parametrized quantum circuits [105.54048699217668]
We propose a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs, and instead of relying on a generic set of initial parameters, it is tailored to learn the structure of successful parameters.
We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground state energies of 1D Fermi-Hubbard models.
arXiv Detail & Related papers (2021-03-15T17:38:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.