A multiclass Q-NLP sentiment analysis experiment using DisCoCat
- URL: http://arxiv.org/abs/2209.03152v1
- Date: Wed, 7 Sep 2022 13:47:35 GMT
- Title: A multiclass Q-NLP sentiment analysis experiment using DisCoCat
- Authors: Victor Martinez, Guilhaume Leroy-Meline
- Abstract summary: We will tackle sentiment analysis in the Noisy Intermediate-Scale Quantum (NISQ) era, using the DisCoCat model of language.
We will first present the basics of quantum computing and the DisCoCat model.
This will enable us to define a general framework to perform NLP tasks on a quantum computer.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sentiment analysis is a branch of Natural Language Processing (NLP)
whose goal is to assign sentiments or emotions to particular sentences or words.
Performing this task is particularly useful for companies wishing to take
customer feedback into account through chatbots or customer verbatims. This has been done
extensively in the literature using various approaches, ranging from simple
models to deep transformer neural networks. In this paper, we will tackle
sentiment analysis in the Noisy Intermediate-Scale Quantum (NISQ) era, using
the DisCoCat model of language. We will first present the basics of quantum
computing and the DisCoCat model. This will enable us to define a general
framework to perform NLP tasks on a quantum computer. We will then extend the
two-class classification that was performed by Lorenz et al. (2021) to a
four-class sentiment analysis experiment on a much larger dataset, showing the
scalability of such a framework.
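The general framework the abstract describes, mapping a sentence to a parameterized quantum circuit and reading a class label from measurement statistics, can be illustrated with a minimal sketch. The circuit shape (two qubits, Ry rotations and a CNOT, with the four basis-state probabilities serving as four class scores) and all function names are illustrative assumptions, not the paper's actual DisCoCat/lambeq pipeline:

```python
import math

def ry(theta):
    # 2x2 Ry rotation matrix (real-valued)
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def apply_single(state, gate, qubit):
    # Apply a one-qubit gate to a 2-qubit statevector.
    # Qubit 0 is the most significant bit of the basis index.
    pos = 1 - qubit
    new = [0.0] * 4
    for i in range(4):
        bit = (i >> pos) & 1
        for b in (0, 1):
            j = (i & ~(1 << pos)) | (b << pos)
            new[j] += gate[b][bit] * state[i]
    return new

def cnot(state):
    # CNOT with qubit 0 as control, qubit 1 as target:
    # swap the amplitudes of |10> and |11>.
    s = list(state)
    s[2], s[3] = s[3], s[2]
    return s

def class_probs(thetas):
    # Hypothetical sentence circuit: one Ry per qubit, entangle, one Ry per qubit.
    # In a DisCoCat pipeline the thetas would come from the trained word circuits.
    state = [1.0, 0.0, 0.0, 0.0]          # |00>
    state = apply_single(state, ry(thetas[0]), 0)
    state = apply_single(state, ry(thetas[1]), 1)
    state = cnot(state)
    state = apply_single(state, ry(thetas[2]), 0)
    state = apply_single(state, ry(thetas[3]), 1)
    return [a * a for a in state]         # Born rule: four class probabilities

def predict(thetas):
    probs = class_probs(thetas)
    return max(range(4), key=lambda k: probs[k])
```

Reading a four-class label from the joint measurement of two qubits is what lets the two-class scheme of Lorenz et al. (2021) scale to four classes without extra qubits.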
Related papers
- SentiQNF: A Novel Approach to Sentiment Analysis Using Quantum Algorithms and Neuro-Fuzzy Systems [2.9565647484584496]
We propose a novel hybrid approach to sentiment analysis called the Quantum Fuzzy Neural Network (QFNN).
QFNN leverages quantum properties and incorporates a fuzzy layer to overcome the limitations of classical sentiment analysis algorithms.
The proposed approach expedites sentiment data processing and precisely analyses different forms of textual data.
arXiv Detail & Related papers (2024-12-17T09:54:17Z)
- Scalable and interpretable quantum natural language processing: an implementation on trapped ions [1.0037949839020768]
We present the first implementation of text-level quantum natural language processing.
We focus on the QDisCoCirc model, which is underpinned by a compositional approach to rendering AI interpretable.
We demonstrate an experiment on Quantinuum's H1-1 trapped-ion quantum processor.
arXiv Detail & Related papers (2024-09-13T12:36:14Z)
- Natural Language Processing for Dialects of a Language: A Survey [56.93337350526933]
State-of-the-art natural language processing (NLP) models are trained on massive corpora and report superlative performance on evaluation datasets.
This survey delves into an important attribute of these datasets: the dialect of a language.
Motivated by the performance degradation of NLP models on dialectal datasets and its implications for the equity of language technologies, we survey past research in NLP for dialects in terms of datasets and approaches.
arXiv Detail & Related papers (2024-01-11T03:04:38Z)
- Unifying (Quantum) Statistical and Parametrized (Quantum) Algorithms [65.268245109828]
We take inspiration from Kearns' SQ oracle and Valiant's weak evaluation oracle.
We introduce an extensive yet intuitive framework that yields unconditional lower bounds for learning from evaluation queries.
arXiv Detail & Related papers (2023-10-26T18:23:21Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
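The data re-uploading technique this entry mentions, encoding the same input repeatedly between trainable rotations on a single qubit, can be sketched without any SDK. The layer structure (Ry rotations with a trainable scale and shift per layer) and function names are illustrative assumptions, not the paper's qiskit implementation:

```python
import math

def ry_apply(state, theta):
    # Apply Ry(theta) to a single-qubit statevector [a, b].
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    a, b = state
    return [c * a - s * b, s * a + c * b]

def reupload_predict(x, weights, biases):
    # Data re-uploading: each layer re-encodes the input x,
    # modulated by a trainable scale w and shift b.
    state = [1.0, 0.0]                    # |0>
    for w, b in zip(weights, biases):
        state = ry_apply(state, w * x + b)
    return state[1] ** 2                  # probability of measuring |1>
```

Because the input re-enters the circuit at every layer, even a single qubit can represent nonlinear decision boundaries that a single encoding rotation cannot.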
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Design and Implementation of a Quantum Kernel for Natural Language Processing [0.8702432681310401]
This thesis leverages the DisCoCat model to design a quantum-based kernel function that can be used by a support vector machine (SVM) for NLP tasks.
Two similarity measures were studied: (i) the transition amplitude approach and (ii) the SWAP test.
The explicit model from previous work was used to train word embeddings and achieved a testing accuracy of $93.09 \pm 0.01\%$.
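The SWAP test mentioned in this entry estimates state overlap from the probability of measuring the ancilla in $|0\rangle$, which is $(1 + |\langle\psi|\varphi\rangle|^2)/2$. A minimal sketch of that statistic and of a fidelity-style kernel entry recovered from it; function names are illustrative, not the thesis's API:

```python
def swap_test_p0(psi, phi):
    # Probability of measuring the SWAP-test ancilla as 0:
    # P(0) = (1 + |<psi|phi>|^2) / 2
    overlap = sum(a.conjugate() * b for a, b in zip(psi, phi))
    return (1 + abs(overlap) ** 2) / 2

def quantum_kernel(psi, phi):
    # Fidelity |<psi|phi>|^2, recovered from the SWAP-test statistic;
    # this is the entry an SVM kernel matrix would use.
    return 2 * swap_test_p0(psi, phi) - 1
```

Identical states give P(0) = 1, while orthogonal states give P(0) = 1/2, so repeated shots of the circuit suffice to estimate the kernel value.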
arXiv Detail & Related papers (2022-05-13T00:45:46Z)
- Sentiment analysis in tweets: an assessment study from classical to modern text representation models [59.107260266206445]
Short texts published on Twitter have earned significant attention as a rich source of information.
Their inherent characteristics, such as their informal and noisy linguistic style, remain challenging for many natural language processing (NLP) tasks.
This study provides an assessment of existing language models in distinguishing the sentiment expressed in tweets, using a rich collection of 22 datasets.
arXiv Detail & Related papers (2021-05-29T21:05:28Z)
- QNLP in Practice: Running Compositional Models of Meaning on a Quantum Computer [0.7194733565949804]
We present results on the first NLP experiments conducted on Noisy Intermediate-Scale Quantum (NISQ) computers.
We create representations for sentences that have a natural mapping to quantum circuits.
We successfully train NLP models that solve simple sentence classification tasks on quantum hardware.
arXiv Detail & Related papers (2021-02-25T13:37:33Z)
- Sentiment Analysis for Sinhala Language using Deep Learning Techniques [1.0499611180329804]
This paper presents a comprehensive study on the use of standard sequence models such as RNN, LSTM, Bi-LSTM, and capsule networks.
A dataset of 15059 Sinhala news comments, annotated with four sentiment classes, and a corpus consisting of 9.48 million tokens are publicly released.
arXiv Detail & Related papers (2020-11-14T12:02:30Z)
- NLP-CIC at SemEval-2020 Task 9: Analysing sentiment in code-switching language using a simple deep-learning classifier [63.137661897716555]
Code-switching is a phenomenon in which two or more languages are used in the same message.
We use a standard convolutional neural network model to predict the sentiment of tweets in a blend of Spanish and English languages.
arXiv Detail & Related papers (2020-09-07T19:57:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.