Toward Quantum Machine Translation of Syntactically Distinct Languages
- URL: http://arxiv.org/abs/2307.16576v1
- Date: Mon, 31 Jul 2023 11:24:54 GMT
- Title: Toward Quantum Machine Translation of Syntactically Distinct Languages
- Authors: Mina Abbaszade, Mariam Zomorodi, Vahid Salari, Philip Kurian
- Abstract summary: We explore the feasibility of language translation using quantum natural language processing algorithms on noisy intermediate-scale quantum (NISQ) devices.
We employ Shannon entropy to demonstrate the significant role that appropriately chosen rotation-gate angles play in the performance of parametrized quantum circuits.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The present study aims to explore the feasibility of language translation
using quantum natural language processing algorithms on noisy
intermediate-scale quantum (NISQ) devices. Classical methods in natural
language processing (NLP) struggle with handling large-scale computations
required for complex language tasks, but quantum NLP on NISQ devices holds
promise in harnessing quantum parallelism and entanglement to efficiently
process and analyze vast amounts of linguistic data, potentially
revolutionizing NLP applications. Our research endeavors to pave the way for
quantum neural machine translation, which could potentially offer advantages
over classical methods in the future. We employ Shannon entropy to demonstrate
the significant role that appropriately chosen rotation-gate angles play in the
performance of parametrized quantum circuits. In particular, we utilize these
angles (parameters) as a means of communication between quantum circuits of
different languages. To achieve our objective, we adopt the encoder-decoder
model of classical neural networks and implement the translation task using
long short-term memory (LSTM). Our experiments involved 160 samples comprising
English sentences and their Persian translations. We trained the models with
different optimizers, first using stochastic gradient descent (SGD) alone and
then combining SGD with two additional optimizers. Notably, we achieved the
best results (a mean absolute error of 0.03, a mean squared error of 0.002,
and a loss of 0.016) with the best model, consisting of two LSTM layers and
trained with the Adam optimizer. Our small dataset,
though consisting of simple synonymous sentences with word-to-word mappings,
points to the utility of Shannon entropy as a figure of merit in more complex
machine translation models for intricate sentence structures.
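The abstract's use of Shannon entropy as a figure of merit over rotation-gate angles can be illustrated with a minimal sketch. This is not the paper's implementation: the binning scheme, bin count, and example angles below are illustrative assumptions, showing only how an entropy value could be assigned to a set of circuit parameters.

```python
import math

def shannon_entropy(angles, n_bins=8):
    """Estimate the Shannon entropy (in bits) of a set of rotation-gate
    angles by histogramming them over [0, 2*pi).

    NOTE: the histogram-based estimator and n_bins=8 are illustrative
    assumptions, not the paper's actual figure-of-merit definition.
    """
    counts = [0] * n_bins
    width = 2 * math.pi / n_bins
    for theta in angles:
        counts[int((theta % (2 * math.pi)) / width)] += 1
    total = len(angles)
    # Sum -p * log2(p) over occupied bins only (0 * log 0 is taken as 0).
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Angles spread evenly across all bins maximize the estimated entropy.
width = 2 * math.pi / 8
centres = [(k + 0.5) * width for k in range(8)]
print(shannon_entropy(centres))      # 3.0 bits: each of the 8 bins holds one angle
print(shannon_entropy([0.1] * 10))   # 0.0 bits: all angles in a single bin
```

Under this toy definition, a parameter set concentrated at one angle carries no entropy, while a well-spread set carries the maximum log2(n_bins) bits, giving a single scalar by which candidate circuits could be compared.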
Related papers
- Training-efficient density quantum machine learning [2.918930150557355]
Quantum machine learning requires powerful, flexible and efficiently trainable models.
We present density quantum neural networks, a learning model incorporating randomisation over a set of trainable unitaries.
arXiv Detail & Related papers (2024-05-30T16:40:28Z)
- Unifying (Quantum) Statistical and Parametrized (Quantum) Algorithms [65.268245109828]
We take inspiration from Kearns' SQ oracle and Valiant's weak evaluation oracle.
We introduce an extensive yet intuitive framework that yields unconditional lower bounds for learning from evaluation queries.
arXiv Detail & Related papers (2023-10-26T18:23:21Z)
- Optimizing Quantum Federated Learning Based on Federated Quantum Natural Gradient Descent [17.05322956052278]
We propose an efficient optimization algorithm, namely federated quantum natural gradient descent (FQNGD).
Compared with gradient descent methods like Adam and Adagrad, the FQNGD algorithm requires far fewer training iterations for the QFL to converge.
Our experiments on a handwritten digit classification dataset justify the effectiveness of the FQNGD for the QFL framework.
arXiv Detail & Related papers (2023-02-27T11:34:16Z)
- Decomposition of Matrix Product States into Shallow Quantum Circuits [62.5210028594015]
Tensor network (TN) algorithms can be mapped to parametrized quantum circuits (PQCs).
We propose a new protocol for approximating TN states using realistic quantum circuits.
Our results reveal one particular protocol, involving sequential growth and optimization of the quantum circuit, to outperform all other methods.
arXiv Detail & Related papers (2022-09-01T17:08:41Z)
- Near-Term Advances in Quantum Natural Language Processing [0.03298597939573778]
This paper describes experiments showing that some tasks in natural language processing can already be performed using quantum computers.
The first uses an explicit word-based approach, in which word-topic scoring weights are implemented as fractional rotations of individual qubits.
A new phrase is classified based on the accumulation of these weights in a scoring qubit using entangling controlled-NOT gates.
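The word-based scoring mechanism described above can be sketched classically. This is a greatly simplified, purely illustrative model: the entangling CNOT accumulation is replaced by direct angle addition on a single scoring qubit, and the word-topic weights are invented for the example.

```python
import math

# Hypothetical word-topic weights (radians); these values are invented
# for illustration and do not come from the cited paper.
word_topic_angle = {
    "quantum": 0.9,
    "translation": 0.7,
    "recipe": 0.1,
}

def phrase_score(phrase):
    """Classical stand-in for the scoring qubit: each known word adds its
    fractional rotation angle, and the phrase score is the probability of
    measuring the qubit in |1> after applying RY(total) to |0>,
    i.e. sin^2(total / 2). Entanglement is deliberately ignored here."""
    total = sum(word_topic_angle.get(w, 0.0) for w in phrase.split())
    return math.sin(total / 2) ** 2

print(phrase_score("quantum translation"))  # higher score: on-topic words
print(phrase_score("recipe"))               # lower score: weakly weighted word
```

The sketch conveys only the core idea that topic-aligned words accumulate rotation toward the |1> outcome; the actual circuit accumulates weights into the scoring qubit via entangling gates rather than simple angle addition.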
arXiv Detail & Related papers (2022-06-05T13:10:46Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive experimental results in the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
- Mixed Precision Low-bit Quantization of Neural Network Language Models for Speech Recognition [67.95996816744251]
State-of-the-art language models (LMs), represented by long short-term memory recurrent neural networks (LSTM-RNNs) and Transformers, are becoming increasingly complex and expensive for practical applications.
Current quantization methods are based on uniform precision and fail to account for the varying performance sensitivity at different parts of LMs to quantization errors.
Novel mixed precision neural network LM quantization methods are proposed in this paper.
arXiv Detail & Related papers (2021-11-29T12:24:02Z)
- SML: a new Semantic Embedding Alignment Transformer for efficient cross-lingual Natural Language Inference [71.57324258813674]
The ability of Transformers to perform a variety of tasks with precision, such as question answering, Natural Language Inference (NLI), or summarisation, has enabled them to rank among the best paradigms for addressing these tasks at present.
NLI is one of the best scenarios for testing these architectures, owing to the knowledge required to understand complex sentences and to establish a relation between a hypothesis and a premise.
In this paper, we propose a new architecture, siamese multilingual transformer, to efficiently align multilingual embeddings for Natural Language Inference.
arXiv Detail & Related papers (2021-03-17T13:23:53Z)
- QNLP in Practice: Running Compositional Models of Meaning on a Quantum Computer [0.7194733565949804]
We present results on the first NLP experiments conducted on Noisy Intermediate-Scale Quantum (NISQ) computers.
We create representations for sentences that have a natural mapping to quantum circuits.
We successfully train NLP models that solve simple sentence classification tasks on quantum hardware.
arXiv Detail & Related papers (2021-02-25T13:37:33Z)
- Grammar-Aware Question-Answering on Quantum Computers [0.17205106391379021]
We perform the first implementation of an NLP task on noisy intermediate-scale quantum (NISQ) hardware.
We encode word meanings in quantum states and explicitly account for grammatical structure.
Our novel QNLP model shows concrete promise for scalability as the quality of the quantum hardware improves.
arXiv Detail & Related papers (2020-12-07T14:49:34Z)
- Improving Massively Multilingual Neural Machine Translation and Zero-Shot Translation [81.7786241489002]
Massively multilingual models for neural machine translation (NMT) are theoretically attractive, but often underperform bilingual models and deliver poor zero-shot translations.
We argue that multilingual NMT requires stronger modeling capacity to support language pairs with varying typological characteristics.
We propose random online backtranslation to enforce the translation of unseen training language pairs.
arXiv Detail & Related papers (2020-04-24T17:21:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.