Meaning updating of density matrices
- URL: http://arxiv.org/abs/2001.00862v1
- Date: Fri, 3 Jan 2020 15:28:52 GMT
- Title: Meaning updating of density matrices
- Authors: Bob Coecke and Konstantinos Meichanetzidis
- Abstract summary: The DisCoCat model of natural language meaning assigns meaning to a sentence given: (i) the meanings of its words, and (ii) its grammatical structure.
The recently introduced DisCoCirc model extends this to text consisting of multiple sentences.
While in DisCoCat all meanings are fixed, in DisCoCirc each sentence updates meanings of words.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The DisCoCat model of natural language meaning assigns meaning to a sentence
given: (i) the meanings of its words, and (ii) its grammatical structure. The
recently introduced DisCoCirc model extends this to text consisting of multiple
sentences. While in DisCoCat all meanings are fixed, in DisCoCirc each sentence
updates meanings of words. In this paper we explore different update mechanisms
for DisCoCirc, in the case where meaning is encoded in density matrices---which
come with several advantages as compared to vectors.
Our starting point is two non-commutative update mechanisms, one of which is
borrowed from quantum foundations research, from Leifer and Spekkens.
Unfortunately, neither of these satisfies any desirable algebraic properties,
nor is either internal to the meaning category. By passing to double density
matrices we do obtain an elegant internal diagrammatic update mechanism.
We also show that (commutative) spiders can be cast as an instance of the
Leifer-Spekkens update mechanism. This result is of interest to quantum
foundations, as it bridges the work in Categorical Quantum Mechanics (CQM) with
that on conditional quantum states. Our work also underpins implementation of
text-level natural language processing on quantum hardware (a.k.a. QNLP), for
which exponential space-gain and quadratic speed-up have previously been
identified.
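To make the notion of a non-commutative density-matrix update concrete, here is a minimal NumPy sketch. It implements a Lüders-style update product, rho * sigma = sqrt(sigma) rho sqrt(sigma) (renormalised), which is in the spirit of the Leifer-Spekkens-style mechanisms discussed above; the paper's exact conventions may differ, so treat this as an illustrative toy rather than the paper's definition.

```python
import numpy as np

def density(v):
    """Pure-state density matrix |v><v| from a vector (normalised first)."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

def sqrtm_psd(rho):
    """Matrix square root of a positive semidefinite matrix via eigendecomposition."""
    w, U = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)          # guard against tiny negative eigenvalues
    return (U * np.sqrt(w)) @ U.conj().T

def luders_update(rho, sigma):
    """Non-commutative update: sqrt(sigma) @ rho @ sqrt(sigma), renormalised.
    Illustrative sketch of a Leifer-Spekkens-style update product."""
    s = sqrtm_psd(sigma)
    out = s @ rho @ s
    return out / np.trace(out)

# Two non-commuting pure "word meanings" on a 2-dimensional space.
rho = density(np.array([1.0, 0.0]))
sigma = density(np.array([1.0, 1.0]))

updated = luders_update(rho, sigma)
# Updating in the opposite order generally gives a different state,
# which is exactly the non-commutativity the abstract refers to.
print(np.allclose(updated, luders_update(sigma, rho)))
```

Note that the update is trace-preserving after renormalisation but, as the abstract points out, such products lack the desirable algebraic properties (e.g. associativity) one might hope for.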
Related papers
- Extending Quantum Perceptrons: Rydberg Devices, Multi-Class Classification, and Error Tolerance [67.77677387243135]
Quantum Neuromorphic Computing (QNC) merges quantum computation with neural computation to create scalable, noise-resilient algorithms for quantum machine learning (QML).
At the core of QNC is the quantum perceptron (QP), which leverages the analog dynamics of interacting qubits to enable universal quantum computation.
arXiv Detail & Related papers (2024-11-13T23:56:20Z)
- A bound on the quantum value of all compiled nonlocal games [49.32403970784162]
A cryptographic compiler converts any nonlocal game into an interactive protocol with a single computationally bounded prover.
We establish a quantum soundness result for all compiled two-player nonlocal games.
arXiv Detail & Related papers (2024-08-13T08:11:56Z)
- Quantum Algorithms for Compositional Text Processing [1.3654846342364308]
We focus on the recently proposed DisCoCirc framework for natural language, and propose a quantum adaptation, QDisCoCirc.
This is motivated by a compositional approach to rendering AI interpretable.
For the model-native primitive operation of text similarity, we derive quantum algorithms for fault-tolerant quantum computers.
arXiv Detail & Related papers (2024-08-12T11:21:40Z)
- Dictionary Learning Improves Patch-Free Circuit Discovery in Mechanistic Interpretability: A Case Study on Othello-GPT [59.245414547751636]
We propose a circuit discovery framework alternative to activation patching.
Our framework suffers less from out-of-distribution issues and is more efficient in terms of computational complexity.
We dig into a small transformer trained on the synthetic task of Othello and find a number of human-understandable fine-grained circuits inside it.
arXiv Detail & Related papers (2024-02-19T15:04:53Z)
- Variational Quantum Classifiers for Natural-Language Text [0.8722210937404288]
We discuss three potential approaches to variational quantum text classifiers (VQTCs).
The first is a weighted bag-of-sentences approach which treats text as a group of independent sentences with task-specific sentence weighting.
The second is a coreference resolution approach which treats text as a consolidation of its member sentences with coreferences among them resolved.
The third approach is based on the DisCoCirc model, which considers both the ordering of sentences and the interaction of words when composing text meaning from word and sentence meanings.
arXiv Detail & Related papers (2023-03-04T18:00:05Z)
- Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is that based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for the quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z)
- Why we should interpret density matrices as moment matrices: the case of (in)distinguishable particles and the emergence of classical reality [69.62715388742298]
We introduce a formulation of quantum theory (QT) as a general probabilistic theory, expressed via quasi-expectation operators (QEOs).
We show that QT for both distinguishable and indistinguishable particles can be formulated in this way.
We also show that finitely exchangeable probabilities for a classical die are as weird as QT.
arXiv Detail & Related papers (2022-03-08T14:47:39Z)
- A gentle introduction to Quantum Natural Language Processing [0.0]
The main goal of this master's thesis is to introduce Quantum Natural Language Processing (QNLP).
QNLP aims at representing sentences' meaning as vectors encoded into quantum computers.
arXiv Detail & Related papers (2022-02-23T20:17:00Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive experimental results in the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
- Foundations for Near-Term Quantum Natural Language Processing [0.17205106391379021]
We provide conceptual and mathematical foundations for near-term quantum natural language processing (QNLP).
We recall how the quantum model for natural language that we employ canonically combines linguistic meanings with rich linguistic structure.
We provide references for supporting empirical evidence and formal statements concerning mathematical generality.
arXiv Detail & Related papers (2020-12-07T14:49:33Z)
- Quantum Natural Language Processing on Near-Term Quantum Computers [0.0]
We describe a full-stack pipeline for natural language processing on near-term quantum computers, a.k.a. QNLP.
DisCoCat is a language-modelling framework that extends and complements the compositional structure of pregroup grammars.
We present a method for mapping DisCoCat diagrams to quantum circuits.
arXiv Detail & Related papers (2020-05-08T16:42:54Z)
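The compositional idea behind mapping DisCoCat diagrams to circuits can be illustrated classically, before any quantum hardware enters the picture. In the toy sketch below (not the paper's actual pipeline), a transitive sentence is computed as a tensor contraction: the verb is a rank-3 tensor whose noun wires are contracted with the subject and object vectors via the pregroup "cups", leaving a vector in the sentence space. All dimensions and tensors here are hypothetical placeholders.

```python
import numpy as np

d = 2  # toy dimension of the noun space
rng = np.random.default_rng(0)

subj = rng.random(d)           # meaning vector of the subject noun
obj = rng.random(d)            # meaning vector of the object noun
verb = rng.random((d, d, d))   # transitive verb: noun x sentence x noun tensor

# Pregroup reduction n . (n^r s n^l) . n: the cups contract the two noun
# wires, leaving only the sentence wire open.
sentence = np.einsum('i,isj,j->s', subj, verb, obj)
print(sentence.shape)
```

On a quantum computer the vectors and tensors become states and unitaries and the cups become entangling measurements, which is where the space and speed advantages cited above come in.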
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.