Sequence Processing with Quantum Tensor Networks
- URL: http://arxiv.org/abs/2308.07865v1
- Date: Tue, 15 Aug 2023 16:33:39 GMT
- Title: Sequence Processing with Quantum Tensor Networks
- Authors: Carys Harvey, Richie Yeung, Konstantinos Meichanetzidis
- Abstract summary: We introduce complex-valued tensor network models for sequence processing motivated by interpretability and resource compression.
We demonstrate experimental results for the task of binary classification of sequences relevant to natural language and bioinformatics.
We also implement them on Quantinuum's H2-1 trapped-ion quantum processor, demonstrating the possibility of efficient sequence processing on near-term quantum devices.
- Score: 1.534667887016089
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce complex-valued tensor network models for sequence processing
motivated by correspondence to probabilistic graphical models, interpretability
and resource compression. Inductive bias is introduced to our models via
network architecture, and is motivated by the correlation structure inherent in
the data, as well as any relevant compositional structure, resulting in
tree-like connectivity. Our models are specifically constructed using
parameterised quantum circuits, widely used in quantum machine learning,
effectively using Hilbert space as a feature space. Furthermore, they are
efficiently trainable due to their tree-like structure. We demonstrate
experimental results for the task of binary classification of sequences from
real-world datasets relevant to natural language and bioinformatics,
characterised by long-range correlations and often equipped with syntactic
information. Since our models have a valid operational interpretation as
quantum processes, we also implement them on Quantinuum's H2-1 trapped-ion
quantum processor, demonstrating the possibility of efficient sequence
processing on near-term quantum devices. This work constitutes the
first scalable implementation of near-term quantum language processing,
providing the tools for large-scale experimentation on the role of tensor
structure and syntactic priors. Finally, this work lays the groundwork for
generative sequence modelling in a hybrid pipeline where the training may be
conducted efficiently in simulation, while sampling from learned probability
distributions may be done with polynomial speed-up on quantum devices.
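The tree-like contraction the abstract describes can be illustrated classically. Below is a minimal NumPy sketch, not the authors' model: tokens are embedded as normalised complex vectors, contracted pairwise up a balanced binary tree of parameterised tensors, and read out at the root in a Born-rule style. The token encoding, the random tensor initialisation, and the readout are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 2  # local (qubit-like) dimension

def encode(token_id, vocab_size):
    """Map a token to a normalised complex vector (a hypothetical embedding)."""
    v = np.exp(2j * np.pi * token_id * np.arange(DIM) / vocab_size)
    return v / np.linalg.norm(v)

def random_node():
    """A random complex tensor contracting two child vectors into one parent."""
    t = rng.standard_normal((DIM, DIM, DIM)) + 1j * rng.standard_normal((DIM, DIM, DIM))
    return t / np.linalg.norm(t)

def tree_contract(vectors, nodes):
    """Contract a balanced binary tree; cost is linear in sequence length."""
    layer = list(vectors)
    it = iter(nodes)
    while len(layer) > 1:
        nxt = []
        for i in range(0, len(layer) - 1, 2):
            t = next(it)
            v = np.einsum('pab,a,b->p', t, layer[i], layer[i + 1])
            nxt.append(v / np.linalg.norm(v))
        if len(layer) % 2:          # an odd leftover leaf passes through unchanged
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]

seq = [3, 1, 4, 1]                                  # toy token ids
leaves = [encode(t, vocab_size=8) for t in seq]
nodes = [random_node() for _ in range(len(seq))]    # enough tensors for this depth
root = tree_contract(leaves, nodes)
p_class1 = abs(root[0]) ** 2 / np.sum(np.abs(root) ** 2)  # Born-rule-style readout
print(round(p_class1, 3))
```

In a trained model the node tensors would be parameterised quantum circuits rather than random tensors, and the tree shape could follow syntactic structure in the data.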
Related papers
- Scalable and interpretable quantum natural language processing: an implementation on trapped ions [1.0037949839020768]
We present the first implementation of text-level quantum natural language processing.
We focus on the QDisCoCirc model, which is underpinned by a compositional approach to rendering AI interpretable.
We demonstrate an experiment on Quantinuum's H1-1 trapped-ion quantum processor.
arXiv Detail & Related papers (2024-09-13T12:36:14Z) - Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Multimodal deep representation learning for quantum cross-platform verification [60.01590250213637]
Cross-platform verification, a critical undertaking in the realm of early-stage quantum computing, endeavors to characterize the similarity of two imperfect quantum devices executing identical algorithms.
We introduce an innovative multimodal learning approach, recognizing that the formalism of data in this task embodies two distinct modalities.
We devise a multimodal neural network to independently extract knowledge from these modalities, followed by a fusion operation to create a comprehensive data representation.
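The extract-then-fuse pattern this summary describes can be sketched with a toy two-branch network. Everything here is a hypothetical stand-in: the modality names, dimensions, and the concatenation fusion are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def branch(x, w1, w2):
    """One modality-specific feature extractor (a toy two-layer MLP)."""
    h = np.tanh(x @ w1)
    return np.tanh(h @ w2)

# Hypothetical modalities: measurement-outcome statistics and circuit descriptors.
meas_stats = rng.standard_normal((5, 16))   # 5 samples, 16-dim outcome histograms
circ_feats = rng.standard_normal((5, 8))    # 5 samples, 8-dim circuit descriptors

wa1, wa2 = rng.standard_normal((16, 12)), rng.standard_normal((12, 4))
wb1, wb2 = rng.standard_normal((8, 12)), rng.standard_normal((12, 4))

za = branch(meas_stats, wa1, wa2)           # knowledge from modality A
zb = branch(circ_feats, wb1, wb2)           # knowledge from modality B
fused = np.concatenate([za, zb], axis=1)    # fusion by concatenation

# Two devices' pooled representations could then be compared, e.g. by cosine
# similarity, to quantify cross-platform agreement.
pooled = fused.mean(axis=0)
print(pooled.shape)  # (8,)
```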
arXiv Detail & Related papers (2023-11-07T04:35:03Z) - Tensor Networks or Decision Diagrams? Guidelines for Classical Quantum Circuit Simulation [65.93830818469833]
Tensor networks and decision diagrams have independently been developed with differing perspectives, terminologies, and backgrounds in mind.
We consider how these techniques approach classical quantum circuit simulation, and examine their (dis)similarities with regard to their most applicable abstraction level.
We provide guidelines for when to better use tensor networks and when to better use decision diagrams in classical quantum circuit simulation.
arXiv Detail & Related papers (2023-02-13T19:00:00Z) - A performance characterization of quantum generative models [35.974070202997176]
We compare quantum circuits used for quantum generative modeling.
We learn the underlying probability distribution of the data sets via two popular training methods.
We empirically find that a variant of the discrete architecture, which learns the copula of the probability distribution, outperforms all other methods.
arXiv Detail & Related papers (2023-01-23T11:00:29Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Tree tensor network classifiers for machine learning: from quantum-inspired to quantum-assisted [0.0]
We describe a quantum-assisted machine learning (QAML) method in which multivariate data is encoded into quantum states in a Hilbert space whose dimension is exponentially large in the length of the data vector.
We present an approach that can be implemented on gate-based quantum computing devices.
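The exponentially large Hilbert-space encoding mentioned above is easy to illustrate: a product of per-feature qubit states gives a $2^n$-dimensional feature vector for an $n$-dimensional data point. The cosine/sine local map below is one common choice, used here as an assumption rather than the paper's exact feature map.

```python
import numpy as np
from functools import reduce

def local_map(x):
    """Encode one scalar feature as a qubit state [cos, sin] (a common choice)."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def feature_map(data):
    """Tensor (Kronecker) product of local encodings: a 2**n-dimensional state."""
    return reduce(np.kron, [local_map(x) for x in data])

x = [0.1, 0.7, 0.3]          # n = 3 features
psi = feature_map(x)
print(psi.shape)             # (8,) -- exponential in the data length
print(round(np.sum(psi**2), 6))  # 1.0 -- a valid normalised quantum state
```

Because each local state is normalised, the product state is too, which is what lets the encoded data be prepared as a quantum state on hardware.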
arXiv Detail & Related papers (2021-04-06T02:31:48Z) - An end-to-end trainable hybrid classical-quantum classifier [0.0]
We introduce a hybrid model combining a quantum-inspired tensor network and a variational quantum circuit to perform supervised learning tasks.
This architecture allows for the classical and quantum parts of the model to be trained simultaneously, providing an end-to-end training framework.
arXiv Detail & Related papers (2021-02-04T05:19:54Z) - Information Scrambling in Computationally Complex Quantum Circuits [56.22772134614514]
We experimentally investigate the dynamics of quantum scrambling on a 53-qubit quantum processor.
We show that while operator spreading is captured by an efficient classical model, operator entanglement requires exponentially scaled computational resources to simulate.
arXiv Detail & Related papers (2021-01-21T22:18:49Z) - Recurrent Quantum Neural Networks [7.6146285961466]
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning.
We construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks.
We evaluate the QRNN on MNIST classification, both by feeding the QRNN each image pixel-by-pixel, and by utilising modern data augmentation as a preprocessing step.
arXiv Detail & Related papers (2020-06-25T17:59:44Z)
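The pixel-by-pixel feeding scheme mentioned for the QRNN can be shown with a purely classical stand-in: a toy recurrent unit consumes one pixel per step and emits a binary class probability at the end. The hidden size, weights, and sigmoid readout are illustrative assumptions; a QRNN would replace the recurrence with a parameterised unitary acting on a quantum state.

```python
import numpy as np

rng = np.random.default_rng(2)

def rnn_step(h, x, w_h, w_x):
    """One recurrence step (a QRNN would use a parameterised unitary here)."""
    return np.tanh(h @ w_h + x * w_x)

H = 8                                   # hidden state size
w_h = rng.standard_normal((H, H)) * 0.1
w_x = rng.standard_normal(H) * 0.1
w_out = rng.standard_normal(H)

pixels = rng.random(28 * 28)            # one flattened toy "image"
h = np.zeros(H)
for px in pixels:                       # feed the image pixel-by-pixel
    h = rnn_step(h, px, w_h, w_x)

logit = h @ w_out
p = 1.0 / (1.0 + np.exp(-logit))        # binary class probability
print(0.0 <= p <= 1.0)                  # True
```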
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.