Quantum Vision Transformers
- URL: http://arxiv.org/abs/2209.08167v2
- Date: Tue, 20 Feb 2024 13:26:44 GMT
- Title: Quantum Vision Transformers
- Authors: El Amine Cherrat, Iordanis Kerenidis, Natansh Mathur, Jonas Landman,
Martin Strahm, and Yun Yvonna Li
- Abstract summary: We introduce three types of quantum transformers for training and inference, including a quantum transformer based on compound matrices.
We performed extensive simulations of the quantum transformers on standard medical image datasets, which showed competitive performance.
We implemented our quantum transformers on superconducting quantum computers and obtained encouraging results in experiments with up to six qubits.
- Score: 2.3558144417896583
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, quantum transformers are designed and analysed in detail by
extending the state-of-the-art classical transformer neural network
architectures known to be very performant in natural language processing and
image analysis. Building upon the previous work, which uses parametrised
quantum circuits for data loading and orthogonal neural layers, we introduce
three types of quantum transformers for training and inference, including a
quantum transformer based on compound matrices, which guarantees a theoretical
advantage of the quantum attention mechanism over its classical counterpart in terms of both asymptotic run time and the number of model
parameters. These quantum architectures can be built using shallow quantum
circuits and produce qualitatively different classification models. The three
proposed quantum attention layers vary on the spectrum between closely
following the classical transformers and exhibiting more quantum
characteristics. As building blocks of the quantum transformer, we propose a
novel method for loading a matrix as quantum states as well as two new
trainable quantum orthogonal layers adaptable to different levels of
connectivity and quality of quantum computers. We performed extensive
simulations of the quantum transformers on standard medical image datasets, which showed competitive, and at times better, performance compared to the classical
benchmarks, including the best-in-class classical vision transformers. The
quantum transformers we trained on these small-scale datasets require fewer
parameters compared to standard classical benchmarks. Finally, we implemented
our quantum transformers on superconducting quantum computers and obtained
encouraging results in experiments with up to six qubits.
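
To make the compound-matrix construction mentioned in the abstract concrete, the sketch below computes the k-th order compound matrix of a square matrix classically with NumPy: its entries are the k x k minors det(A[S, T]) indexed by k-subsets of rows and columns, so its dimension grows as C(n, k). This is only an illustrative classical sketch of the underlying mathematical object (the helper name `compound_matrix` is ours, not from the paper); it is not the paper's quantum circuit construction of the attention layer.

```python
# Minimal classical sketch (illustrative, not the paper's quantum construction):
# the k-th order compound matrix of an n x n matrix A collects all k x k minors
# det(A[rows, cols]) indexed by k-subsets, giving a matrix of shape (C(n,k), C(n,k)).
from itertools import combinations

import numpy as np


def compound_matrix(A: np.ndarray, k: int) -> np.ndarray:
    """Return the k-th order compound matrix of a square matrix A."""
    n = A.shape[0]
    subsets = list(combinations(range(n), k))  # k-subsets in lexicographic order
    C = np.empty((len(subsets), len(subsets)))
    for i, rows in enumerate(subsets):
        for j, cols in enumerate(subsets):
            C[i, j] = np.linalg.det(A[np.ix_(rows, cols)])  # k x k minor
    return C


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))
    C2 = compound_matrix(A, 2)  # shape (6, 6), since C(4, 2) = 6
    # Sanity check (Cauchy-Binet): compound matrices are multiplicative,
    # i.e. compound(A @ B, k) == compound(A, k) @ compound(B, k).
    assert np.allclose(compound_matrix(A @ B, 2), C2 @ compound_matrix(B, 2))
    print(C2.shape)
```

The multiplicativity checked at the end (Cauchy-Binet) is the algebraic property that lets compound matrices track compositions of linear maps, which is the kind of structure a compound-matrix-based attention layer can exploit.
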
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Quantum Transfer Learning for MNIST Classification Using a Hybrid Quantum-Classical Approach [0.0]
This research explores the integration of quantum computing with classical machine learning for image classification tasks.
We propose a hybrid quantum-classical approach that leverages the strengths of both paradigms.
The experimental results indicate that while the hybrid model demonstrates the feasibility of integrating quantum computing with classical techniques, the accuracy of the final model, trained on quantum outcomes, is currently lower than the classical model trained on compressed features.
arXiv Detail & Related papers (2024-08-05T22:16:27Z) - Supervised binary classification of small-scale digits images with a trapped-ion quantum processor [56.089799129458875]
We show that a quantum processor can correctly solve the basic classification task considered.
As the capabilities of quantum processors increase, they can become a useful tool for machine learning.
arXiv Detail & Related papers (2024-06-17T18:20:51Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum
State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - A Comparative Analysis of Hybrid-Quantum Classical Neural Networks [5.247197295547863]
This paper performs an extensive comparative analysis between different hybrid quantum-classical machine learning algorithms for image classification.
The performance comparison of the hybrid models, based on the accuracy, provides us with an understanding of hybrid quantum-classical convergence in correlation with the quantum layer count and the qubit count variations in the circuit.
arXiv Detail & Related papers (2024-02-16T09:59:44Z) - Quantum-classical simulation of quantum field theory by quantum circuit
learning [0.0]
We employ quantum circuit learning to simulate quantum field theories (QFTs).
We find that our predictions closely align with the results of rigorous classical calculations.
This hybrid quantum-classical approach illustrates the feasibility of efficiently simulating large-scale QFTs on cutting-edge quantum devices.
arXiv Detail & Related papers (2023-11-27T20:18:39Z) - Quantum Neural Architecture Search with Quantum Circuits Metric and
Bayesian Optimization [2.20200533591633]
We propose a new distance between quantum gates that characterizes their action on every quantum state.
Our approach significantly outperforms the benchmark on three empirical quantum machine learning problems.
arXiv Detail & Related papers (2022-06-28T16:23:24Z) - Tunable photon-mediated interactions between spin-1 systems [68.8204255655161]
We show how to harness multi-level emitters with several optical transitions to engineer photon-mediated interactions between effective spin-1 systems.
Our results expand the quantum simulation toolbox available in cavity QED and quantum nanophotonic setups.
arXiv Detail & Related papers (2022-06-03T14:52:34Z) - Multiclass classification using quantum convolutional neural networks
with hybrid quantum-classical learning [0.5999777817331318]
We propose a quantum machine learning approach based on quantum convolutional neural networks for solving multiclass classification problems.
We use the proposed approach to demonstrate 4-class classification on the MNIST dataset using eight qubits for data encoding and four ancilla qubits.
Our results demonstrate that our solution achieves accuracy comparable to classical convolutional neural networks with a similar number of trainable parameters.
arXiv Detail & Related papers (2022-03-29T09:07:18Z) - Information Scrambling in Computationally Complex Quantum Circuits [56.22772134614514]
We experimentally investigate the dynamics of quantum scrambling on a 53-qubit quantum processor.
We show that while operator spreading is captured by an efficient classical model, operator entanglement requires exponentially scaled computational resources to simulate.
arXiv Detail & Related papers (2021-01-21T22:18:49Z) - Experimental Quantum Generative Adversarial Networks for Image
Generation [93.06926114985761]
We experimentally achieve the learning and generation of real-world hand-written digit images on a superconducting quantum processor.
Our work provides guidance for developing advanced quantum generative models on near-term quantum devices.
arXiv Detail & Related papers (2020-10-13T06:57:17Z)