QKSAN: A Quantum Kernel Self-Attention Network
- URL: http://arxiv.org/abs/2308.13422v2
- Date: Thu, 12 Oct 2023 15:54:30 GMT
- Title: QKSAN: A Quantum Kernel Self-Attention Network
- Authors: Ren-Xin Zhao and Jinjing Shi and Xuelong Li
- Abstract summary: A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
- Score: 53.96779043113156
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Self-Attention Mechanism (SAM) excels at distilling important information
from the interior of data to improve the computational efficiency of models.
Nevertheless, many Quantum Machine Learning (QML) models lack the ability to
distinguish the intrinsic connections of information like SAM, which limits
their effectiveness on massive high-dimensional quantum data. To tackle the
above issue, a Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to
combine the data representation merit of Quantum Kernel Methods (QKM) with the
efficient information extraction capability of SAM. Further, a Quantum Kernel
Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which
ingeniously incorporates the Deferred Measurement Principle (DMP) and
conditional measurement techniques to release half of quantum resources by
mid-circuit measurement, thereby bolstering both feasibility and adaptability.
Simultaneously, the Quantum Kernel Self-Attention Score (QKSAS), with an
exponentially large characterization space, is introduced to accommodate more
information and determine the measurement conditions. Eventually, four QKSAN
sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary
classification on MNIST and Fashion MNIST, where the QKSAS tests and
correlation assessments between noise immunity and learning ability are
executed on the best-performing sub-model. The paramount experimental finding
is that some QKSAN sub-classes reveal a potential learning advantage, achieving
over 98.05% accuracy with far fewer parameters in aggregate than classical
machine learning models. Predictably, QKSAN lays the foundation for future
quantum computers to
perform machine learning on massive amounts of data while driving advances in
areas such as quantum computer vision.
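The core idea of using a quantum kernel as a self-attention score can be illustrated classically. The following is a minimal sketch, assuming a single-qubit RY angle embedding as a stand-in for the paper's quantum feature map; the actual QKSAN circuits, deferred measurement, and conditional measurement techniques are not modeled here.

```python
import numpy as np

def feature_state(x):
    # Single-qubit angle embedding |phi(x)> = RY(x)|0>,
    # a minimal stand-in for a quantum feature map.
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x, y):
    # Fidelity kernel |<phi(x)|phi(y)>|^2, the quantity a quantum
    # kernel method estimates by measuring an overlap circuit.
    return np.abs(feature_state(x) @ feature_state(y)) ** 2

def kernel_attention(xs):
    # Kernel values play the role of (unnormalized) self-attention
    # scores between inputs; each row is normalized to sum to 1.
    K = np.array([[quantum_kernel(a, b) for b in xs] for a in xs])
    return K / K.sum(axis=1, keepdims=True)

xs = [0.1, 0.8, 2.0]
W = kernel_attention(xs)  # 3x3 row-stochastic attention matrix
```

For this embedding the kernel reduces to cos²((x − y)/2), so identical inputs score 1 and orthogonal states score 0; QKSAS generalizes this to multi-qubit feature maps whose characterization space grows exponentially with qubit count.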
Related papers
- Quantum Mixed-State Self-Attention Network [3.1280831148667105]
This paper introduces a novel Quantum Mixed-State Attention Network (QMSAN), which integrates the principles of quantum computing with classical machine learning algorithms.
The QMSAN model employs a quantum attention mechanism based on mixed states, enabling efficient direct estimation of similarity between queries and keys within the quantum domain.
Our study investigates the model's robustness in different quantum noise environments, showing that QMSAN possesses commendable robustness to low noise.
arXiv Detail & Related papers (2024-03-05T11:29:05Z) - GQHAN: A Grover-inspired Quantum Hard Attention Network [53.96779043113156]
A Grover-inspired Quantum Hard Attention Mechanism (GQHAM) is proposed.
GQHAN adeptly surmounts the non-differentiability hurdle, surpassing the efficacy of extant quantum soft self-attention mechanisms.
The proposal of GQHAN lays the foundation for future quantum computers to process large-scale data, and promotes the development of quantum computer vision.
arXiv Detail & Related papers (2024-01-25T11:11:16Z) - A natural NISQ model of quantum self-attention mechanism [11.613292674155685]
Self-attention mechanism (SAM) has demonstrated remarkable success in various applications.
Quantum neural networks (QNNs) have been developed as a novel learning model.
We propose a completely natural way of implementing SAM in QNNs, resulting in the quantum self-attention mechanism (QSAM).
arXiv Detail & Related papers (2023-05-25T03:09:17Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Potential and limitations of quantum extreme learning machines [55.41644538483948]
We present a framework to model QRCs and QELMs, showing that they can be concisely described via single effective measurements.
Our analysis paves the way to a more thorough understanding of the capabilities and limitations of both QELMs and QRCs.
arXiv Detail & Related papers (2022-10-03T09:32:28Z) - Deterministic and random features for large-scale quantum kernel machine [0.9404723842159504]
We show that the quantum kernel method (QKM) can be made scalable by using our proposed deterministic and random features.
Our numerical experiment, using datasets including $O(1{,}000) \sim O(10{,}000)$ training data, supports the validity of our method.
arXiv Detail & Related papers (2022-09-05T13:22:34Z) - QSAN: A Near-term Achievable Quantum Self-Attention Network [73.15524926159702]
Self-Attention Mechanism (SAM) is good at capturing the internal connections of features.
A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices.
arXiv Detail & Related papers (2022-07-14T12:22:51Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - Quantum machine learning with differential privacy [3.2442879131520126]
We develop a hybrid quantum-classical model that is trained to preserve privacy using differentially private optimization algorithm.
Experiments demonstrate that differentially private QML can protect user-sensitive information without diminishing model accuracy.
arXiv Detail & Related papers (2021-03-10T18:06:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.