Projection Valued Measure-based Quantum Machine Learning for Multi-Class
Classification
- URL: http://arxiv.org/abs/2210.16731v1
- Date: Sun, 30 Oct 2022 03:12:53 GMT
- Title: Projection Valued Measure-based Quantum Machine Learning for Multi-Class
Classification
- Authors: Won Joon Yun, Hankyul Baek, and Joongheon Kim
- Abstract summary: We propose a novel framework for multi-class classification using the projection-valued measure (PVM).
Our framework outperforms the state-of-the-art (SOTA) on various datasets using no more than 6 qubits.
- Score: 10.90994913062223
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In recent years, quantum machine learning (QML) has been actively used for
various tasks, e.g., classification, reinforcement learning, and adversarial
learning. However, these QML studies have yet to achieve complex tasks because scalability issues in input and output remain the biggest hurdle in QML. To cope with this problem, we aim to solve the output scalability issue. Motivated by this challenge, we focus on the projection-valued measure (PVM), which exploits the nature of probability amplitudes in quantum statistical mechanics. By leveraging
PVM, the output dimension is expanded from the number of qubits $q$ to
$\mathcal{O}(2^q)$. We propose a novel QML framework for multi-class
classification. We corroborate that our framework outperforms the state-of-the-art (SOTA) on various datasets using no more than 6 qubits. Furthermore, our PVM-based QML outperforms the SOTA by 42.2%.
Related papers
- Security Concerns in Quantum Machine Learning as a Service [2.348041867134616]
Quantum machine learning (QML) is a category of algorithms that employ variational quantum circuits (VQCs) to tackle machine learning tasks.
Recent discoveries have shown that QML models can effectively generalize from limited training data samples.
QML represents a hybrid model that utilizes both classical and quantum computing resources.
arXiv Detail & Related papers (2024-08-18T18:21:24Z) - EfficientQAT: Efficient Quantization-Aware Training for Large Language Models [50.525259103219256]
Quantization-aware training (QAT) offers a solution by reducing memory consumption through low-bit representations with minimal accuracy loss.
We propose Efficient Quantization-Aware Training (EfficientQAT), a more feasible QAT algorithm.
EfficientQAT involves two consecutive phases: block-wise training of all parameters (Block-AP) and end-to-end training of quantization parameters (E2E-QP).
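The two consecutive phases named above can be sketched roughly as follows. This is a hedged PyTorch sketch under assumptions made here, not the EfficientQAT reference implementation: `model.blocks`, the `"quant"` parameter-name filter, and the calibration loader are placeholders.

```python
# Hedged sketch of a two-phase QAT scheme in the spirit of Block-AP / E2E-QP.
# All structural hooks (model.blocks, "quant" naming, loaders) are hypothetical.
import torch
import torch.nn.functional as F

def block_ap(model, calib_loader, lr=1e-4):
    # Phase 1 (Block-AP): train each block in isolation, updating all of its
    # parameters to match the full-precision block's captured outputs.
    for block in model.blocks:
        opt = torch.optim.AdamW(block.parameters(), lr=lr)
        for x, fp_out in calib_loader:      # (input, full-precision output) pairs
            loss = F.mse_loss(block(x), fp_out)
            opt.zero_grad(); loss.backward(); opt.step()

def e2e_qp(model, train_loader, lr=1e-5):
    # Phase 2 (E2E-QP): freeze weights and train only quantization parameters
    # (e.g., step sizes / zero points) end to end on the task loss.
    qparams = [p for n, p in model.named_parameters() if "quant" in n]
    opt = torch.optim.AdamW(qparams, lr=lr)
    for x, y in train_loader:
        loss = F.cross_entropy(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
```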
arXiv Detail & Related papers (2024-07-10T17:53:30Z) - LLMC: Benchmarking Large Language Model Quantization with a Versatile Compression Toolkit [55.73370804397226]
Quantization, a key compression technique, can effectively mitigate the resource demands of large language models by compressing and accelerating them.
We present LLMC, a plug-and-play compression toolkit, to fairly and systematically explore the impact of quantization.
Powered by this versatile toolkit, our benchmark covers three key aspects: calibration data, algorithms (three strategies), and data formats.
arXiv Detail & Related papers (2024-05-09T11:49:05Z) - Benchmarking Quantum Generative Learning: A Study on Scalability and Noise Resilience using QUARK [0.3624329910445628]
This paper investigates the scalability and noise resilience of quantum generative learning applications.
We employ rigorous benchmarking techniques to track progress and identify challenges in scaling QML algorithms.
We show that QGANs are less affected by the curse of dimensionality than QCBMs, and we characterize the extent to which QCBMs are resilient to noise.
arXiv Detail & Related papers (2024-03-27T15:05:55Z) - Unifying (Quantum) Statistical and Parametrized (Quantum) Algorithms [65.268245109828]
We take inspiration from Kearns' SQ oracle and Valiant's weak evaluation oracle.
We introduce an extensive yet intuitive framework that yields unconditional lower bounds for learning from evaluation queries.
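For reference, Kearns' SQ oracle mentioned above has a standard textbook definition; the notation below is the usual one and may differ from the paper's.

```latex
% Standard definition of Kearns' statistical query (SQ) oracle.
% Given a query $\phi : X \times \{0,1\} \to [-1,1]$ and tolerance $\tau > 0$,
% the oracle returns any value $v$ that is $\tau$-close to the true expectation
% under the data distribution $D$, where $f$ is the unknown target concept:
\[
  \mathrm{SQ}(\phi, \tau) \longmapsto v
  \quad\text{with}\quad
  \bigl| v - \mathbb{E}_{x \sim D}\bigl[\phi(x, f(x))\bigr] \bigr| \le \tau .
\]
```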
arXiv Detail & Related papers (2023-10-26T18:23:21Z) - A Survey on Quantum Machine Learning: Current Trends, Challenges, Opportunities, and the Road Ahead [5.629434388963902]
Quantum Computing (QC) is claimed to improve the efficiency of solving complex problems compared to classical computing.
When QC is integrated with Machine Learning (ML), it creates a Quantum Machine Learning (QML) system.
This paper aims to provide a thorough understanding of the foundational concepts of QC and its notable advantages over classical computing.
arXiv Detail & Related papers (2023-10-16T11:52:54Z) - QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
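The Deferred Measurement Principle (DMP) invoked above can be illustrated independently of QKSAN. The sketch below, using PennyLane's mid-circuit measurement API (chosen here for illustration, not taken from the paper), shows a classically conditioned gate being rewritten into an end-of-circuit, quantum-controlled form.

```python
# Illustration of the deferred measurement principle (not QKSAN code):
# a mid-circuit measurement controlling a later gate can be pushed to the
# end of the circuit and replaced by a quantum-controlled operation.
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def conditional_circuit():
    qml.Hadamard(wires=0)
    m = qml.measure(0)                 # mid-circuit measurement
    qml.cond(m, qml.PauliX)(wires=1)   # gate conditioned on the outcome
    return qml.probs(wires=1)

# defer_measurements rewrites the circuit so all measurements occur at the
# end, with the conditional gate made quantum-controlled (here, a CNOT).
deferred = qml.defer_measurements(conditional_circuit)
print(deferred())   # same statistics as conditional_circuit()
```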
arXiv Detail & Related papers (2023-08-25T15:08:19Z) - Enhancing Quantum Support Vector Machines through Variational Kernel
Training [0.0]
This paper focuses on two existing methods: the quantum kernel SVM (QK-SVM) and the quantum variational SVM (QV-SVM).
We present a novel approach that synergizes the strengths of QK-SVM and QV-SVM to enhance accuracy.
Our results demonstrate that QVK-SVM holds tremendous potential as a reliable and transformative tool for QML applications.
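To make the QK-SVM half of this combination concrete, here is a minimal fidelity-kernel sketch feeding a classical SVM. It is illustrative only; unlike the paper's QVK-SVM, the embedding below carries no trained variational parameters, and the data are random placeholders.

```python
# Sketch of a fidelity-based quantum kernel fed to a classical SVM
# (illustrative; the paper's QVK-SVM additionally trains the embedding).
import numpy as np
import pennylane as qml
from sklearn.svm import SVC

n_wires = 2
dev = qml.device("default.qubit", wires=n_wires)

@qml.qnode(dev)
def embed_state(x):
    qml.AngleEmbedding(x, wires=range(n_wires))
    return qml.state()

def kernel(x1, x2):
    # Fidelity kernel k(x1, x2) = |<psi(x1)|psi(x2)>|^2.
    return np.abs(np.vdot(embed_state(x1), embed_state(x2))) ** 2

def kernel_matrix(A, B):
    return np.array([[kernel(a, b) for b in B] for a in A])

X = np.random.uniform(0, np.pi, size=(20, n_wires))   # placeholder data
y = np.random.randint(0, 2, size=20)
clf = SVC(kernel="precomputed").fit(kernel_matrix(X, X), y)
print(clf.predict(kernel_matrix(X[:3], X)))
```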
arXiv Detail & Related papers (2023-05-10T11:30:43Z) - MA2QL: A Minimalist Approach to Fully Decentralized Multi-Agent
Reinforcement Learning [63.46052494151171]
We propose multi-agent alternate Q-learning (MA2QL), where agents take turns to update their Q-functions by Q-learning.
We prove that when each agent guarantees $\varepsilon$-convergence at each turn, their joint policy converges to a Nash equilibrium.
Results show MA2QL consistently outperforms IQL, which verifies the effectiveness of MA2QL, despite such minimal changes.
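The turn-taking scheme is simple enough to sketch in toy tabular form. The sketch below is illustrative only; the environment hook `step_for` is a placeholder invented here, not the paper's setup.

```python
# Toy tabular sketch of alternate Q-learning: agents update their Q-tables
# one at a time while the others' policies are held fixed, so each turn is
# effectively a stationary single-agent Q-learning problem.
import numpy as np

n_agents, n_states, n_actions = 2, 5, 3
Q = [np.zeros((n_states, n_actions)) for _ in range(n_agents)]

def q_update(q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    q[s, a] += alpha * (r + gamma * q[s_next].max() - q[s, a])

def ma2ql_round(env, turns_per_agent=100):
    for i in range(n_agents):                      # agents take turns
        for _ in range(turns_per_agent):
            s, a, r, s_next = env.step_for(i, Q)   # hypothetical env hook
            q_update(Q[i], s, a, r, s_next)
```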
arXiv Detail & Related papers (2022-09-17T04:54:32Z) - QSAN: A Near-term Achievable Quantum Self-Attention Network [73.15524926159702]
The Self-Attention Mechanism (SAM) is good at capturing the internal connections of features.
A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices.
arXiv Detail & Related papers (2022-07-14T12:22:51Z) - Subtleties in the trainability of quantum machine learning models [0.0]
We show that gradient scaling results for Variational Quantum Algorithms can be applied to study the gradient scaling of Quantum Machine Learning models.
Our results indicate that features deemed detrimental for VQA trainability can also lead to issues such as barren plateaus in QML.
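The gradient-scaling claim can be probed numerically in the usual way: estimate the variance of one cost-gradient entry over random initializations and watch it shrink as qubits are added. The sketch below (ansatz, observable, and sizes chosen here, not taken from the paper) follows that recipe.

```python
# Numerical probe of barren-plateau-style gradient scaling (illustrative).
import pennylane as qml
from pennylane import numpy as np

def grad_variance(n_qubits, n_layers=5, n_samples=50):
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev)
    def cost(params):
        qml.StronglyEntanglingLayers(params, wires=range(n_qubits))
        return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

    shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
    grads = []
    for _ in range(n_samples):
        params = np.random.uniform(0, 2 * np.pi, size=shape, requires_grad=True)
        grads.append(qml.grad(cost)(params)[0, 0, 0])  # one fixed entry
    return np.var(grads)

for n in (2, 4, 6):
    print(n, grad_variance(n))   # variance decays as qubit count grows
```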
arXiv Detail & Related papers (2021-10-27T20:28:53Z)