Probabilistic Quantum SVM Training on Ising Machine
- URL: http://arxiv.org/abs/2503.16363v1
- Date: Thu, 20 Mar 2025 17:20:26 GMT
- Title: Probabilistic Quantum SVM Training on Ising Machine
- Authors: Haoqi He, Yan Xiao
- Abstract summary: We propose a probabilistic quantum SVM training framework suitable for Coherent Ising Machines (CIMs). We employ batch processing and multi-batch ensemble strategies, enabling small-scale quantum devices to train SVMs on larger datasets. Our method is validated through simulations and real-machine experiments on binary and multi-class datasets.
- Score: 2.44505480142099
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum computing holds significant potential to accelerate machine learning algorithms, especially in solving optimization problems like those encountered in Support Vector Machine (SVM) training. However, current QUBO-based Quantum SVM (QSVM) methods rely solely on binary optimal solutions, limiting their ability to identify fuzzy boundaries in data. Additionally, the limited qubit count in contemporary quantum devices constrains training on larger datasets. In this paper, we propose a probabilistic quantum SVM training framework suitable for Coherent Ising Machines (CIMs). By formulating the SVM training problem as a QUBO model, we leverage CIMs' energy minimization capabilities and introduce a Boltzmann distribution-based probabilistic approach to better approximate optimal SVM solutions, enhancing robustness. To address qubit limitations, we employ batch processing and multi-batch ensemble strategies, enabling small-scale quantum devices to train SVMs on larger datasets and support multi-class classification tasks via a one-vs-one approach. Our method is validated through simulations and real-machine experiments on binary and multi-class datasets. On the banknote binary classification dataset, our CIM-based QSVM, utilizing an energy-based probabilistic approach, achieved up to 20% higher accuracy compared to the original QSVM, while training up to $10^4$ times faster than simulated annealing methods. Compared with classical SVM, our approach either matched or reduced training time. On the IRIS three-class dataset, our improved QSVM outperformed existing QSVM models in all key metrics. As quantum technology advances, increased qubit counts are expected to further enhance QSVM performance relative to classical SVM.
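The QUBO formulation and Boltzmann-distribution readout described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy dataset, bit depth `K`, penalty weight `xi`, and temperature `T` are all assumed values, and exhaustive enumeration of spin configurations stands in for the samples a CIM would produce.

```python
# Hedged sketch: SVM dual as a QUBO, with a Boltzmann-weighted (probabilistic)
# readout instead of keeping only the single ground-state solution.
import itertools
import numpy as np

# Toy linearly separable data: 4 points with labels +-1 (illustrative only).
X = np.array([[0.0, 1.0], [1.0, 1.0], [0.0, -1.0], [-1.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
N, K = len(y), 2          # K bits encode each multiplier: alpha_i = sum_k 2^k b_ik
xi = 5.0                  # penalty weight for the constraint sum_i alpha_i y_i = 0
kernel = X @ X.T          # linear kernel

def energy(bits):
    """Negated SVM dual objective plus the quadratic constraint penalty."""
    alpha = np.array([sum(2**k * bits[i * K + k] for k in range(K)) for i in range(N)])
    dual = 0.5 * alpha @ (np.outer(y, y) * kernel) @ alpha - alpha.sum()
    return dual + xi * (alpha @ y) ** 2

# Enumerate all 2^(N*K) bit configurations; a CIM would sample these instead.
states = list(itertools.product([0, 1], repeat=N * K))
E = np.array([energy(b) for b in states])

# Boltzmann-distribution readout: weight every configuration by exp(-E/T)
# to obtain soft (fuzzy-boundary) multipliers rather than a binary optimum.
T = 1.0
w = np.exp(-(E - E.min()) / T)
w /= w.sum()
alphas = np.array([[sum(2**k * b[i * K + k] for k in range(K)) for i in range(N)]
                   for b in states])
alpha_soft = w @ alphas   # probability-weighted multipliers

# Decision function from the soft multipliers; bias from the largest multiplier.
wvec = (alpha_soft * y) @ X
sv = int(np.argmax(alpha_soft))
b = y[sv] - wvec @ X[sv]
pred = np.sign(X @ wvec + b)
print(pred)
```

The key difference from a purely binary QSVM is the weighted average over all sampled low-energy states, which yields fractional multipliers even though each individual CIM sample is binary.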
Related papers
- EfficientQAT: Efficient Quantization-Aware Training for Large Language Models [50.525259103219256]
Quantization-aware training (QAT) offers a solution by reducing memory consumption through low-bit representations with minimal accuracy loss.
We propose Efficient Quantization-Aware Training (EfficientQAT), a more feasible QAT algorithm.
EfficientQAT involves two consecutive phases: block-wise training of all parameters (Block-AP) and end-to-end training of quantization parameters (E2E-QP).
arXiv Detail & Related papers (2024-07-10T17:53:30Z)
- Validating Large-Scale Quantum Machine Learning: Efficient Simulation of Quantum Support Vector Machines Using Tensor Networks [17.80970950814512]
We present an efficient tensor-network-based approach for simulating large-scale quantum circuits. Our simulator successfully handles QSVMs with up to 784 qubits, completing simulations within seconds on a single high-performance GPU.
arXiv Detail & Related papers (2024-05-04T10:37:01Z)
- Local Binary and Multiclass SVMs Trained on a Quantum Annealer [0.8399688944263844]
In recent years, with the advent of working quantum annealers, hybrid SVM models characterised by quantum training and classical execution have been introduced.
These models have demonstrated comparable performance to their classical counterparts.
However, they are limited in training set size due to the restricted connectivity of current quantum annealers.
arXiv Detail & Related papers (2024-03-13T14:37:00Z)
- Multi-class Support Vector Machine with Maximizing Minimum Margin [67.51047882637688]
Support Vector Machine (SVM) is a prominent machine learning technique widely applied in pattern recognition tasks.
We propose a novel method for multi-class SVM that incorporates pairwise class loss considerations and maximizes the minimum margin.
Empirical evaluations demonstrate the effectiveness and superiority of our proposed method over existing multi-classification methods.
arXiv Detail & Related papers (2023-12-11T18:09:55Z)
- QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z)
- Enhancing Quantum Support Vector Machines through Variational Kernel Training [0.0]
This paper focuses on two existing methods: the quantum kernel SVM (QK-SVM) and the quantum variational SVM (QV-SVM).
We present a novel approach that synergizes the strengths of QK-SVM and QV-SVM to enhance accuracy.
Our results demonstrate that the combined QVK-SVM holds tremendous potential as a reliable and transformative tool for QML applications.
arXiv Detail & Related papers (2023-05-10T11:30:43Z)
- QSAN: A Near-term Achievable Quantum Self-Attention Network [73.15524926159702]
The Self-Attention Mechanism (SAM) excels at capturing the internal connections of features.
A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices.
arXiv Detail & Related papers (2022-07-14T12:22:51Z)
- Boosting Method for Automated Feature Space Discovery in Supervised Quantum Machine Learning Models [2.9419410749069255]
Quantum Support Vector Machines (QSVM) have become an important tool in research and applications of quantum kernel methods.
We propose a boosting approach for building ensembles of QSVM models and assess performance improvement across multiple datasets.
arXiv Detail & Related papers (2022-05-24T16:56:22Z)
- Handling Imbalanced Classification Problems With Support Vector Machines via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms for binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z)
- Practical application improvement to Quantum SVM: theory to practice [0.9449650062296824]
We use quantum feature maps to translate data into quantum states and build the SVM kernel out of these quantum states.
We show in experiments that this allows QSVM to perform comparably to classical SVM regardless of the complexity of the data sets.
arXiv Detail & Related papers (2020-12-14T17:19:17Z)
- On Coresets for Support Vector Machines [61.928187390362176]
A coreset is a small, representative subset of the original data points.
We show that our algorithm can be used to extend the applicability of any off-the-shelf SVM solver to streaming, distributed, and dynamic data settings.
arXiv Detail & Related papers (2020-02-15T23:25:12Z)
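The coreset idea from the entry above, training an SVM on a small representative subset instead of the full data, can be illustrated with a rough sketch. This is not the paper's algorithm: real coresets use importance (sensitivity) sampling with per-point weights, whereas this assumed example just samples uniformly, and a Pegasos-style subgradient loop stands in for an off-the-shelf SVM solver. The data and all parameters are invented for illustration.

```python
# Hedged sketch: train a linear SVM on a small sampled subset ("coreset"
# stand-in) and evaluate on the full dataset.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic well-separated data: 400 points in 2-D, labels +-1.
n = 400
Xp = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(n // 2, 2))
Xm = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(n // 2, 2))
X = np.vstack([Xp, Xm])
y = np.concatenate([np.ones(n // 2), -np.ones(n // 2)])

# Uniformly sampled subset (a true coreset would use sensitivity sampling).
m = 20
idx = rng.choice(n, size=m, replace=False)
Xc, yc = X[idx], y[idx]

# Pegasos-style stochastic subgradient SVM on the subset only.
# No bias term: this toy data is symmetric about the origin.
lam, w = 0.01, np.zeros(2)
for t in range(1, 2001):
    i = rng.integers(m)
    eta = 1.0 / (lam * t)
    margin = yc[i] * (Xc[i] @ w)
    w *= (1.0 - eta * lam)      # regularization shrink
    if margin < 1:              # hinge-loss subgradient step
        w += eta * yc[i] * Xc[i]

# Model trained on 20 points, evaluated on all 400.
acc = float(np.mean(np.sign(X @ w) == y))
print(acc)
```

The point of the sketch is the workflow, not the sampler: because the subset preserves the geometry of the full data, the hyperplane fit on 5% of the points classifies the rest, which is what makes coresets useful for streaming and distributed settings.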
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.