Local Binary and Multiclass SVMs Trained on a Quantum Annealer
- URL: http://arxiv.org/abs/2403.08584v1
- Date: Wed, 13 Mar 2024 14:37:00 GMT
- Title: Local Binary and Multiclass SVMs Trained on a Quantum Annealer
- Authors: Enrico Zardini, Amer Delilbasic, Enrico Blanzieri, Gabriele Cavallaro,
Davide Pastorello
- Abstract summary: In recent years, with the advent of working quantum annealers, hybrid SVM models characterised by quantum training and classical execution have been introduced.
These models have demonstrated comparable performance to their classical counterparts.
However, they are limited in the training set size due to the restricted connectivity of the current quantum annealers.
- Score: 0.8399688944263844
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Support vector machines (SVMs) are widely used machine learning models (e.g.,
in remote sensing), with formulations for both classification and regression
tasks. In recent years, with the advent of working quantum annealers, hybrid
SVM models characterised by quantum training and classical execution have been
introduced. These models have demonstrated comparable performance to their
classical counterparts. However, they are limited in the training set size due
to the restricted connectivity of the current quantum annealers. Hence, to take
advantage of large datasets (like those related to Earth observation), a
strategy is required. In the classical domain, local SVMs, namely, SVMs trained
on the data samples selected by a k-nearest neighbors model, have already
proven successful. Here, the local application of quantum-trained SVM models is
proposed and empirically assessed. In particular, this approach allows
overcoming the constraints on the training set size of the quantum-trained
models while enhancing their performance. In practice, the FaLK-SVM method,
designed for efficient local SVMs, has been combined with quantum-trained SVM
models for binary and multiclass classification. In addition, for comparison,
FaLK-SVM has been interfaced for the first time with a classical single-step
multiclass SVM model (CS SVM). Concerning the empirical evaluation, D-Wave's
quantum annealers and real-world datasets taken from the remote sensing domain
have been employed. The results have shown not only the effectiveness and
scalability of the proposed approach but also its practical applicability in a
real-world large-scale scenario.
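As a concrete illustration of the local approach described above, the sketch below trains a small SVM on the k nearest neighbours of each test point. It is a classical stand-in: scikit-learn's SVC replaces the quantum-trained (annealer-based) local models used in the paper, FaLK-SVM's cover-tree-based pre-computation of local models is not reproduced, and all parameter values are illustrative.

```python
# Minimal sketch of a local SVM classifier in the spirit of the paper:
# each test point is classified by an SVM trained only on its k nearest
# training samples.  A classical sklearn SVC stands in for the
# quantum-trained (annealer-based) SVM used in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC


def local_svm_predict(X_train, y_train, X_test, k=20, C=1.0, gamma="scale"):
    """Classify each test point with an SVM fitted on its k nearest neighbors."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    _, neighbor_idx = nn.kneighbors(X_test)

    y_pred = np.empty(len(X_test), dtype=y_train.dtype)
    for i, idx in enumerate(neighbor_idx):
        X_loc, y_loc = X_train[idx], y_train[idx]
        if len(np.unique(y_loc)) == 1:
            # All neighbors share one label: no SVM needed.
            y_pred[i] = y_loc[0]
            continue
        # Small local problem -- this is where a quantum-trained SVM
        # (QUBO formulation solved on an annealer) would be plugged in.
        clf = SVC(C=C, gamma=gamma, kernel="rbf").fit(X_loc, y_loc)
        y_pred[i] = clf.predict(X_test[i : i + 1])[0]
    return y_pred


if __name__ == "__main__":
    X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                               n_classes=3, n_clusters_per_class=1, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    y_hat = local_svm_predict(X_tr, y_tr, X_te, k=30)
    print("local SVM accuracy:", np.mean(y_hat == y_te))
```

Keeping each local problem small (here at most k samples) is what lets an annealer-trained SVM be applied despite the training-set-size limits imposed by current hardware connectivity.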
Related papers
- Recursive Learning of Asymptotic Variational Objectives [49.69399307452126]
General state-space models (SSMs) are widely used in statistical machine learning and are among the most classical generative models for sequential time-series data.
Online sequential IWAE (OSIWAE) allows for online learning of both model parameters and a Markovian recognition model for inferring latent states.
This approach is more theoretically well-founded than recently proposed online variational SMC methods.
arXiv Detail & Related papers (2024-11-04T16:12:37Z) - Transferable Post-training via Inverse Value Learning [83.75002867411263]
We propose modeling changes at the logits level during post-training using a separate neural network (i.e., the value network).
After training this network on a small base model using demonstrations, this network can be seamlessly integrated with other pre-trained models during inference.
We demonstrate that the resulting value network has broad transferability across pre-trained models of different parameter sizes.
arXiv Detail & Related papers (2024-10-28T13:48:43Z) - Efficient Training of One Class Classification-SVMs [0.0]
This study examines the use of a highly effective training method for one-class classification.
In this paper, an effective algorithm for dual soft-margin one-class SVM training is presented.
arXiv Detail & Related papers (2023-09-28T15:35:16Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing
Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that quantum circuit Born machines (QCBMs) are more efficient in the data-limited regime than other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - A Single-Step Multiclass SVM based on Quantum Annealing for Remote
Sensing Data Classification [26.80167258721593]
This work proposes a novel quantum SVM for direct multiclass classification based on quantum annealing, called Quantum Multiclass SVM (QMSVM); a generic QUBO encoding of annealer-based SVM training is sketched after this list.
The main objective of this work is to evaluate the feasibility, accuracy, and time performance of this approach.
Experiments have been performed on the D-Wave Advantage quantum annealer for a classification problem on remote sensing data.
arXiv Detail & Related papers (2023-03-21T09:51:19Z) - SWARM Parallelism: Training Large Models Can Be Surprisingly
Communication-Efficient [69.61083127540776]
Deep learning applications benefit from using large models with billions of parameters.
Training these models is notoriously expensive due to the need for specialized HPC clusters.
We consider alternative setups for training large models: using cheap "preemptible" instances or pooling existing resources from multiple regions.
arXiv Detail & Related papers (2023-01-27T18:55:19Z) - Binary classifiers for noisy datasets: a comparative study of existing
quantum machine learning frameworks and some new approaches [0.0]
We apply Quantum Machine Learning frameworks to improve binary classification.
Noisy datasets are common in the financial domain.
The new models exhibit better learning characteristics under asymmetrical noise in the dataset.
arXiv Detail & Related papers (2021-11-05T10:29:05Z) - Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z) - Practical application improvement to Quantum SVM: theory to practice [0.9449650062296824]
We use quantum feature maps to translate data into quantum states and build the SVM kernel out of these quantum states; a minimal fidelity-kernel sketch is given after this list.
We show in experiments that this allows QSVM to perform on par with classical SVM regardless of the complexity of the data sets.
arXiv Detail & Related papers (2020-12-14T17:19:17Z) - A quantum extension of SVM-perf for training nonlinear SVMs in almost
linear time [0.2855485723554975]
We propose a quantum algorithm for training nonlinear support vector machines (SVM) for feature space learning.
Based on the classical SVM-perf algorithm of Joachims, our algorithm has a running time which scales linearly in the number of training examples.
arXiv Detail & Related papers (2020-06-18T06:25:45Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
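As noted in the QMSVM entry above, annealer-based SVM training casts the optimisation as a QUBO. The sketch below uses a common binary encoding of the kernel SVM dual, with a soft penalty replacing the equality constraint; it is not the exact formulation of QMSVM or of the main paper, the number of bits, encoding base, and penalty weight are illustrative choices, and an exhaustive search over a tiny problem stands in for the quantum annealer.

```python
# Sketch of annealer-style SVM training: encode the kernel SVM dual as a
# QUBO over a binary expansion of the Lagrange multipliers and minimize it.
# Here an exhaustive search over a tiny problem replaces the quantum annealer.
import itertools
import numpy as np


def rbf_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def train_qubo_svm(X, y, n_bits=2, base=2.0, penalty=1.0, gamma=1.0):
    """y in {-1,+1}.  alpha_n = sum_k base**k * a[n*n_bits + k], a binary."""
    N = len(X)
    K = rbf_kernel(X, X, gamma)
    n_vars = N * n_bits
    weights = base ** np.arange(n_bits)          # binary-expansion weights

    # Coefficients of the energy
    #   E = sum_{n,m} alpha_n alpha_m y_n y_m (K_nm/2 + penalty) - sum_n alpha_n
    # where the 'penalty' part softly enforces sum_n alpha_n y_n = 0.
    Q = np.zeros((n_vars, n_vars))
    h = np.zeros(n_vars)
    for n in range(N):
        for k in range(n_bits):
            i = n * n_bits + k
            h[i] = -weights[k]
            for m in range(N):
                for j in range(n_bits):
                    Q[i, m * n_bits + j] = (weights[k] * weights[j] *
                                            y[n] * y[m] * (K[n, m] / 2 + penalty))

    # Brute force over all binary assignments -- stand-in for the annealer.
    best_a, best_e = None, np.inf
    for bits in itertools.product([0, 1], repeat=n_vars):
        a = np.array(bits, dtype=float)
        e = a @ Q @ a + h @ a
        if e < best_e:
            best_a, best_e = a, e

    alpha = best_a.reshape(N, n_bits) @ weights   # decode the multipliers
    # Simple bias estimate from points with non-zero alpha.
    sv = alpha > 0
    b = np.mean(y[sv] - (alpha * y) @ K[:, sv]) if sv.any() else 0.0
    return alpha, b


def predict(X_train, y_train, alpha, b, X_test, gamma=1.0):
    K = rbf_kernel(X_train, X_test, gamma)
    return np.sign((alpha * y_train) @ K + b)


if __name__ == "__main__":
    X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([-1, -1, 1, 1])
    alpha, b = train_qubo_svm(X, y)
    print("alpha:", alpha, "bias:", b)
    print("train predictions:", predict(X, y, alpha, b, X))
```

On real hardware the same QUBO would have to be embedded onto the annealer's qubit graph, which is the source of the training-set-size restriction discussed in the main abstract and the motivation for the local approach.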
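The quantum-feature-map QSVM entry above builds the SVM kernel from overlaps of quantum states. The minimal numpy sketch below assumes a simple one-qubit-per-feature angle encoding (the paper's actual feature-map circuits are not reproduced) and feeds the resulting fidelity kernel to a classical SVM as a precomputed Gram matrix.

```python
# Sketch of a fidelity ("quantum") kernel: map each sample to a product
# state via angle encoding and use squared state overlaps as the SVM kernel.
# This illustrates the idea only; real QSVMs estimate the overlaps on
# quantum hardware with entangling feature-map circuits.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def angle_encode(x):
    """One qubit per feature: |phi(x)> = tensor_j (cos(x_j/2), sin(x_j/2))."""
    state = np.array([1.0])
    for xj in x:
        qubit = np.array([np.cos(xj / 2.0), np.sin(xj / 2.0)])
        state = np.kron(state, qubit)
    return state


def fidelity_kernel(XA, XB):
    SA = np.array([angle_encode(x) for x in XA])
    SB = np.array([angle_encode(x) for x in XB])
    return np.abs(SA @ SB.T) ** 2          # squared overlaps |<phi|phi'>|^2


if __name__ == "__main__":
    X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="precomputed").fit(fidelity_kernel(X_tr, X_tr), y_tr)
    acc = clf.score(fidelity_kernel(X_te, X_tr), y_te)
    print("fidelity-kernel SVM accuracy:", acc)
```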
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.