A Multi-Class SWAP-Test Classifier
- URL: http://arxiv.org/abs/2302.02994v1
- Date: Mon, 6 Feb 2023 18:31:43 GMT
- Title: A Multi-Class SWAP-Test Classifier
- Authors: S M Pillay, I Sinayskiy, E Jembere and F Petruccione
- Abstract summary: This work presents the first multi-class SWAP-Test classifier inspired by its binary predecessor and the use of label states in recent work.
In contrast to previous work, the number of qubits required, the measurement strategy, and the topology of the circuits used are invariant to the number of classes.
Both analytical results and numerical simulations show that this classifier is not only effective when applied to diverse classification problems but also robust to certain conditions of noise.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-class classification problems are fundamental in many varied domains in
research and industry. To solve multi-class classification problems, heuristic
strategies such as One-vs-One or One-vs-All can be employed. However, these
strategies require the number of binary classification models developed to grow
with the number of classes. Recent work in quantum machine learning has seen
the development of multi-class quantum classifiers that circumvent this growth
by learning a mapping between the data and a set of label states. This work
presents the first multi-class SWAP-Test classifier inspired by its binary
predecessor and the use of label states in recent work. With this classifier,
the cost of developing multiple models is avoided. In contrast to previous
work, the number of qubits required, the measurement strategy, and the topology
of the circuits used are invariant to the number of classes. In addition, unlike
other architectures for multi-class quantum classifiers, the state
reconstruction of a single qubit yields sufficient information for multi-class
classification tasks. Both analytical results and numerical simulations show
that this classifier is not only effective when applied to diverse
classification problems but also robust to certain conditions of noise.
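The core primitive behind the classifier is the SWAP test, which estimates the squared overlap between two quantum states: an ancilla prepared in |+⟩ controls a SWAP between the two registers, and the probability of measuring the ancilla in outcome 0 is P0 = (1 + |⟨ψ|φ⟩|²)/2, so the overlap is recovered as 2·P0 − 1. The sketch below is a minimal classical simulation of that identity, not the paper's circuit; the `classify` helper and the one-hot label states are illustrative assumptions standing in for the paper's trained encoding and label states.

```python
import numpy as np

def swap_test_overlap(psi, phi, shots=None, rng=None):
    """Estimate |<psi|phi>|^2 as a SWAP test would.

    The ancilla's outcome-0 probability is P0 = (1 + |<psi|phi>|^2) / 2,
    so the overlap is recovered as 2*P0 - 1.
    """
    overlap = abs(np.vdot(psi, phi)) ** 2      # vdot conjugates psi
    p0 = 0.5 * (1.0 + overlap)                 # ancilla outcome-0 probability
    if shots is None:
        return 2.0 * p0 - 1.0                  # exact (infinite-shot) estimate
    rng = rng or np.random.default_rng(0)
    hits = rng.binomial(shots, p0)             # simulate finite sampling noise
    return 2.0 * hits / shots - 1.0

def classify(x, label_states):
    """Assign x to the class whose label state it overlaps most."""
    return max(range(len(label_states)),
               key=lambda c: swap_test_overlap(x, label_states[c]))

# Illustrative label states: one-hot basis vectors, one per class.
labels = [np.eye(3)[c] for c in range(3)]
x = np.array([0.9, 0.1, 0.0])
x = x / np.linalg.norm(x)
print(classify(x, labels))  # closest to label state 0
```

Note that a single overlap-per-class comparison replaces the k (One-vs-All) or k(k−1)/2 (One-vs-One) separately trained binary models that the heuristic strategies above would require.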
Related papers
- Generalization Bounds for Few-Shot Transfer Learning with Pretrained Classifiers [26.844410679685424]
We study the ability of foundation models to learn representations for classification that are transferable to new, unseen classes.
We show that the few-shot error of the learned feature map on new classes is small in case of class-feature-variability collapse.
arXiv Detail & Related papers (2022-12-23T18:46:05Z)
- Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
We use uncertainty for the anomaly detection approach.
arXiv Detail & Related papers (2022-12-23T00:50:41Z)
- Bi-directional Feature Reconstruction Network for Fine-Grained Few-Shot Image Classification [61.411869453639845]
We introduce a bi-reconstruction mechanism that can simultaneously accommodate for inter-class and intra-class variations.
This design effectively helps the model to explore more subtle and discriminative features.
Experimental results on three widely used fine-grained image classification datasets consistently show considerable improvements.
arXiv Detail & Related papers (2022-11-30T16:55:14Z)
- Multi-Label Quantification [78.83284164605473]
Quantification, variously called "labelled prevalence estimation" or "learning to quantify", is the supervised learning task of generating predictors of the relative frequencies of the classes of interest in unlabelled data samples.
We propose methods for inferring estimators of class prevalence values that strive to leverage the dependencies among the classes of interest in order to predict their relative frequencies more accurately.
arXiv Detail & Related papers (2022-11-15T11:29:59Z)
- Generalization for multiclass classification with overparameterized linear models [3.3434274586532515]
We show that multiclass classification behaves like binary classification in that, as long as there are not too many classes, it is possible to generalize well.
Besides various technical challenges, it turns out that the key difference from the binary classification setting is that there are relatively fewer positive training examples of each class in the multiclass setting as the number of classes increases.
arXiv Detail & Related papers (2022-06-03T05:52:43Z)
- Evolving Multi-Label Fuzzy Classifier [5.53329677986653]
Multi-label classification has attracted much attention in the machine learning community to address the problem of assigning single samples to more than one class at the same time.
We propose an evolving multi-label fuzzy classifier (EFC-ML) which is able to self-adapt and self-evolve its structure with new incoming multi-label samples in an incremental, single-pass manner.
arXiv Detail & Related papers (2022-03-29T08:01:03Z)
- Theoretical Insights Into Multiclass Classification: A High-dimensional Asymptotic View [82.80085730891126]
We provide the first modern, precise analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z)
- Few-Shot Learning with Intra-Class Knowledge Transfer [100.87659529592223]
We consider the few-shot classification task with an unbalanced dataset.
Recent works have proposed to solve this task by augmenting the training data of the few-shot classes using generative models.
We propose to leverage the intra-class knowledge from the neighbor many-shot classes with the intuition that neighbor classes share similar statistical information.
arXiv Detail & Related papers (2020-08-22T18:15:38Z)
- Making Use of NXt to Nothing: The Effect of Class Imbalances on DGA Detection Classifiers [3.0969191504482243]
It is unclear whether the inclusion of DGAs for which only a few samples are known to the training sets is beneficial or harmful to the overall performance of the classifiers.
In this paper, we perform a comprehensive analysis of various contextless DGA classifiers, which reveals the high value of a few training samples per class for both classification tasks.
arXiv Detail & Related papers (2020-07-01T07:51:12Z)
- Many-Class Few-Shot Learning on Multi-Granularity Class Hierarchy [57.68486382473194]
We study many-class few-shot (MCFS) problem in both supervised learning and meta-learning settings.
In this paper, we leverage the class hierarchy as a prior knowledge to train a coarse-to-fine classifier.
The model, "memory-augmented hierarchical-classification network (MahiNet)", performs coarse-to-fine classification where each coarse class can cover multiple fine classes.
arXiv Detail & Related papers (2020-06-28T01:11:34Z)
- Adversarial Multi-Binary Neural Network for Multi-class Classification [19.298875915675502]
We use a multi-task framework to address multi-class classification.
We employ adversarial training to distinguish the class-specific features and the class-agnostic features.
arXiv Detail & Related papers (2020-03-25T02:19:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and accepts no responsibility for any consequences arising from its use.