Nonparametric plug-in classifier for multiclass classification of S.D.E. paths
- URL: http://arxiv.org/abs/2212.10259v2
- Date: Wed, 27 Sep 2023 21:24:21 GMT
- Title: Nonparametric plug-in classifier for multiclass classification of S.D.E. paths
- Authors: Christophe Denis, Charlotte Dion-Blanc, Eddy Ella Mintsa and Viet-Chi Tran
- Abstract summary: We study the multiclass classification problem where the features come from the mixture of time-homogeneous diffusions.
Specifically, the classes are discriminated by their drift functions while the diffusion coefficient is common to all classes and unknown.
- Score: 2.1301560294088318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the multiclass classification problem where the features come from
the mixture of time-homogeneous diffusions. Specifically, the classes are
discriminated by their drift functions while the diffusion coefficient is
common to all classes and unknown. In this framework, we build a plug-in
classifier which relies on nonparametric estimators of the drift and diffusion
functions. We first establish the consistency of our classification procedure
under mild assumptions and then provide rates of convergence under different
sets of assumptions. Finally, a numerical study supports our theoretical findings.
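The plug-in construction described in the abstract can be illustrated numerically. The sketch below is a toy one-dimensional illustration, not the authors' exact procedure: it assumes paths observed on a regular time grid with a known unit diffusion coefficient, uses a Nadaraya-Watson kernel regression of increments as a stand-in for the paper's nonparametric drift estimators, and classifies a new path by maximising a discretised Girsanov-type log-likelihood score over the classes. All function names, bandwidths, and simulation settings here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_paths(drift, n_paths, n_steps=200, dt=0.01, x0=0.0, sigma=1.0):
    """Euler-Maruyama simulation of dX_t = b(X_t) dt + sigma dW_t."""
    X = np.empty((n_paths, n_steps + 1))
    X[:, 0] = x0
    for t in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        X[:, t + 1] = X[:, t] + drift(X[:, t]) * dt + sigma * dW
    return X

def nw_drift_estimator(paths, dt, bandwidth=0.3):
    """Nadaraya-Watson drift estimate: regress increments/dt on states."""
    x = paths[:, :-1].ravel()
    y = (paths[:, 1:] - paths[:, :-1]).ravel() / dt
    def b_hat(u):
        u = np.atleast_1d(u)
        w = np.exp(-0.5 * ((u[:, None] - x[None, :]) / bandwidth) ** 2)
        return (w @ y) / np.maximum(w.sum(axis=1), 1e-12)
    return b_hat

def plugin_score(path, b_hat, dt, sigma=1.0):
    """Discretised Girsanov-type log-likelihood score for one class."""
    x, dx = path[:-1], np.diff(path)
    b = b_hat(x)
    return np.sum(b * dx - 0.5 * b ** 2 * dt) / sigma ** 2

# Two classes discriminated by their drift, common (unit) diffusion coefficient.
drifts = [lambda x: -x, lambda x: 1.0 - x]
dt = 0.01
train = [simulate_paths(b, n_paths=50, dt=dt) for b in drifts]
b_hats = [nw_drift_estimator(X, dt) for X in train]

# Classify held-out paths by the class with maximal plug-in score.
test_sets = [simulate_paths(b, n_paths=20, dt=dt) for b in drifts]
correct = 0
for label, X in enumerate(test_sets):
    for path in X:
        scores = [plugin_score(path, bh, dt) for bh in b_hats]
        correct += int(np.argmax(scores) == label)
print(f"accuracy: {correct / 40:.2f}")
```

Because the two drifts pull the paths toward different stationary levels, the estimated scores separate the classes well on this toy example; in the paper's setting the diffusion coefficient is also unknown and estimated.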
Related papers
- Theory of Speciation Transitions in Diffusion Models with General Class Structure [5.939780039158003]
Diffusion models generate data by reversing a diffusion process, transforming noise into structured samples drawn from a target distribution. Recent theoretical work has shown that this backward dynamics can undergo sharp qualitative transitions, known as speciation transitions. We develop a general theory of speciation in diffusion models that applies to arbitrary target distributions admitting well-defined classes.
arXiv Detail & Related papers (2026-02-04T10:35:47Z)
- Plug-In Classification of Drift Functions in Diffusion Processes Using Neural Networks [10.520846698070818]
We study a supervised multiclass classification problem for diffusion processes, where each class is characterized by a distinct drift function and trajectories are observed at discrete times. We propose a neural network-based plug-in classifier that estimates the drift functions for each class from independent sample paths and assigns labels based on a Bayes-type decision rule.
arXiv Detail & Related papers (2026-02-02T20:48:01Z)
- Generative Classifiers Avoid Shortcut Solutions [84.23247217037134]
Discriminative approaches to classification often learn shortcuts that hold in-distribution but fail under minor distribution shift. We show that generative classifiers can avoid this issue by modeling all features, both core and spurious, instead of mainly spurious ones. We find that diffusion-based and autoregressive generative classifiers achieve state-of-the-art performance on five standard image and text distribution shift benchmarks.
arXiv Detail & Related papers (2025-12-31T18:31:46Z)
- Empirical risk minimization algorithm for multiclass classification of S.D.E. paths [2.3940819037450987]
We propose a classification algorithm that relies on the minimization of the L2 risk.
We establish rates of convergence for the resulting predictor.
A simulation study highlights the numerical performance of our classification algorithm.
arXiv Detail & Related papers (2025-03-18T09:06:19Z)
- Studying Classifier(-Free) Guidance From a Classifier-Centric Perspective [100.54185280153753]
We find that both classifier guidance and classifier-free guidance achieve conditional generation by pushing the denoising diffusion trajectories away from decision boundaries.
We propose a generic postprocessing step built upon flow-matching to shrink the gap between the learned distribution for a pretrained denoising diffusion model and the real data distribution.
arXiv Detail & Related papers (2025-03-13T17:59:59Z)
- Accelerating Convergence in Bayesian Few-Shot Classification [3.819329978428786]
This paper seamlessly integrates mirror descent-based variational inference into Gaussian process-based few-shot classification.
By leveraging non-Euclidean geometry, mirror descent achieves accelerated convergence by providing the steepest descent direction along the corresponding manifold.
arXiv Detail & Related papers (2024-05-02T17:37:39Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instillation of task-specific information into the score function to steer the sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Evidential Uncertainty Quantification: A Variance-Based Perspective [0.43536523356694407]
We adapt the variance-based approach from regression to classification, quantifying classification uncertainty at the class level.
Experiments on cross-domain datasets illustrate that the variance-based approach achieves accuracy comparable to the entropy-based one in active domain adaptation.
arXiv Detail & Related papers (2023-11-19T16:33:42Z)
- Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
We use uncertainty for the anomaly detection approach.
arXiv Detail & Related papers (2022-12-23T00:50:41Z)
- Parametric Classification for Generalized Category Discovery: A Baseline Study [70.73212959385387]
Generalized Category Discovery (GCD) aims to discover novel categories in unlabelled datasets using knowledge learned from labelled samples.
We investigate the failure of parametric classifiers, verify the effectiveness of previous design choices when high-quality supervision is available, and identify unreliable pseudo-labels as a key problem.
We propose a simple yet effective parametric classification method that benefits from entropy regularisation, achieves state-of-the-art performance on multiple GCD benchmarks and shows strong robustness to unknown class numbers.
arXiv Detail & Related papers (2022-11-21T18:47:11Z)
- Soft-margin classification of object manifolds [0.0]
A neural population responding to multiple appearances of a single object defines a manifold in the neural response space.
The ability to classify such manifolds is of interest, as object recognition and other computational tasks require a response that is insensitive to variability within a manifold.
Soft-margin classifiers are a larger class of algorithms and provide an additional regularization parameter used in applications to optimize performance outside the training set.
arXiv Detail & Related papers (2022-03-14T12:23:36Z)
- When in Doubt: Improving Classification Performance with Alternating Normalization [57.39356691967766]
We introduce Classification with Alternating Normalization (CAN), a non-parametric post-processing step for classification.
CAN improves classification accuracy for challenging examples by re-adjusting their predicted class probability distribution.
We empirically demonstrate its effectiveness across a diverse set of classification tasks.
arXiv Detail & Related papers (2021-09-28T02:55:42Z)
- Intra-Class Uncertainty Loss Function for Classification [6.523198497365588]
Intra-class uncertainty/variability is often not considered, especially for datasets containing unbalanced classes.
In our framework, the features extracted by deep networks for each class are characterized by an independent Gaussian distribution.
The proposed approach shows improved classification performance, through learning a better class representation.
arXiv Detail & Related papers (2021-04-12T09:02:41Z)
- Theoretical Insights Into Multiclass Classification: A High-dimensional Asymptotic View [82.80085730891126]
We provide the first precise high-dimensional asymptotic analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z)
- Variational Feature Disentangling for Fine-Grained Few-Shot Classification [30.350307891161865]
Fine-grained few-shot recognition often suffers from the problem of training data scarcity for novel categories.
In this paper, we focus on enlarging the intra-class variance of the unseen class to improve few-shot classification performance.
arXiv Detail & Related papers (2020-10-07T08:13:42Z)
- Saliency-based Weighted Multi-label Linear Discriminant Analysis [101.12909759844946]
We propose a new variant of Linear Discriminant Analysis (LDA) to solve multi-label classification tasks.
The proposed method is based on a probabilistic model for defining the weights of individual samples.
The Saliency-based weighted Multi-label LDA approach is shown to lead to performance improvements in various multi-label classification problems.
arXiv Detail & Related papers (2020-04-08T19:40:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.