A method to integrate and classify normal distributions
- URL: http://arxiv.org/abs/2012.14331v9
- Date: Thu, 29 Jun 2023 22:26:16 GMT
- Title: A method to integrate and classify normal distributions
- Authors: Abhranil Das and Wilson S Geisler
- Abstract summary: We present mathematical results and open-source software that provide the probability in any domain of a normal distribution in any number of dimensions with any parameters.
We demonstrate these tools with vision research applications of detecting occluding objects in natural scenes, and detecting camouflage.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Univariate and multivariate normal probability distributions are widely used
when modeling decisions under uncertainty. Computing the performance of such
models requires integrating these distributions over specific domains, which
can vary widely across models. Besides some special cases, there exist no
general analytical expressions, standard numerical methods or software for
these integrals. Here we present mathematical results and open-source software
that provide (i) the probability in any domain of a normal in any dimensions
with any parameters, (ii) the probability density, cumulative distribution, and
inverse cumulative distribution of any function of a normal vector, (iii) the
classification errors among any number of normal distributions, the
Bayes-optimal discriminability index and relation to the operating
characteristic, (iv) dimension reduction and visualizations for such problems,
and (v) tests for how reliably these methods may be used on given data. We
demonstrate these tools with vision research applications of detecting
occluding objects in natural scenes, and detecting camouflage.
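As a rough illustration of items (i) and (iii) above, the sketch below uses standard NumPy/SciPy routines rather than the authors' open-source toolbox: it estimates the probability of an arbitrary domain under a multivariate normal by plain Monte Carlo, and computes the Bayes-optimal discriminability index for two equal-covariance normals, where it reduces to the Mahalanobis distance between the means. The domain, means, and covariance used here are illustrative assumptions.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Illustrative 3-D normal; the parameters are arbitrary assumptions,
    # and this plain Monte Carlo sketch is not the authors' toolbox.
    mu = np.array([0.0, 1.0, -1.0])
    cov = np.array([[2.0, 0.3, 0.0],
                    [0.3, 1.0, 0.2],
                    [0.0, 0.2, 1.5]])

    # (i) Probability of an arbitrary domain, here {x : x0^2 + x1*x2 < 1},
    # estimated as the fraction of normal samples that fall inside it.
    samples = rng.multivariate_normal(mu, cov, size=200_000)
    in_domain = samples[:, 0] ** 2 + samples[:, 1] * samples[:, 2] < 1.0
    p_domain = in_domain.mean()

    # (iii) For two normals with equal covariance, the Bayes-optimal
    # discriminability index reduces to the Mahalanobis distance between the
    # means, and the equal-prior Bayes error rate is Phi(-d'/2).
    mu_a, mu_b = np.zeros(3), np.array([1.0, 0.5, 0.0])
    d_prime = float(np.sqrt((mu_b - mu_a) @ np.linalg.solve(cov, mu_b - mu_a)))
    p_error = stats.norm.cdf(-d_prime / 2)

    print(f"P(domain) ~ {p_domain:.4f}, d' = {d_prime:.3f}, Bayes error = {p_error:.4f}")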
Related papers
- GLAD: Towards Better Reconstruction with Global and Local Adaptive Diffusion Models for Unsupervised Anomaly Detection [60.78684630040313]
Diffusion models tend to reconstruct normal counterparts of test images with certain noise added.
From the global perspective, the difficulty of reconstructing images with different anomalies is uneven.
We propose a global and local adaptive diffusion model (abbreviated to GLAD) for unsupervised anomaly detection.
arXiv Detail & Related papers (2024-06-11T17:27:23Z)
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Anomaly Detection Under Uncertainty Using Distributionally Robust Optimization Approach [0.9217021281095907]
Anomaly detection is defined as the problem of finding data points that do not follow the patterns of the majority.
The one-class Support Vector Machines (SVM) method aims to find a decision boundary to distinguish between normal data points and anomalies; a minimal illustration of this baseline follows this entry.
A distributionally robust chance-constrained model is proposed in which the probability of misclassification is low.
arXiv Detail & Related papers (2023-12-03T06:13:22Z)
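As a minimal companion to the one-class SVM baseline mentioned in the entry above, the sketch below fits scikit-learn's OneClassSVM to toy data; it is a generic illustration, not the paper's distributionally robust chance-constrained model, and the data and hyperparameters are arbitrary assumptions.

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)

    # Toy data: "normal" points from a 2-D Gaussian plus scattered anomalies.
    # Everything here is an illustrative assumption, not the paper's setup.
    normal_points = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
    anomalies = rng.uniform(low=-6.0, high=6.0, size=(20, 2))

    # Fit the one-class SVM on normal data only; nu upper-bounds the fraction
    # of training points treated as outliers.
    clf = OneClassSVM(kernel="rbf", nu=0.05, gamma=0.2).fit(normal_points)

    # predict() returns +1 inside the learned boundary and -1 outside.
    flagged = (clf.predict(anomalies) == -1).sum()
    print(f"flagged {flagged} of {len(anomalies)} injected anomalies")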
- A Heavy-Tailed Algebra for Probabilistic Programming [53.32246823168763]
We propose a systematic approach for analyzing the tails of random variables.
We show how this approach can be used during the static analysis (before drawing samples) pass of a probabilistic programming language compiler.
Our empirical results confirm that inference algorithms that leverage our heavy-tailed algebra attain superior performance across a number of density modeling and variational inference tasks.
arXiv Detail & Related papers (2023-06-15T16:37:36Z)
- Characteristic Function of the Tsallis $q$-Gaussian and Its Applications in Measurement and Metrology [0.0]
The Tsallis $q$-Gaussian distribution is a powerful generalization of the standard Gaussian distribution.
This paper presents the characteristic function of a linear combination of independent $q$-Gaussian random variables.
It provides an alternative computational procedure to the Monte Carlo method for uncertainty analysis; a generic sketch of the characteristic-function route follows this entry.
arXiv Detail & Related papers (2023-03-15T13:42:35Z)
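The entry above takes the characteristic-function route to uncertainty propagation. The sketch below shows the generic version of that route, using ordinary Gaussian characteristic functions as stand-ins (the Tsallis $q$-Gaussian characteristic function from the paper is not reproduced here): the characteristic function of a linear combination of independent variables is the product of the individual characteristic functions, and the density is recovered by numerical Fourier inversion. Grids and parameters are illustrative assumptions.

    import numpy as np
    from scipy import stats

    # Characteristic function of N(mu, sigma^2); used here as a stand-in for
    # the q-Gaussian characteristic function derived in the paper.
    def cf_normal(t, mu, sigma):
        return np.exp(1j * mu * t - 0.5 * (sigma * t) ** 2)

    # For Y = sum_k a_k * X_k with independent X_k: phi_Y(t) = prod_k phi_k(a_k * t).
    def cf_linear_combination(t, coeffs, mus, sigmas):
        phi = np.ones_like(t, dtype=complex)
        for a, mu, sigma in zip(coeffs, mus, sigmas):
            phi = phi * cf_normal(a * t, mu, sigma)
        return phi

    # Numerical Fourier inversion: f(x) = (1 / (2*pi)) * integral of exp(-i*t*x) * phi_Y(t) dt.
    t = np.linspace(-40.0, 40.0, 4001)
    dt = t[1] - t[0]
    phi = cf_linear_combination(t, coeffs=[1.0, -0.5], mus=[0.0, 1.0], sigmas=[1.0, 2.0])

    x = np.linspace(-6.0, 6.0, 25)
    pdf = np.array([np.sum(np.exp(-1j * t * xi) * phi).real * dt / (2 * np.pi) for xi in x])

    # Sanity check: Y = X1 - 0.5*X2 with X1 ~ N(0,1), X2 ~ N(1,4) is N(-0.5, 2).
    exact = stats.norm.pdf(x, loc=-0.5, scale=np.sqrt(2.0))
    print("max abs error:", np.max(np.abs(pdf - exact)))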
- Evidential Softmax for Sparse Multimodal Distributions in Deep Generative Models [38.26333732364642]
We present $\textit{ev-softmax}$, a sparse normalization function that preserves the multimodality of probability distributions.
We evaluate our method on a variety of generative models, including variational autoencoders and auto-regressive architectures.
arXiv Detail & Related papers (2021-10-27T05:32:25Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to sample effectively from these models; a generic sketch of this fit-then-sample recipe follows this entry.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
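The two-step recipe in the entry above (first model the probability distribution, then sample from the model) can be illustrated generically. The sketch below substitutes a Gaussian kernel density estimate for the paper's PSD models; the bimodal target and sample sizes are illustrative assumptions.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Step 1: model the distribution. A Gaussian KDE is fitted to draws from an
    # assumed bimodal target; the paper's PSD models would replace this step.
    target_draws = np.concatenate([rng.normal(-2.0, 0.5, 1000),
                                   rng.normal(1.5, 1.0, 1000)])
    kde = stats.gaussian_kde(target_draws)

    # Step 2: sample from the fitted model instead of the original target.
    new_samples = kde.resample(5000).ravel()

    print("model mean vs. data mean:", new_samples.mean(), target_draws.mean())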
- Interaction Models and Generalized Score Matching for Compositional Data [9.797319790710713]
We propose a class of exponential family models that accommodate general patterns of pairwise interaction while being supported on the probability simplex.
Special cases include the family of Dirichlet distributions as well as Aitchison's additive logistic normal distributions.
A high-dimensional analysis of our estimation methods shows that the simplex domain is handled as efficiently as previously studied full-dimensional domains.
arXiv Detail & Related papers (2021-09-10T05:29:41Z)
- Probabilistic Kolmogorov-Arnold Network [1.4732811715354455]
The present paper proposes a method for estimating probability distributions of the outputs in the case of aleatoric uncertainty.
The suggested approach covers input-dependent probability distributions of the outputs, as well as the variation of the distribution type with the inputs.
Although the method is applicable to any regression model, the present paper combines it with KANs, since the specific structure of KANs leads to computationally efficient model construction.
arXiv Detail & Related papers (2021-04-04T23:49:15Z)
- Accounting for Unobserved Confounding in Domain Generalization [107.0464488046289]
This paper investigates the problem of learning robust, generalizable prediction models from a combination of datasets.
Part of the challenge of learning robust models lies in the influence of unobserved confounders.
We demonstrate the empirical performance of our approach on healthcare data from different modalities.
arXiv Detail & Related papers (2020-07-21T08:18:06Z)
- Good Classifiers are Abundant in the Interpolating Regime [64.72044662855612]
We develop a methodology to compute precisely the full distribution of test errors among interpolating classifiers.
We find that test errors tend to concentrate around a small typical value $\varepsilon^*$, which deviates substantially from the test error of the worst-case interpolating model.
Our results show that the usual style of analysis in statistical learning theory may not be fine-grained enough to capture the good generalization performance observed in practice.
arXiv Detail & Related papers (2020-06-22T21:12:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.