On the Theoretical Equivalence of Several Trade-Off Curves Assessing
Statistical Proximity
- URL: http://arxiv.org/abs/2006.11809v3
- Date: Thu, 13 Oct 2022 13:32:19 GMT
- Title: On the Theoretical Equivalence of Several Trade-Off Curves Assessing
Statistical Proximity
- Authors: Rodrigue Siry and Ryan Webster and Loic Simon and Julien Rabin
- Abstract summary: We propose a unification of four curves known respectively as: the precision-recall (PR) curve, the Lorenz curve, the receiver operating characteristic (ROC) curve, and a special case of Rényi divergence frontiers.
In addition, we discuss possible links between PR/Lorenz curves and the derivation of domain adaptation bounds.
- Score: 4.626261940793027
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recent advent of powerful generative models has triggered the renewed
development of quantitative measures to assess the proximity of two probability
distributions. As the scalar Fréchet inception distance remains popular,
several methods have explored computing entire curves, which reveal the
trade-off between the fidelity and variability of the first distribution with
respect to the second one. Several such variants have been proposed
independently and, while intuitively similar, their relationship has not yet
been made explicit. In an effort to make the emerging picture of generative
evaluation clearer, we propose a unification of four curves known
respectively as: the precision-recall (PR) curve, the Lorenz curve, the
receiver operating characteristic (ROC) curve, and a special case of Rényi
divergence frontiers. In addition, we discuss possible links between PR/Lorenz
curves and the derivation of domain adaptation bounds.
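To make the trade-off concrete, here is a minimal numerical sketch (not code from the paper) that computes two of these curves for a pair of discrete distributions: the precision-recall curve in the sense of Sajjadi et al. (2018), and the ROC curve of the optimal likelihood-ratio test between the two distributions. The distributions P and Q and the lambda grid below are illustrative choices, not taken from the paper.

```python
# Hedged sketch: PR curve and optimal-test ROC curve for two discrete
# distributions P ("real") and Q ("model") on the same finite support.
import numpy as np

P = np.array([0.50, 0.30, 0.15, 0.05])   # reference ("real") distribution
Q = np.array([0.25, 0.25, 0.25, 0.25])   # model distribution, same support

# PR curve (Sajjadi et al. 2018): alpha(lam) = sum_i min(lam * P_i, Q_i) is
# the precision and beta(lam) = alpha(lam) / lam the recall, for lam in (0, inf).
lams = np.exp(np.linspace(-6.0, 6.0, 401))
alpha = np.array([np.minimum(lam * P, Q).sum() for lam in lams])
beta = alpha / lams

# ROC curve of the optimal test for "sample drawn from Q" vs "drawn from P":
# accept atoms in decreasing order of the likelihood ratio Q_i / P_i and track
# the accepted mass under Q (true positive rate) and under P (false positive rate).
order = np.argsort(-(Q / P))
tpr = np.concatenate([[0.0], np.cumsum(Q[order])])
fpr = np.concatenate([[0.0], np.cumsum(P[order])])

# At lam = 1 both precision and recall equal 1 - TV(P, Q).
print("1 - TV(P, Q):", np.minimum(P, Q).sum())
print("sample PR points:", list(zip(alpha[::100].round(3), beta[::100].round(3))))
print("ROC vertices (fpr, tpr):", list(zip(fpr.round(3), tpr.round(3))))
```

The sketch only lets one compare the two curves numerically on a toy example; how exactly the PR, Lorenz, ROC and Rényi-frontier curves determine one another is what the paper itself establishes.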
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Discriminative Estimation of Total Variation Distance: A Fidelity Auditor for Generative Data [10.678533056953784]
We propose a discriminative approach to estimate the total variation (TV) distance between two distributions.
Our method quantitatively characterizes the relation between the Bayes risk in classifying two distributions and their TV distance.
We demonstrate that, with a specific choice of hypothesis class in classification, a fast convergence rate in estimating the TV distance can be achieved (see the sketch after this list).
arXiv Detail & Related papers (2024-05-24T08:18:09Z) - Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
arXiv Detail & Related papers (2024-02-13T09:40:19Z) - A U-turn on Double Descent: Rethinking Parameter Counting in Statistical
Learning [68.76846801719095]
We examine when and where double descent appears, and show that its location is not inherently tied to the interpolation threshold p = n.
This provides a resolution to tensions between double descent and statistical intuition.
arXiv Detail & Related papers (2023-10-29T12:05:39Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - A numerical approximation method for the Fisher-Rao distance between
multivariate normal distributions [12.729120803225065]
We discretize curves joining normal distributions and approximate Rao's distances between successive nearby normal distributions on the curves by the square root of the Jeffreys divergence (see the sketch after this list).
We report on our experiments and assess the quality of our approximation technique by comparing the numerical approximations with both lower and upper bounds.
arXiv Detail & Related papers (2023-02-16T09:44:55Z) - Function-space regularized Rényi divergences [6.221019624345409]
We propose a new family of regularized Rényi divergences parametrized by a variational function space.
We prove several properties of these new divergences, showing that they interpolate between the classical Rényi divergences and IPMs.
We show that the proposed regularized R'enyi divergences inherit features from IPMs such as the ability to compare distributions that are not absolutely continuous.
arXiv Detail & Related papers (2022-10-10T19:18:04Z) - A data-driven approach for the closure of RANS models by the divergence
of the Reynolds Stress Tensor [0.0]
A new data-driven model is proposed to close the RANS equations and increase their accuracy.
The choice is driven by the presence of the divergence of the Reynolds stress tensor (RST) in the RANS equations.
Once this data-driven approach is trained, there is no need to run any turbulence model to close the equations.
arXiv Detail & Related papers (2022-03-31T11:08:54Z) - The Interplay Between Implicit Bias and Benign Overfitting in Two-Layer
Linear Networks [51.1848572349154]
Neural network models that perfectly fit noisy data can generalize well to unseen test data.
We consider interpolating two-layer linear neural networks trained with gradient flow on the squared loss and derive bounds on the excess risk.
arXiv Detail & Related papers (2021-08-25T22:01:01Z) - Divergence Frontiers for Generative Models: Sample Complexity,
Quantization Level, and Frontier Integral [58.434753643798224]
Divergence frontiers have been proposed as an evaluation framework for generative models.
We establish non-asymptotic bounds on the sample complexity of the plug-in estimator of divergence frontiers.
We also augment the divergence frontier framework by investigating the statistical performance of smoothed distribution estimators.
arXiv Detail & Related papers (2021-06-15T06:26:25Z)
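For the discriminative TV estimator referenced above ("see the sketch after this list"): with balanced classes, the Bayes risk R* of distinguishing P from Q satisfies TV(P, Q) = 1 - 2 R*, so the held-out error of any trained classifier yields a (downward-biased) plug-in estimate of TV. The Gaussian toy distributions, the logistic-regression classifier, and the sample sizes below are illustrative assumptions, not details from that paper.

```python
# Hedged sketch: estimate TV(P, Q) from the held-out error of a classifier
# trained to distinguish samples of P from samples of Q (balanced classes).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000
X_p = rng.normal(loc=0.0, scale=1.0, size=(n, 2))   # samples from P
X_q = rng.normal(loc=1.0, scale=1.0, size=(n, 2))   # samples from Q
X = np.vstack([X_p, X_q])
y = np.concatenate([np.zeros(n), np.ones(n)])        # 0 = P, 1 = Q

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
err = 1.0 - clf.score(X_te, y_te)                    # held-out misclassification rate
tv_hat = 1.0 - 2.0 * err
# For this toy pair the closed-form TV is 2*Phi(sqrt(2)/2) - 1, roughly 0.52.
print(f"estimated TV(P, Q) ~ {tv_hat:.3f}")
```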
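For the Fisher-Rao approximation entry referenced above: the idea is to discretize a curve joining two multivariate normals and to sum the square roots of the Jeffreys divergences between successive nearby normals, each available in closed form. The straight-line interpolation of means and covariances used below is only an illustrative choice of curve, not the discretization scheme of that paper.

```python
# Hedged sketch: approximate the Fisher-Rao distance between two multivariate
# normals by summing sqrt(Jeffreys divergence) along a discretized curve.
import numpy as np

def kl_gauss(m0, S0, m1, S1):
    """KL(N(m0, S0) || N(m1, S1)) in closed form."""
    d = len(m0)
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def rao_length_estimate(m0, S0, m1, S1, steps=200):
    """Sum sqrt(Jeffreys) between successive normals along a discretized curve."""
    ts = np.linspace(0.0, 1.0, steps + 1)
    # Illustrative curve: linear interpolation of means and covariances.
    means = [(1 - t) * m0 + t * m1 for t in ts]
    covs = [(1 - t) * S0 + t * S1 for t in ts]
    total = 0.0
    for (ma, Sa), (mb, Sb) in zip(zip(means, covs), zip(means[1:], covs[1:])):
        jeffreys = kl_gauss(ma, Sa, mb, Sb) + kl_gauss(mb, Sb, ma, Sa)
        total += np.sqrt(max(jeffreys, 0.0))
    return total

m0, S0 = np.zeros(2), np.eye(2)
m1, S1 = np.array([2.0, 1.0]), np.array([[2.0, 0.3], [0.3, 0.5]])
print("approximate Fisher-Rao length along the chosen curve:",
      rao_length_estimate(m0, S0, m1, S1))
```

Refining `steps` makes the sum converge to the length of the chosen curve; the quality of the resulting Fisher-Rao estimate then depends on how close that curve is to a geodesic, which is what that paper's lower and upper bounds assess.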
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.