Sharper convergence bounds of Monte Carlo Rademacher Averages through
Self-Bounding functions
- URL: http://arxiv.org/abs/2010.12103v2
- Date: Sat, 16 Jan 2021 19:53:02 GMT
- Title: Sharper convergence bounds of Monte Carlo Rademacher Averages through
Self-Bounding functions
- Authors: Leonardo Pellegrina
- Abstract summary: We derive sharper probabilistic concentration bounds for the Monte Carlo Empirical Rademacher Averages.
These new results yield sharper bounds for (Local) Rademacher Averages.
- Score: 4.518012967046983
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We derive sharper probabilistic concentration bounds for the Monte Carlo
Empirical Rademacher Averages (MCERA), which are proved through recent results
on the concentration of self-bounding functions. Our novel bounds are
characterized by convergence rates that depend on data-dependent characteristic
quantities of the set of functions under consideration, such as the empirical
wimpy variance, an essential improvement over standard bounds based on the
method of bounded differences. For this reason, our new results can be applied
to obtain sharper bounds for (Local) Rademacher Averages. We also derive novel
variance-dependent bounds for the special case where only one vector of
Rademacher random variables is used to compute the MCERA, through the
application of Bousquet's inequality and novel data-dependent bounds on the
wimpy variance. Finally, we leverage the framework of self-bounding functions
to derive novel probabilistic bounds on the supremum deviations, which may be
of independent interest.
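To make the quantities in the abstract concrete, here is a minimal sketch of how the n-MCERA and an empirical wimpy variance can be computed for a finite family of functions. The sample values, family size, and helper names below are illustrative assumptions, not code or definitions taken verbatim from the paper: the n-MCERA averages, over n independent Rademacher vectors, the supremum over the family of the empirical correlation between the vector and the function values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a finite family of functions evaluated on a sample of
# size m, stored as a matrix V with V[f, i] = f(x_i); values lie in [-1, 1].
num_functions, m = 50, 200
V = rng.uniform(-1.0, 1.0, size=(num_functions, m))


def mcera(V, n, rng):
    """n-MCERA: the average over n Rademacher vectors sigma_j of
    sup_f (1/m) * sum_i sigma_{j,i} * f(x_i)."""
    _, m = V.shape
    sigma = rng.choice([-1.0, 1.0], size=(n, m))  # n independent Rademacher vectors
    # correlations[j, f] = (1/m) * sum_i sigma_{j,i} * f(x_i)
    correlations = (sigma @ V.T) / m
    return correlations.max(axis=1).mean()


def empirical_wimpy_variance(V):
    """One common empirical form: sup_f (1/m) * sum_i f(x_i)^2."""
    return (V ** 2).mean(axis=1).max()


print("1-MCERA:", mcera(V, n=1, rng=rng))
print("n-MCERA (n=25):", mcera(V, n=25, rng=rng))
print("empirical wimpy variance:", empirical_wimpy_variance(V))
```

The concentration bounds discussed in the abstract quantify how close such an n-MCERA is to the Rademacher average of the family; the empirical wimpy variance is the data-dependent quantity that drives the sharper convergence rates.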
Related papers
- Variance-Reducing Couplings for Random Features [57.73648780299374]
Random features (RFs) are a popular technique to scale up kernel methods in machine learning.
We find couplings to improve RFs defined on both Euclidean and discrete input spaces.
We reach surprising conclusions about the benefits and limitations of variance reduction as a paradigm.
arXiv Detail & Related papers (2024-05-26T12:25:09Z)
- A distribution-free valid p-value for finite samples of bounded random variables [0.0]
We build a valid p-value based on a concentration inequality for bounded random variables introduced by Pelekis, Ramon and Wang.
The motivation behind this work is the calibration of predictive algorithms in a distribution-free setting.
The ideas presented in this work are also relevant in classical statistical inference.
arXiv Detail & Related papers (2024-05-14T22:01:04Z)
- Distribution Estimation under the Infinity Norm [19.997465098927858]
We present novel bounds for estimating discrete probability distributions under the $\ell_\infty$ norm.
Our data-dependent convergence guarantees for the maximum likelihood estimator significantly improve upon the currently known results (see the short sketch after this list for a minimal illustration of this estimation setting).
arXiv Detail & Related papers (2024-02-13T12:49:50Z)
- Exact Non-Oblivious Performance of Rademacher Random Embeddings [79.28094304325116]
This paper revisits the performance of Rademacher random projections.
It establishes novel statistical guarantees that are numerically sharp and non-oblivious with respect to the input data.
arXiv Detail & Related papers (2023-03-21T11:45:27Z)
- Function-space regularized Rényi divergences [6.221019624345409]
We propose a new family of regularized Rényi divergences parametrized by a variational function space.
We prove several properties of these new divergences, showing that they interpolate between the classical Rényi divergences and IPMs.
We show that the proposed regularized Rényi divergences inherit features from IPMs, such as the ability to compare distributions that are not absolutely continuous.
arXiv Detail & Related papers (2022-10-10T19:18:04Z)
- Data-Driven Influence Functions for Optimization-Based Causal Inference [105.5385525290466]
We study a constructive algorithm that approximates Gateaux derivatives for statistical functionals by finite differencing.
We study the case where probability distributions are not known a priori but need to be estimated from data.
arXiv Detail & Related papers (2022-08-29T16:16:22Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- A Unified Joint Maximum Mean Discrepancy for Domain Adaptation [73.44809425486767]
This paper theoretically derives a unified form of JMMD that is easy to optimize.
From this unified form, we show that JMMD degrades the feature-label dependence that benefits classification.
We propose a novel MMD matrix to promote the dependence, and devise a novel label kernel that is robust to label distribution shift.
arXiv Detail & Related papers (2021-01-25T09:46:14Z)
- Relative Deviation Margin Bounds [55.22251993239944]
We give two types of learning bounds, both distribution-dependent and valid for general families, in terms of the Rademacher complexity.
We derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment.
arXiv Detail & Related papers (2020-06-26T12:37:17Z)
- Optimal Bounds between $f$-Divergences and Integral Probability Metrics [8.401473551081748]
Families of $f$-divergences and Integral Probability Metrics are widely used to quantify similarity between probability distributions.
We systematically study the relationship between these two families from the perspective of convex duality.
We obtain new bounds while also recovering in a unified manner well-known results, such as Hoeffding's lemma.
arXiv Detail & Related papers (2020-06-10T17:39:11Z)
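As a brief aside on the "Distribution Estimation under the Infinity Norm" entry above, the hedged sketch below illustrates the estimation setting that paper studies: for a discrete distribution, the maximum likelihood estimator is the vector of empirical frequencies, and the $\ell_\infty$ error is the largest per-symbol deviation. The particular distribution and sample size are made-up illustrative values, and the snippet does not implement that paper's bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true discrete distribution over k symbols.
p = np.array([0.5, 0.3, 0.15, 0.05])
k = len(p)

# Draw an i.i.d. sample; the maximum likelihood estimator of a discrete
# distribution is simply the vector of empirical frequencies.
n = 1000
sample = rng.choice(k, size=n, p=p)
p_hat = np.bincount(sample, minlength=k) / n

# ell_infinity estimation error: the largest absolute deviation over symbols.
err = np.max(np.abs(p_hat - p))
print("estimate:", p_hat, "ell_inf error:", err)
```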