Robust subgaussian estimation with VC-dimension
- URL: http://arxiv.org/abs/2004.11734v3
- Date: Wed, 8 Jul 2020 16:16:13 GMT
- Title: Robust subgaussian estimation with VC-dimension
- Authors: Jules Depersin
- Abstract summary: This work proposes a new general way to bound the excess risk for MOM estimators.
The core technique is the use of VC-dimension (instead of Rademacher complexity) to measure the statistical complexity.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Median-of-means (MOM) based procedures provide non-asymptotic and strong
deviation bounds even when data are heavy-tailed and/or corrupted. This work
proposes a new general way to bound the excess risk for MOM estimators. The
core technique is the use of VC-dimension (instead of Rademacher complexity) to
measure the statistical complexity. In particular, this allows us to give the
first robust estimators for sparse estimation that achieve the so-called
subgaussian rate while assuming only a finite second moment for the uncorrupted
data. By comparison, previous works using Rademacher complexities required a
number of finite moments that grows logarithmically with the dimension. With
this technique, we derive new robust subgaussian bounds for mean estimation in
any norm. We also derive a new robust estimator for covariance estimation that
is the first to achieve subgaussian bounds without an $L_4$-$L_2$ norm
equivalence assumption.
Related papers
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z) - Leveraging Nested MLMC for Sequential Neural Posterior Estimation with
Intractable Likelihoods [0.8287206589886881]
SNPE techniques are proposed for dealing with simulation-based models with intractable likelihoods.
In this paper, we propose a nested APT method to estimate the involved nested expectation.
Since the nested estimators for the loss function and its gradient are biased, we make use of unbiased multi-level Monte Carlo (MLMC) estimators.
arXiv Detail & Related papers (2024-01-30T06:29:41Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Stochastic regularized majorization-minimization with weakly convex and
multi-convex surrogates [0.0]
We show that the first-order optimality gap of the proposed algorithm decays at an explicit rate for the expected loss under a dependent data setting.
We obtain the first convergence rate bounds for various methods in this nonconvex dependent data setting.
arXiv Detail & Related papers (2022-01-05T15:17:35Z) - Learning to Estimate Without Bias [57.82628598276623]
The Gauss-Markov theorem states that the weighted least squares estimator is the linear minimum variance unbiased estimator (MVUE) in linear models.
In this paper, we take a first step towards extending this result to nonlinear settings via deep learning with bias constraints.
A second motivation for BCE is in applications where multiple estimates of the same unknown are averaged for improved performance.
arXiv Detail & Related papers (2021-10-24T10:23:51Z) - Robust Algorithms for GMM Estimation: A Finite Sample Viewpoint [30.839245814393724]
A generic method for solving moment conditions is the Generalized Method of Moments (GMM).
We develop a GMM estimator that can tolerate a constant fraction of adversarially corrupted samples and has an $\ell_2$ recovery guarantee of $O(\sqrt{\epsilon})$.
Our algorithm and assumptions apply to instrumental-variables linear and logistic regression (a toy, non-robust sketch of solving moment conditions appears after this list).
arXiv Detail & Related papers (2021-10-06T21:06:22Z) - Model-based multi-parameter mapping [0.0]
Quantitative MR imaging is increasingly favoured for its richer information content and standardised measures.
Estimation procedures often assume noise-free measurements and use subsets of the data to solve for different quantities in isolation.
Instead, a generative model can be formulated and inverted to jointly recover parameter estimates.
arXiv Detail & Related papers (2021-02-02T17:00:11Z) - Outlier Robust Mean Estimation with Subgaussian Rates via Stability [46.03021473600576]
We study the problem of outlier-robust high-dimensional mean estimation.
We obtain the first computationally efficient outlier-robust mean estimators achieving subgaussian rates.
arXiv Detail & Related papers (2020-07-30T17:33:03Z) - Robust Compressed Sensing using Generative Models [98.64228459705859]
In this paper, we propose an algorithm inspired by the Median-of-Means (MOM).
Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers.
arXiv Detail & Related papers (2020-06-16T19:07:41Z) - $\gamma$-ABC: Outlier-Robust Approximate Bayesian Computation Based on a
Robust Divergence Estimator [95.71091446753414]
We propose to use a nearest-neighbor-based $\gamma$-divergence estimator as a data discrepancy measure.
Our method achieves significantly higher robustness than existing discrepancy measures.
arXiv Detail & Related papers (2020-06-13T06:09:27Z)
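As a companion to the GMM entry above (signposted there), the sketch below shows the plain, non-robust meaning of "solving moment conditions" in a just-identified instrumental-variables linear model, where the sample analogue of E[Z(y - X beta)] = 0 has a closed-form solution. This is a baseline illustration only, not the robust estimator of the cited paper; all variable names and the simulated data are made up for the example.

```python
import numpy as np

def gmm_iv(y, X, Z):
    """Just-identified GMM / IV estimator for the linear model y = X @ beta + e.

    Moment condition: E[Z * (y - X @ beta)] = 0. With as many instruments as
    regressors, the sample analogue (1/n) Z'(y - X beta) = 0 is solved exactly
    by beta = (Z'X)^{-1} Z'y; a GMM weighting matrix only matters in the
    over-identified case.
    """
    return np.linalg.solve(Z.T @ X, Z.T @ y)

# Toy usage: one endogenous regressor, one valid instrument.
rng = np.random.default_rng(0)
n = 5_000
z = rng.normal(size=n)                        # instrument
u = rng.normal(size=n)                        # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)          # regressor correlated with the error
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # true coefficient on x is 2.0
X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])
print("OLS   :", np.linalg.lstsq(X, y, rcond=None)[0])  # biased by the confounder
print("IV/GMM:", gmm_iv(y, X, Z))                        # close to [0, 2]
```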
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.