Adversarial robust weighted Huber regression
- URL: http://arxiv.org/abs/2102.11120v4
- Date: Fri, 24 May 2024 07:32:53 GMT
- Title: Adversarial robust weighted Huber regression
- Authors: Takeyuki Sasai, Hironori Fujisawa
- Abstract summary: We consider robust estimation of linear regression coefficients.
We derive an estimation error bound, which depends on the stable rank and the condition number of the covariance matrix.
- Score: 2.0257616108612373
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider robust estimation of linear regression coefficients. In this note, we focus on the case where the covariates are sampled from an $L$-subGaussian distribution with unknown covariance, the noise is sampled from a distribution with a bounded absolute moment, and both covariates and noise may be contaminated by an adversary. We derive an estimation error bound that depends on the stable rank and the condition number of the covariance matrix of the covariates, and our estimator can be computed with polynomial computational complexity.
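For orientation, the sketch below fits a weighted Huber regression by iteratively reweighted least squares (IRLS). It is a minimal baseline under illustrative assumptions (uniform weights `w`, the standard threshold `delta`, simulated data), not the adversarially robust, unknown-covariance estimator the paper constructs.

```python
import numpy as np

def weighted_huber_regression(X, y, w, delta=1.345, n_iter=50, tol=1e-8):
    """Minimize sum_i w_i * huber_delta(y_i - x_i @ beta) via IRLS.

    Generic sketch only: the paper's estimator additionally handles
    adversarial contamination and unknown covariance."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        r = y - X @ beta
        # Huber's psi(r)/r: 1 inside [-delta, delta], delta/|r| outside.
        s = np.where(np.abs(r) <= delta, 1.0,
                     delta / np.maximum(np.abs(r), 1e-12))
        W = w * s
        beta_new = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
beta_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta_true + rng.standard_t(df=2, size=200)  # heavy-tailed noise
y[:10] += 50.0                                      # a few gross outliers
w = np.ones(200)                                    # uniform weights for the sketch
print(weighted_huber_regression(X, y, w))
```

Huber's loss is quadratic for small residuals and linear for large ones, which is why each IRLS step downweights the gross outliers.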
Related papers
- Risk and cross validation in ridge regression with correlated samples [72.59731158970894]
We provide estimates of the in- and out-of-sample risks of ridge regression when the data points have arbitrary correlations.
We further extend our analysis to the case where the test point has non-trivial correlations with the training set, a setting often encountered in time series forecasting.
We validate our theory across a variety of high dimensional data.
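For context, the in- and out-of-sample risk quantities above are what the classical leave-one-out shortcut for ridge regression computes exactly under i.i.d. sampling. The snippet below implements that standard identity (residual divided by one minus the hat-matrix leverage); it is not the paper's correlated-sample theory.

```python
import numpy as np

def ridge_loo_risk(X, y, lam):
    """Exact leave-one-out squared error for ridge regression via the
    hat-matrix shortcut e_i / (1 - H_ii); valid for i.i.d.-style LOO,
    not the correlated-sample setting analyzed in the paper."""
    n, d = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
    resid = y - H @ y
    loo = resid / (1.0 - np.diag(H))
    return np.mean(loo ** 2)

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(100)
for lam in (0.01, 0.1, 1.0, 10.0):
    print(lam, ridge_loo_risk(X, y, lam))
```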
arXiv Detail & Related papers (2024-08-08T17:27:29Z)
- Sparse Linear Regression when Noises and Covariates are Heavy-Tailed and Contaminated by Outliers [2.0257616108612373]
We investigate the problem of estimating linear regression coefficients under a sparsity assumption.
We consider the situation where covariates and noises are not only sampled from heavy-tailed distributions but also contaminated by outliers.
Our estimators can be computed efficiently, and exhibit sharp error bounds.
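As a hedged illustration of the problem setup (not the authors' estimator), a simple robust-sparse baseline combines a Huber loss with an $\ell_1$ penalty, solved by proximal gradient descent; the step size and penalty level below are illustrative choices.

```python
import numpy as np

def huber_grad(r, delta):
    """Derivative of the Huber loss with respect to the residual r."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def sparse_huber(X, y, lam, delta=1.345, n_iter=500):
    """Proximal gradient (ISTA) on
    (1/n) * sum_i huber_delta(y_i - x_i @ beta) + lam * ||beta||_1.
    A generic robust-sparse baseline, not the paper's estimator."""
    n, d = X.shape
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1/L for the smooth Huber part
    beta = np.zeros(d)
    for _ in range(n_iter):
        g = -X.T @ huber_grad(y - X @ beta, delta) / n
        z = beta - step * g
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return beta

rng = np.random.default_rng(7)
X = rng.standard_normal((200, 50))
beta_true = np.zeros(50)
beta_true[:3] = [2.0, -1.0, 1.5]                 # sparse ground truth
y = X @ beta_true + rng.standard_t(df=2, size=200)
y[:5] += 30.0                                    # gross outliers
print(np.round(sparse_huber(X, y, lam=0.1), 2)[:6])
```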
arXiv Detail & Related papers (2024-08-02T15:33:04Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size dependent smoothing parameters.
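A minimal sketch of the classical estimator being modified: one-nearest-neighbor matching for the average treatment effect on the treated. The paper's bias-corrected, root-n-consistent modification is not reproduced here; this is only the textbook baseline.

```python
import numpy as np
from scipy.spatial import cKDTree

def att_one_nn(X_treated, y_treated, X_control, y_control):
    """1-NN matching estimate of the average treatment effect on the
    treated: each treated unit is matched to its nearest control in
    covariate space. Textbook baseline; the paper's modification
    removes the matching bias to reach the parametric rate."""
    tree = cKDTree(X_control)
    _, idx = tree.query(X_treated, k=1)
    return float(np.mean(y_treated - y_control[idx]))

rng = np.random.default_rng(8)
Xc = rng.standard_normal((500, 2))
yc = Xc.sum(axis=1) + rng.standard_normal(500)
Xt = rng.standard_normal((100, 2))
yt = Xt.sum(axis=1) + 1.0 + rng.standard_normal(100)
print(att_one_nn(Xt, yt, Xc, yc))   # true effect is 1.0
```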
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- A Geometric Unification of Distributionally Robust Covariance Estimators: Shrinking the Spectrum by Inflating the Ambiguity Set [20.166217494056916]
We propose a principled approach to construct covariance estimators without imposing restrictive assumptions.
We show that our robust estimators are efficiently computable and consistent.
Numerical experiments based on synthetic and real data show that our robust estimators are competitive with state-of-the-art estimators.
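A minimal sketch of the spectral effect named in the title: linear shrinkage of the sample covariance toward a scaled identity pulls extreme eigenvalues toward their average. The fixed shrinkage weight `alpha` is an illustrative assumption; the paper derives the shrinkage from the geometry of the ambiguity set instead.

```python
import numpy as np

def shrink_covariance(X, alpha):
    """Linear shrinkage of the sample covariance toward a scaled identity:
    (1 - alpha) * S + alpha * (tr(S)/d) * I. The fixed weight alpha is an
    illustrative choice, not the paper's ambiguity-set-driven rule."""
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n
    return (1.0 - alpha) * S + alpha * (np.trace(S) / d) * np.eye(d)

rng = np.random.default_rng(9)
X = rng.standard_normal((30, 20))          # n < 2d: sample covariance is noisy
S = shrink_covariance(X, alpha=0.0)        # no shrinkage
S_shrunk = shrink_covariance(X, alpha=0.3)
# Compare the smallest and largest eigenvalues before and after shrinking.
print(np.linalg.eigvalsh(S)[[0, -1]], np.linalg.eigvalsh(S_shrunk)[[0, -1]])
```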
arXiv Detail & Related papers (2024-05-30T15:01:18Z)
- Concentration inequalities for leave-one-out cross validation [1.90365714903665]
We show that estimator stability is enough to guarantee that leave-one-out cross validation is a sound procedure.
We obtain our results for random variables whose distributions satisfy the logarithmic Sobolev inequality.
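The procedure whose concentration is analyzed is ordinary leave-one-out cross validation; a plain implementation is sketched below, with ordinary least squares standing in as an example of a stable estimator.

```python
import numpy as np

def loo_cv_error(X, y, fit, predict, loss):
    """Plain leave-one-out cross validation: refit on each size-(n-1)
    subset and average the held-out losses. The paper's contribution is
    a concentration bound for this average when the estimator is stable;
    this loop is only the procedure being analyzed."""
    n = len(y)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        model = fit(X[mask], y[mask])
        errs[i] = loss(y[i], predict(model, X[i]))
    return errs.mean()

rng = np.random.default_rng(6)
X = rng.standard_normal((50, 3))
y = X @ np.array([1.0, 0.0, -1.0]) + 0.2 * rng.standard_normal(50)
fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
predict = lambda beta, x: x @ beta
loss = lambda y_true, y_hat: (y_true - y_hat) ** 2
print(loo_cv_error(X, y, fit, predict, loss))
```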
arXiv Detail & Related papers (2022-11-04T14:08:53Z)
- Outlier Robust and Sparse Estimation of Linear Regression Coefficients [2.0257616108612373]
We consider outlier-robust and sparse estimation of linear regression coefficients.
Our results give sharper error bounds under weaker assumptions than prior studies on similar problems.
arXiv Detail & Related papers (2022-08-24T14:56:52Z)
- Robust and Sparse Estimation of Linear Regression Coefficients with Heavy-tailed Noises and Covariates [0.0]
We address the situation where covariates and noises are sampled from heavy-tailed distributions and contaminated by malicious outliers.
Our estimator can be computed efficiently, and our estimation error bound is sharp.
arXiv Detail & Related papers (2022-06-15T15:23:24Z)
- Optimal variance-reduced stochastic approximation in Banach spaces [114.8734960258221]
We study the problem of estimating the fixed point of a contractive operator defined on a separable Banach space.
We establish non-asymptotic bounds for both the operator defect and the estimation error.
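As a hedged, finite-dimensional stand-in for the setting above, the sketch below runs a stochastic fixed-point iteration with Polyak-Ruppert averaging on a toy contractive map in $\mathbb{R}^2$; the paper's variance-reduced scheme and Banach-space analysis are not reproduced.

```python
import numpy as np

def averaged_stochastic_fixed_point(noisy_F, theta0, n_steps, c=1.0):
    """Stochastic approximation theta <- (1-a_k)*theta + a_k*noisy_F(theta)
    with Polyak-Ruppert averaging. A generic baseline in R^d, not the
    variance-reduced Banach-space scheme analyzed in the paper."""
    theta = np.asarray(theta0, dtype=float)
    avg = np.zeros_like(theta)
    for k in range(1, n_steps + 1):
        a = c / (k + 1) ** 0.75          # slowly decaying step size
        theta = (1 - a) * theta + a * noisy_F(theta)
        avg += (theta - avg) / k         # running average of the iterates
    return avg

# Toy contractive operator F(x) = 0.5 * x + b, observed with noise.
rng = np.random.default_rng(2)
b = np.array([1.0, -1.0])
noisy_F = lambda x: 0.5 * x + b + 0.1 * rng.standard_normal(2)
print(averaged_stochastic_fixed_point(noisy_F, np.zeros(2), 20000))
# The fixed point of F is x* = 2b = [2, -2].
```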
arXiv Detail & Related papers (2022-01-21T02:46:57Z)
- Heavy-tailed Streaming Statistical Estimation [58.70341336199497]
We consider the task of heavy-tailed statistical estimation given streaming $p$-dimensional samples.
We design a clipped gradient descent algorithm and provide an improved analysis under a more nuanced condition on the gradient noise.
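A minimal sketch of the core mechanism, clipping each stochastic gradient before the streaming update, shown here for mean estimation under heavy-tailed samples. The clipping level and step-size schedule are illustrative, not the tuned choices analyzed in the paper.

```python
import numpy as np

def clipped_sgd_mean(stream, clip, step=0.1):
    """Streaming mean estimation with per-step gradient clipping.
    For the squared loss, the gradient at theta for sample x is (theta - x);
    clipping its norm tames heavy-tailed samples. The clip level and step
    schedule here are illustrative, not the paper's tuned choices."""
    theta = None
    for t, x in enumerate(stream, start=1):
        x = np.asarray(x, dtype=float)
        if theta is None:
            theta = np.zeros_like(x)
        g = theta - x
        norm = np.linalg.norm(g)
        if norm > clip:
            g *= clip / norm                 # project onto the clip ball
        theta -= (step / t ** 0.5) * g       # decaying step size
    return theta

rng = np.random.default_rng(3)
samples = rng.standard_t(df=2, size=(5000, 3)) + np.array([1.0, 2.0, 3.0])
print(clipped_sgd_mean(iter(samples), clip=5.0))
```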
arXiv Detail & Related papers (2021-08-25T21:30:27Z)
- Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on the k nearest neighbor distances between these samples.
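The classical k-nearest-neighbor estimator of KL divergence (the Wang-Kulkarni-Verdú form) is easy to state; the sketch below implements it, while the paper's minimax-optimal refinements are not reproduced.

```python
import numpy as np
from scipy.spatial import cKDTree

def kl_knn(X, Y, k=1):
    """k-NN estimator of KL(p || q) from samples X ~ p and Y ~ q:
    (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    where rho_k is the k-th NN distance within X and nu_k within Y.
    Classical form only; not the paper's refined estimator."""
    n, d = X.shape
    m = Y.shape[0]
    # k+1 neighbors in X because each point is its own nearest neighbor.
    rho = cKDTree(X).query(X, k=k + 1)[0][:, -1]
    nu = cKDTree(Y).query(X, k=k)[0]
    nu = nu[:, -1] if nu.ndim > 1 else nu
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(4)
X = rng.standard_normal((2000, 1))          # p = N(0, 1)
Y = rng.standard_normal((2000, 1)) + 1.0    # q = N(1, 1)
print(kl_knn(X, Y))                         # true KL is 0.5
```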
arXiv Detail & Related papers (2020-02-26T16:37:37Z)
- Estimating Gradients for Discrete Random Variables by Sampling without Replacement [93.09326095997336]
We derive an unbiased estimator for expectations over discrete random variables based on sampling without replacement.
We show that our estimator can be derived as the Rao-Blackwellization of three different estimators.
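The without-replacement principle can be illustrated with the simpler Horvitz-Thompson device: under uniform sampling without replacement, the inclusion probability of each index is known exactly ($k/n$), which yields an unbiased estimate of a discrete expectation. This is only an illustration; the paper's estimator instead samples without replacement from the distribution itself and Rao-Blackwellizes, which is not reproduced here.

```python
import numpy as np

def ht_expectation(p, f, k, rng):
    """Unbiased Horvitz-Thompson estimate of E_p[f] = sum_i p_i * f_i,
    using a uniform sample of k indices without replacement (inclusion
    probability k/n). Illustrates the without-replacement principle only,
    not the paper's Rao-Blackwellized estimator."""
    n = len(p)
    S = rng.choice(n, size=k, replace=False)
    return (n / k) * np.sum(p[S] * f[S])

rng = np.random.default_rng(5)
p = rng.dirichlet(np.ones(100))          # a random categorical distribution
f = rng.standard_normal(100)
exact = np.sum(p * f)
est = np.mean([ht_expectation(p, f, 20, rng) for _ in range(2000)])
print(exact, est)   # the average of many estimates approaches the exact value
```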
arXiv Detail & Related papers (2020-02-14T14:15:18Z)