Robust and Sparse Estimation of Linear Regression Coefficients with
Heavy-tailed Noises and Covariates
- URL: http://arxiv.org/abs/2206.07594v1
- Date: Wed, 15 Jun 2022 15:23:24 GMT
- Title: Robust and Sparse Estimation of Linear Regression Coefficients with
Heavy-tailed Noises and Covariates
- Authors: Takeyuki Sasai
- Abstract summary: Our estimator can be computed efficiently. Further, our estimation error bound is sharp.
The situation addressed in this paper is that covariates and noises are sampled from heavy-tailed distributions, and the covariates and noises are contaminated by malicious outliers.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robust and sparse estimation of linear regression coefficients is
investigated. The situation addressed by the present paper is that covariates
and noises are sampled from heavy-tailed distributions, and the covariates and
noises are contaminated by malicious outliers. Our estimator can be computed
efficiently. Further, our estimation error bound is sharp.
Related papers
- Optimal convex $M$-estimation via score matching [6.115859302936817]
We construct a data-driven convex loss function with respect to which empirical risk minimisation yields optimal variance in the downstream estimation of the regression coefficients.
Our semiparametric approach targets the best decreasing approximation of the derivative of the log-density of the noise distribution.
arXiv Detail & Related papers (2024-03-25T12:23:19Z) - Robust Covariate Shift Adaptation for Density-Ratio Estimation [10.470114319701576]
We propose a doubly robust estimator for covariate shift adaptation via importance weighting.
Our estimator reduces the bias arising from the density ratio estimation errors.
Notably, our estimator remains consistent if either the density ratio estimator or the regression function is consistent.
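The doubly robust form the summary refers to can be illustrated with a minimal sketch (hypothetical function names; not the paper's exact construction): a direct regression estimate on target covariates plus an importance-weighted correction on source data. If either the density-ratio estimate `w` or the regression function `f` is correct, the estimate is consistent.

```python
def doubly_robust_mean(src, tgt_x, w, f):
    """Doubly robust estimate of E_target[Y] under covariate shift.

    src   : list of (x, y) pairs from the source distribution
    tgt_x : covariates sampled from the target distribution
    w     : estimated density ratio p_target(x) / p_source(x)
    f     : estimated regression function E[Y | X = x]

    Generic sketch of the doubly robust principle, not the paper's
    estimator: the weighted residual term corrects the bias of the
    direct term whenever w is accurate, and vanishes whenever f is.
    """
    direct = sum(f(x) for x in tgt_x) / len(tgt_x)
    correction = sum(w(x) * (y - f(x)) for x, y in src) / len(src)
    return direct + correction
```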
arXiv Detail & Related papers (2023-10-25T13:38:29Z) - Outlier Robust and Sparse Estimation of Linear Regression Coefficients [2.0257616108612373]
We consider outlier-robust and sparse estimation of linear regression coefficients.
Our results present sharper error bounds under weaker assumptions than prior studies that share similar interests with this study.
arXiv Detail & Related papers (2022-08-24T14:56:52Z) - Heavy-tailed Streaming Statistical Estimation [58.70341336199497]
We consider the task of heavy-tailed statistical estimation given streaming $p$-dimensional samples.
We design a clipped gradient descent and provide an improved analysis under a more nuanced condition on the noise of gradients.
arXiv Detail & Related papers (2021-08-25T21:30:27Z) - Adversarial robust weighted Huber regression [2.0257616108612373]
We consider a robust estimation of linear regression coefficients.
We derive an estimation error bound, which depends on the stable rank and the condition number of the covariance matrix.
arXiv Detail & Related papers (2021-02-22T15:50:34Z) - Robust regression with covariate filtering: Heavy tails and adversarial
contamination [6.939768185086755]
We show how to modify the Huber regression, least trimmed squares, and least absolute deviation estimators to obtain estimators that are simultaneously computationally and statistically efficient in the stronger contamination model.
We show that the Huber regression estimator achieves near-optimal error rates in this setting, whereas the least trimmed squares and least absolute deviation estimators can be made to achieve near-optimal error after applying a postprocessing step.
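As background for the least trimmed squares estimator mentioned above, a toy one-dimensional version (hypothetical function name; not the paper's modified estimator or its postprocessing step) repeatedly refits ordinary least squares on the samples with the smallest residuals, a so-called concentration step:

```python
def least_trimmed_squares_1d(x, y, h=None, iters=20):
    """Least trimmed squares for the model y ~ beta * x (no intercept).

    Repeatedly fits OLS on the h samples with smallest squared residuals
    (a 'concentration' step). A toy 1-D sketch of the LTS idea only.
    """
    n = len(x)
    if h is None:
        h = (3 * n) // 4  # keep 75% of the samples
    # plain OLS initializer on all samples
    beta = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
    for _ in range(iters):
        # keep the h indices with smallest squared residuals, then refit
        idx = sorted(range(n), key=lambda i: (y[i] - beta * x[i]) ** 2)[:h]
        num = sum(x[i] * y[i] for i in idx)
        den = sum(x[i] * x[i] for i in idx)
        beta = num / den
    return beta
```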
arXiv Detail & Related papers (2020-09-27T22:48:48Z) - Outlier Robust Mean Estimation with Subgaussian Rates via Stability [46.03021473600576]
We study the problem of outlier-robust high-dimensional mean estimation.
We obtain the first computationally efficient algorithm achieving subgaussian rates for outlier-robust mean estimation.
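For contrast with the paper's stability-based algorithm, the classical one-dimensional baseline for outlier-robust mean estimation is the trimmed mean (hypothetical function name; the paper's contribution is a far more sophisticated high-dimensional algorithm with subgaussian rates):

```python
def trimmed_mean(xs, eps):
    """Drop the eps-fraction smallest and largest points, then average.

    A classical robust baseline only; the stability-based algorithm in
    the paper attains subgaussian rates in high dimensions.
    """
    xs = sorted(xs)
    k = int(eps * len(xs))
    kept = xs[k: len(xs) - k] if k > 0 else xs
    return sum(kept) / len(kept)
```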
arXiv Detail & Related papers (2020-07-30T17:33:03Z) - $\gamma$-ABC: Outlier-Robust Approximate Bayesian Computation Based on a
Robust Divergence Estimator [95.71091446753414]
We propose to use a nearest-neighbor-based $\gamma$-divergence estimator as a data discrepancy measure.
Our method achieves significantly higher robustness than existing discrepancy measures.
arXiv Detail & Related papers (2020-06-13T06:09:27Z) - SUMO: Unbiased Estimation of Log Marginal Probability for Latent
Variable Models [80.22609163316459]
We introduce an unbiased estimator of the log marginal likelihood and its gradients for latent variable models based on randomized truncation of infinite series.
We show that models trained using our estimator give better test-set likelihoods than a standard importance-sampling based approach for the same average computational cost.
arXiv Detail & Related papers (2020-04-01T11:49:30Z) - Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on the $k$ nearest neighbor distances between these samples.
arXiv Detail & Related papers (2020-02-26T16:37:37Z) - Estimating Gradients for Discrete Random Variables by Sampling without
Replacement [93.09326095997336]
We derive an unbiased estimator for expectations over discrete random variables based on sampling without replacement.
We show that our estimator can be derived as the Rao-Blackwellization of three different estimators.
arXiv Detail & Related papers (2020-02-14T14:15:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.