Robust Density Estimation under Besov IPM Losses
- URL: http://arxiv.org/abs/2004.08597v2
- Date: Mon, 6 Sep 2021 20:28:35 GMT
- Title: Robust Density Estimation under Besov IPM Losses
- Authors: Ananya Uppal, Shashank Singh, Barnabas Poczos
- Abstract summary: We study minimax convergence rates of nonparametric density estimation in the Huber contamination model.
We show that a re-scaled thresholding wavelet series estimator achieves minimax optimal convergence rates under a wide variety of losses.
- Score: 10.079698681921672
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study minimax convergence rates of nonparametric density estimation in the
Huber contamination model, in which a proportion of the data comes from an
unknown outlier distribution. We provide the first results for this problem
under a large family of losses, called Besov integral probability metrics
(IPMs), that includes $\mathcal{L}^p$, Wasserstein, Kolmogorov-Smirnov, and
other common distances between probability distributions. Specifically, under a
range of smoothness assumptions on the population and outlier distributions, we
show that a re-scaled thresholding wavelet series estimator achieves minimax
optimal convergence rates under a wide variety of losses. Finally, based on
connections that have recently been shown between nonparametric density
estimation under IPM losses and generative adversarial networks (GANs), we show
that certain GAN architectures also achieve these minimax rates.
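The contamination model and estimator family described in the abstract can be illustrated concretely. Below is a minimal numpy sketch, assuming a Haar wavelet basis on [0, 1]; the inlier/outlier distributions, resolution levels, and threshold are illustrative choices, not the paper's tuned constants.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Huber contamination model: X_i ~ (1 - eps) * P + eps * Q ------------
n, eps = 5000, 0.1
is_outlier = rng.random(n) < eps
inliers = rng.beta(2.0, 5.0, size=n)        # population density P on [0, 1]
outliers = rng.uniform(0.85, 1.0, size=n)   # unknown outlier distribution Q
x = np.where(is_outlier, outliers, inliers)

# --- Haar wavelet basis on [0, 1] ----------------------------------------
def phi(t):                                  # father (scaling) function
    return ((0.0 <= t) & (t < 1.0)).astype(float)

def psi(t):                                  # mother wavelet
    return phi(2 * t) - phi(2 * t - 1)

def density_estimate(x, grid, j0=2, J=5, thresh=None):
    """Hard-thresholded Haar wavelet series density estimator (sketch)."""
    n = len(x)
    if thresh is None:
        thresh = np.sqrt(np.log(n) / n)      # illustrative threshold level
    est = np.zeros_like(grid)
    # coarse coefficients: alpha_{j0,k} = empirical mean of phi_{j0,k}(X)
    for k in range(2 ** j0):
        phi_jk = lambda t: 2 ** (j0 / 2) * phi(2 ** j0 * t - k)
        est += phi_jk(x).mean() * phi_jk(grid)
    # detail coefficients, kept only when they exceed the threshold
    for j in range(j0, J + 1):
        for k in range(2 ** j):
            psi_jk = lambda t: 2 ** (j / 2) * psi(2 ** j * t - k)
            beta = psi_jk(x).mean()
            if abs(beta) > thresh:
                est += beta * psi_jk(grid)
    return est

cells = 2 ** 8
grid = (np.arange(cells) + 0.5) / cells      # dyadic cell midpoints
p_hat = density_estimate(x, grid)
print("integral of estimate:", p_hat.mean())  # Riemann sum over [0, 1]
```

Because the grid is dyadic and all samples lie in [0, 1], the Haar series integrates to 1 by construction; thresholding drops small detail coefficients, which is what gives the estimator its adaptivity across smoothness classes.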
Related papers
- Statistical Estimation Under Distribution Shift: Wasserstein Perturbations and Minimax Theory [24.540342159350015]
We focus on Wasserstein distribution shifts, where every data point may undergo a slight perturbation.
We consider perturbations that are either independent or coordinated joint shifts across data points.
We analyze several important statistical problems, including location estimation, linear regression, and non-parametric density estimation.
arXiv Detail & Related papers (2023-08-03T16:19:40Z)
- Adaptive learning of density ratios in RKHS [3.047411947074805]
Estimating the ratio of two probability densities from finitely many observations is a central problem in machine learning and statistics.
We analyze a large class of density ratio estimation methods that minimize a regularized Bregman divergence between the true density ratio and a model in a reproducing kernel Hilbert space.
arXiv Detail & Related papers (2023-07-30T08:18:39Z)
- On minimax density estimation via measure transport [0.0]
We study the convergence properties of nonparametric density estimators based on measure transport.
We show that penalized and unpenalized versions of such estimators achieve minimax optimal convergence rates over Hölder classes of densities.
arXiv Detail & Related papers (2022-07-20T23:56:00Z)
- Robust Estimation for Nonparametric Families via Generative Adversarial Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends this framework to robust mean estimation, second moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z)
- A Unified Framework for Multi-distribution Density Ratio Estimation [101.67420298343512]
Binary density ratio estimation (DRE) provides the foundation for many state-of-the-art machine learning algorithms.
We develop a general framework from the perspective of Bregman divergence minimization.
We show that our framework leads to methods that strictly generalize their counterparts in binary DRE.
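The binary DRE setting that these frameworks generalize has a classic instantiation: train a probabilistic classifier to distinguish samples from the two densities, and read the log-ratio off the classifier's logit (logistic loss is one member of the Bregman divergence family). A minimal numpy sketch on 1-D Gaussians, where the true ratio is known in closed form; the learning rate and iteration count are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Samples from the two densities whose ratio r(x) = p(x)/q(x) we estimate.
n = 20000
xp = rng.normal(0.5, 1.0, n)     # numerator density p = N(0.5, 1)
xq = rng.normal(0.0, 1.0, n)     # denominator density q = N(0, 1)

# Pose a binary classification problem: label 1 for p-samples, 0 for q.
X = np.concatenate([xp, xq])
y = np.concatenate([np.ones(n), np.zeros(n)])
feats = np.stack([X, np.ones_like(X)], axis=1)   # linear logit: w[0]*x + w[1]

# Logistic regression by full-batch gradient descent (minimizes the
# logistic loss, i.e. one Bregman divergence between ratio models).
w = np.zeros(2)
for _ in range(2000):
    probs = 1.0 / (1.0 + np.exp(-(feats @ w)))
    grad = feats.T @ (probs - y) / len(y)
    w -= 0.5 * grad

# With equal sample sizes, log r(x) equals the classifier's logit.
def log_ratio(x):
    return w[0] * x + w[1]

# True log-ratio for these Gaussians: log p(x)/q(x) = 0.5*x - 0.125.
print(log_ratio(1.0), 0.5 * 1.0 - 0.125)
```

The two printed numbers should nearly agree: for this Gaussian pair the optimal logit is exactly linear in x, so the classifier recovers the ratio up to sampling error.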
arXiv Detail & Related papers (2021-12-07T01:23:20Z)
- Density Ratio Estimation via Infinitesimal Classification [85.08255198145304]
We propose DRE-infty, a divide-and-conquer approach that reduces density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
arXiv Detail & Related papers (2021-11-22T06:26:29Z)
- Over-the-Air Statistical Estimation [4.082216579462796]
We study schemes and lower bounds for distributed minimax statistical estimation over a Gaussian multiple-access channel (MAC) under squared error loss.
We show that estimation schemes that leverage the physical layer offer a drastic reduction in estimation error over digital schemes relying on a physical-layer abstraction.
arXiv Detail & Related papers (2021-03-06T03:07:22Z)
- Robust W-GAN-Based Estimation Under Wasserstein Contamination [8.87135311567798]
We study several estimation problems under a Wasserstein contamination model and present computationally tractable estimators motivated by generative adversarial networks (GANs).
Specifically, we analyze properties of Wasserstein GAN-based estimators for adversarial location estimation, covariance matrix estimation, and linear regression.
Our proposed estimators are minimax optimal in many scenarios.
arXiv Detail & Related papers (2021-01-20T05:15:16Z)
- Nonparametric Density Estimation from Markov Chains [68.8204255655161]
We introduce a new nonparametric density estimator inspired by Markov chains that generalizes the well-known kernel density estimator.
Our estimator presents several benefits with respect to the usual ones and can be used straightforwardly as a foundation in all density-based algorithms.
arXiv Detail & Related papers (2020-09-08T18:33:42Z)
- Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on k-nearest-neighbor distances between these samples.
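One common instantiation of the k-nearest-neighbor idea is the Wang-Kulkarni-Verdú style estimator; the brute-force numpy sketch below is illustrative and not necessarily the exact construction analyzed in that paper. It estimates KL(P || Q) from samples of each distribution using only k-NN distances:

```python
import numpy as np

rng = np.random.default_rng(2)

def knn_kl(x, y, k=5):
    """k-NN estimate of KL(P || Q); x ~ P with shape (n, d), y ~ Q (m, d)."""
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its k-th nearest neighbor in x (excl. self)
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dxx, np.inf)
    rho = np.sort(dxx, axis=1)[:, k - 1]
    # nu_i: distance from x_i to its k-th nearest neighbor in y
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    nu = np.sort(dxy, axis=1)[:, k - 1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Sanity check: KL(N(0,1) || N(1,1)) = 0.5 in closed form.
x = rng.normal(0.0, 1.0, size=(2000, 1))
y = rng.normal(1.0, 1.0, size=(2000, 1))
print(knn_kl(x, y))   # should be near 0.5
```

The brute-force pairwise distance matrices keep the sketch short; a practical implementation would use a k-d tree or ball tree for the neighbor queries.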
arXiv Detail & Related papers (2020-02-26T16:37:37Z)
- Distribution Approximation and Statistical Estimation Guarantees of Generative Adversarial Networks [82.61546580149427]
Generative Adversarial Networks (GANs) have achieved great success in unsupervised learning.
This paper provides approximation and statistical guarantees of GANs for the estimation of data distributions with densities in a Hölder space.
arXiv Detail & Related papers (2020-02-10T16:47:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences of their use.