Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes
- URL: http://arxiv.org/abs/2404.17714v1
- Date: Fri, 26 Apr 2024 22:17:32 GMT
- Title: Lower Bounds for Private Estimation of Gaussian Covariance Matrices under All Reasonable Parameter Regimes
- Authors: Victor S. Portella, Nick Harvey
- Abstract summary: Our bounds match existing upper bounds in the widest known setting of parameters.
Our analysis relies on the Stein-Haff identity, an extension of the classical Stein's identity used in previous fingerprinting lemma arguments.
- Score: 2.7624021966289605
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We prove lower bounds on the number of samples needed to privately estimate the covariance matrix of a Gaussian distribution. Our bounds match existing upper bounds in the widest known setting of parameters. Our analysis relies on the Stein-Haff identity, an extension of the classical Stein's identity used in previous fingerprinting lemma arguments.
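For context, a minimal statement of the two identities the abstract mentions, in one textbook form (notation is ours; the paper's exact formulation may differ):

```latex
% Classical Stein identity: for $X \sim \mathcal{N}(\mu, \sigma^2)$ and
% suitably smooth $g$,
\[
  \mathbb{E}\big[(X - \mu)\, g(X)\big] = \sigma^2\, \mathbb{E}\big[g'(X)\big].
\]
% One textbook form of the Stein--Haff identity (cf. Haff, 1979): for a
% Wishart matrix $W \sim \mathcal{W}_p(n, \Sigma)$ with $n > p + 1$ and a
% suitably smooth matrix map $G$,
\[
  \mathbb{E}\big[\operatorname{tr}\!\big(\Sigma^{-1} G(W)\big)\big]
  = \mathbb{E}\big[(n - p - 1)\operatorname{tr}\!\big(W^{-1} G(W)\big)
    + 2\operatorname{tr}\!\big(\mathcal{D}_W G(W)\big)\big],
\]
% where $(\mathcal{D}_W)_{ij} = \tfrac{1}{2}(1 + \delta_{ij})\,
% \partial / \partial W_{ij}$ acts entrywise.
```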
Related papers
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies on a smooth manifold.
The bound induces a geometry on the parameter manifold, as well as an intrinsic notion of the estimation error measure.
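For reference, the classical Euclidean Bayesian Cramér-Rao (van Trees) bound that the intrinsic version generalizes can be stated as follows (our notation, not the paper's):

```latex
% Van Trees (Bayesian Cramer-Rao) inequality: for a prior $\pi$ on $\theta$
% and any estimator $\hat\theta(X)$,
\[
  \mathbb{E}\big[(\hat\theta - \theta)(\hat\theta - \theta)^{\top}\big]
  \succeq J_B^{-1},
  \qquad
  J_B = \mathbb{E}\big[\mathcal{I}(\theta)\big]
        + \mathbb{E}\big[-\nabla_\theta^2 \log \pi(\theta)\big],
\]
% where $\mathcal{I}(\theta)$ is the Fisher information of the likelihood.
```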
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
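Not the paper's game setting, but a minimal sketch of a Riemannian gradient step with a fixed step size chosen without reference to curvature, on the unit sphere (objective and step size are placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimize f(v) = v' C v over the unit sphere (a toy Riemannian problem).
C = np.diag([3.0, 2.0, 1.0])
v = rng.normal(size=3)
v /= np.linalg.norm(v)

eta = 0.1  # fixed step size, chosen with no knowledge of the sphere's curvature
for _ in range(500):
    egrad = 2 * C @ v
    rgrad = egrad - (v @ egrad) * v   # project the Euclidean gradient onto the tangent space
    v = v - eta * rgrad
    v /= np.linalg.norm(v)            # retract back onto the sphere

print(v, v @ C @ v)  # converges to the eigenvector of the smallest eigenvalue (value 1.0)
```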
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Sharp Deviations Bounds for Dirichlet Weighted Sums with Application to analysis of Bayesian algorithms [30.310960863405267]
We derive sharp non-asymptotic deviation bounds for weighted sums of Dirichlet random variables.
These bounds are based on a novel integral representation of the density of a weighted Dirichlet sum.
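As a quick illustration of the object being bounded, a Monte Carlo look at the deviations of a weighted Dirichlet sum (weights and concentration parameters below are arbitrary placeholders, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

alpha = np.array([0.5, 1.0, 2.0, 4.0])   # Dirichlet concentration parameters
w = np.array([3.0, -1.0, 2.0, 0.5])      # fixed weights

# Sample weighted Dirichlet sums S = sum_i w_i X_i with X ~ Dir(alpha).
X = rng.dirichlet(alpha, size=200_000)
S = X @ w

# Empirical deviation probabilities P(|S - E S| >= t) -- the quantities the
# paper bounds non-asymptotically.
for t in (0.25, 0.5, 1.0):
    p = np.mean(np.abs(S - S.mean()) >= t)
    print(f"P(|S - mean| >= {t}) ~ {p:.4f}")
```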
arXiv Detail & Related papers (2023-04-06T13:15:51Z)
- Quantitative deterministic equivalent of sample covariance matrices with a general dependence structure [0.0]
We prove quantitative bounds involving both the dimensions and the spectral parameter, in particular allowing the latter to approach the real positive semi-line.
As applications, we obtain a new bound for the convergence in Kolmogorov distance of the empirical spectral distributions of these general models.
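For intuition, in the simplest i.i.d. case the deterministic equivalent of the empirical spectral distribution is the Marchenko-Pastur law, which a quick simulation confirms (parameters are placeholders; the paper's models allow much more general dependence):

```python
import numpy as np

rng = np.random.default_rng(2)

# White sample covariance with p/n fixed (identity population covariance).
n, p = 2000, 500
X = rng.normal(size=(n, p))
eigs = np.linalg.eigvalsh(X.T @ X / n)

# Marchenko-Pastur support for aspect ratio y = p/n: [(1-sqrt(y))^2, (1+sqrt(y))^2].
y = p / n
a, b = (1 - np.sqrt(y)) ** 2, (1 + np.sqrt(y)) ** 2
print(f"empirical spectrum in [{eigs.min():.3f}, {eigs.max():.3f}]")
print(f"MP support           [{a:.3f}, {b:.3f}]")
```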
arXiv Detail & Related papers (2022-11-23T15:50:31Z)
- Instance-Dependent Generalization Bounds via Optimal Transport [51.71650746285469]
Existing generalization bounds fail to explain crucial factors that drive the generalization of modern neural networks.
We derive instance-dependent generalization bounds that depend on the local Lipschitz regularity of the learned prediction function in the data space.
We empirically analyze our generalization bounds for neural networks, showing that the bound values are meaningful and capture the effect of popular regularization methods during training.
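A crude Monte Carlo estimate of the quantity these bounds depend on, the local Lipschitz constant of a predictor near a data point (the predictor, radius, and point below are hypothetical placeholders, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical predictor standing in for a trained network.
def f(x):
    return np.tanh(x @ np.array([1.5, -2.0]))

def local_lipschitz(f, x0, radius=0.1, n=1000):
    """Monte Carlo lower estimate of the Lipschitz constant of f near x0."""
    d = rng.normal(size=(n, x0.size))
    d = radius * rng.uniform(0.1, 1.0, size=(n, 1)) * d / np.linalg.norm(d, axis=1, keepdims=True)
    return float(np.max(np.abs(f(x0 + d) - f(x0)) / np.linalg.norm(d, axis=1)))

print(local_lipschitz(f, np.array([0.3, -0.7])))
```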
arXiv Detail & Related papers (2022-11-02T16:39:42Z)
- Targeted Separation and Convergence with Kernel Discrepancies [61.973643031360254]
Kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or (ii) control weak convergence to P.
In this article we derive new sufficient and necessary conditions to ensure (i) and (ii).
For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels.
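As a concrete instance of property (i), a minimal MMD estimator with a Gaussian kernel, which is characteristic and hence separates Borel probability measures on R^d (bandwidth and data are placeholders):

```python
import numpy as np

rng = np.random.default_rng(4)

def mmd2_gaussian(X, Y, bandwidth=1.0):
    """Biased (V-statistic) estimate of MMD^2 under a Gaussian kernel."""
    def k(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2 * bandwidth ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

X1 = rng.normal(0.0, 1.0, size=(500, 2))
X2 = rng.normal(0.0, 1.0, size=(500, 2))   # same law as X1
Y = rng.normal(0.0, 1.5, size=(500, 2))    # different covariance scale
print(mmd2_gaussian(X1, X2))  # near 0
print(mmd2_gaussian(X1, Y))   # clearly positive: the kernel separates the two laws
```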
arXiv Detail & Related papers (2022-09-26T16:41:16Z)
- Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
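The stochastic algorithm itself is not reproduced here; as a rough deterministic sketch of the underlying idea, gradient ascent on the generalized Rayleigh quotient v'Av / v'Bv recovers a top generalized eigenvector (A, B, step size, and iteration count are arbitrary placeholders, and this is not the authors' SSD method):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(5)

# Random symmetric A and positive-definite B (illustrative placeholders).
p = 5
A = rng.normal(size=(p, p)); A = (A + A.T) / 2
M = rng.normal(size=(p, p)); B = M @ M.T + p * np.eye(p)

# Gradient ascent on the generalized Rayleigh quotient v'Av / v'Bv,
# renormalizing so that v'Bv = 1 after every step.
v = rng.normal(size=p)
v /= np.sqrt(v @ B @ v)
eta = 0.01
for _ in range(5000):
    r = v @ A @ v                          # Rayleigh quotient, since v'Bv = 1
    v = v + eta * 2 * (A @ v - r * B @ v)  # ascent direction
    v /= np.sqrt(v @ B @ v)

# Compare against the top generalized eigenvalue of the pencil (A, B).
print(v @ A @ v, eigh(A, B, eigvals_only=True)[-1])
```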
arXiv Detail & Related papers (2021-12-29T18:46:52Z)
- Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
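A minimal sketch of that pipeline on a synthetic degree-corrected two-block graph (all model parameters below are illustrative placeholders, not the paper's experiments):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)

# Synthetic degree-corrected two-block graph.
n, K = 200, 2
z = np.repeat([0, 1], n // 2)                       # planted communities
theta = rng.uniform(0.5, 1.5, size=n)               # degree-correction weights
P = np.where(z[:, None] == z[None, :], 0.20, 0.05) * np.outer(theta, theta)
A = np.triu(rng.uniform(size=(n, n)) < P, 1).astype(float)
A = A + A.T

# Random walk Laplacian embedding: top eigenvectors of D^{-1} A, computed via
# the symmetric normalization D^{-1/2} A D^{-1/2} and rescaled by D^{-1/2}.
d = A.sum(axis=1)
Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
_, vecs = np.linalg.eigh(Dinv_sqrt @ A @ Dinv_sqrt)
embedding = Dinv_sqrt @ vecs[:, -K:]    # degree-corrected representations

# The embedding concentrates about K points, one per community.
labels = KMeans(n_clusters=K, n_init=10).fit_predict(embedding)
acc = (labels == z).mean()
print(max(acc, 1 - acc))                # near 1 up to label switching
```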
arXiv Detail & Related papers (2021-05-03T16:36:27Z)
- Tighter expected generalization error bounds via Wasserstein distance [23.52237892358981]
We introduce several expected generalization error bounds based on the Wasserstein distance.
We present full-dataset, single-letter, and random-subset bounds on both the standard setting and the randomized-subsample setting.
We show how various new bounds based on different information measures can be derived from the presented bounds.
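The bounds themselves are information-theoretic; as a small usage sketch, the 1-Wasserstein distance between two empirical one-dimensional distributions can be computed directly (the samples below are placeholders):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(7)

# 1-Wasserstein distance between two empirical 1-D distributions; the paper's
# bounds are built from distances of this kind.
train_losses = rng.normal(0.5, 0.10, size=1000)   # placeholder samples
test_losses = rng.normal(0.6, 0.15, size=1000)
print(wasserstein_distance(train_losses, test_losses))
```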
arXiv Detail & Related papers (2021-01-22T20:13:59Z)
- On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians [13.517767928653443]
These are the first finite-sample upper bounds for general Gaussians that do not impose restrictions on the parameters of the distribution.
From a technical standpoint, we provide analytic tools for arguing the existence of global "locally small" covers from local covers of the space.
Our techniques may prove useful for privately learning other distribution classes which do not possess a finite cover.
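This paper is about upper bounds via covers rather than a specific mechanism; for flavor, a textbook Gaussian-mechanism sketch for privately estimating a mean from norm-clipped samples (the clip bound, epsilon, and delta are placeholders, and this is not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(8)

def private_mean(X, clip=5.0, eps=1.0, delta=1e-6):
    """(eps, delta)-DP mean estimate via norm clipping + the Gaussian mechanism."""
    n, dim = X.shape
    norms = np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1e-12)
    Xc = X * np.minimum(1.0, clip / norms)     # clip each sample to norm <= clip
    sensitivity = 2 * clip / n                 # L2 sensitivity of the clipped mean
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / eps
    return Xc.mean(axis=0) + rng.normal(0.0, sigma, size=dim)

X = rng.multivariate_normal(np.zeros(3), np.eye(3), size=10_000)
print(private_mean(X))
```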
arXiv Detail & Related papers (2020-10-19T23:55:03Z)