Private Covariance Approximation and Eigenvalue-Gap Bounds for Complex
Gaussian Perturbations
- URL: http://arxiv.org/abs/2306.16648v1
- Date: Thu, 29 Jun 2023 03:18:53 GMT
- Title: Private Covariance Approximation and Eigenvalue-Gap Bounds for Complex
Gaussian Perturbations
- Authors: Oren Mangoubi, Nisheeth K. Vishnoi
- Abstract summary: We show that the Frobenius norm of the difference between the matrix output by this mechanism and the best rank-$k$ approximation to $M$ is bounded by roughly $\tilde{O}(\sqrt{kd})$.
This improves on previous work that requires that the gap between every pair of top-$k$ eigenvalues of $M$ is at least $\sqrt{d}$ for a similar bound.
- Score: 28.431572772564518
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the problem of approximating a $d \times d$ covariance matrix $M$
with a rank-$k$ matrix under $(\varepsilon,\delta)$-differential privacy. We
present and analyze a complex variant of the Gaussian mechanism and show that
the Frobenius norm of the difference between the matrix output by this
mechanism and the best rank-$k$ approximation to $M$ is bounded by roughly
$\tilde{O}(\sqrt{kd})$, whenever there is an appropriately large gap between
the $k$'th and the $k+1$'th eigenvalues of $M$. This improves on previous work
that requires that the gap between every pair of top-$k$ eigenvalues of $M$ is
at least $\sqrt{d}$ for a similar bound. Our analysis leverages the fact that
the eigenvalues of complex matrix Brownian motion repel more than in the real
case, and uses Dyson's stochastic differential equations governing the
evolution of its eigenvalues to show that the eigenvalues of the matrix $M$
perturbed by complex Gaussian noise have large gaps with high probability. Our
results contribute to the analysis of low-rank approximations under
average-case perturbations and to an understanding of eigenvalue gaps for
random matrices, which may be of independent interest.
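To make the mechanism's shape concrete, here is a minimal sketch in NumPy: perturb a symmetric matrix with a Hermitian complex Gaussian noise matrix, then return the top-$k$ eigenspace of the perturbed matrix. The function name and the flat noise scale `sigma` are illustrative stand-ins; the paper calibrates the noise to $(\varepsilon,\delta)$-differential privacy, which is not reproduced here.

```python
import numpy as np

def complex_gaussian_rank_k(M, k, sigma, seed=None):
    """Rank-k approximation of symmetric M after adding complex Gaussian
    (Hermitian, GUE-style) noise. `sigma` is a placeholder scale, not the
    paper's privacy-calibrated value."""
    rng = np.random.default_rng(seed)
    d = M.shape[0]
    # Hermitian complex Gaussian noise: symmetrize a complex Ginibre matrix.
    Z = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    N = (Z + Z.conj().T) / 2
    H = M.astype(complex) + sigma * N
    # Keep the k largest eigenvalues of the perturbed matrix.
    w, V = np.linalg.eigh(H)              # ascending eigenvalues
    idx = np.argsort(w)[::-1][:k]
    Vk = V[:, idx]
    return (Vk * w[idx]) @ Vk.conj().T    # rank-k Hermitian reconstruction

# Example: rank-2 approximation of a 5x5 covariance matrix.
A = np.random.default_rng(0).standard_normal((5, 8))
M = A @ A.T
M_k = complex_gaussian_rank_k(M, k=2, sigma=0.1, seed=1)
```

The complex (rather than real) noise is the point of the paper: eigenvalues of complex perturbations repel more strongly, which is what yields the eigenvalue-gap guarantees.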
Related papers
- Distribution of lowest eigenvalue in $k$-body bosonic random matrix ensembles [0.8999666725996978]
We numerically study the distribution of the lowest eigenvalue of finite many-boson systems with $k$-body interactions.
The first four moments of the distribution of lowest eigenvalues have been analyzed as a function of the $q$ parameter.
arXiv Detail & Related papers (2024-04-30T20:44:31Z)
- General Gaussian Noise Mechanisms and Their Optimality for Unbiased Mean
Estimation [58.03500081540042]
A classical approach to private mean estimation is to compute the true mean and add unbiased, but possibly correlated, Gaussian noise to it.
We show that for every input dataset, an unbiased mean estimator satisfying concentrated differential privacy introduces approximately at least as much error as this Gaussian noise approach.
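The classical baseline this entry compares against can be sketched in a few lines: clip the data, compute the mean, and add Gaussian noise scaled to the mean's sensitivity. The function name is hypothetical, and the calibration below is the standard $\sigma = \Delta\sqrt{2\ln(1.25/\delta)}/\varepsilon$ rule for the basic Gaussian mechanism, not the correlated-noise generalizations the paper studies.

```python
import numpy as np

def gaussian_mean(x, clip, epsilon, delta, seed=None):
    """Private mean of scalars via the basic Gaussian mechanism.
    Uses the textbook sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon
    calibration (valid for epsilon <= 1); illustrative only."""
    rng = np.random.default_rng(seed)
    x = np.clip(np.asarray(x, dtype=float), -clip, clip)
    n = x.size
    sensitivity = 2 * clip / n  # replacing one record moves the mean by at most this
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return x.mean() + rng.normal(0.0, sigma)

est = gaussian_mean(np.ones(10000), clip=1.0, epsilon=1.0, delta=1e-5, seed=0)
```

With $n = 10000$ the noise standard deviation is about $10^{-3}$, so the private mean stays close to the true mean.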
arXiv Detail & Related papers (2023-01-31T18:47:42Z)
- Re-Analyze Gauss: Bounds for Private Matrix Approximation via Dyson
Brownian Motion [28.431572772564518]
Given a symmetric matrix $M$ and a vector $\lambda$, we present new bounds on the Frobenius-distance utility of the Gaussian mechanism for approximating $M$ by a matrix.
Our bounds depend on both $lambda$ and the gaps in the eigenvalues of $M$, and hold whenever the top $k+1$ eigenvalues of $M$ have sufficiently large gaps.
arXiv Detail & Related papers (2022-11-11T18:54:01Z)
- Optimal Query Complexities for Dynamic Trace Estimation [59.032228008383484]
We consider the problem of minimizing the number of matrix-vector queries needed for accurate trace estimation in the dynamic setting where our underlying matrix is changing slowly.
We provide a novel binary tree summation procedure that simultaneously estimates all $m$ traces up to $\epsilon$ error with $\delta$ failure probability.
Our lower bounds (1) give the first tight bounds for Hutchinson's estimator in the matrix-vector product model with Frobenius norm error even in the static setting, and (2) are the first unconditional lower bounds for dynamic trace estimation.
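The static building block behind this entry, Hutchinson's estimator, is easy to state: for Rademacher probe vectors $g$, $\mathbb{E}[g^\top A g] = \mathrm{tr}(A)$, so averaging $m$ matrix-vector products estimates the trace. A minimal sketch (the dynamic binary-tree procedure from the paper is not reproduced here):

```python
import numpy as np

def hutchinson_trace(matvec, d, m, seed=None):
    """Hutchinson's stochastic trace estimator: average of g^T A g over
    m random Rademacher probes, with A accessed only through `matvec`."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(m):
        g = rng.choice([-1.0, 1.0], size=d)  # Rademacher probe vector
        total += g @ matvec(g)               # E[g^T A g] = tr(A)
    return total / m

# Example: estimate the trace of a random PSD matrix.
gen = np.random.default_rng(1)
B = gen.standard_normal((50, 50))
A = B @ B.T
est = hutchinson_trace(lambda v: A @ v, d=50, m=2000, seed=2)
```

Each probe costs one matrix-vector product, which is why the paper measures complexity in matrix-vector queries.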
arXiv Detail & Related papers (2022-09-30T04:15:44Z)
- Near-optimal fitting of ellipsoids to random points [68.12685213894112]
A basic problem of fitting an ellipsoid to random points has connections to low-rank matrix decompositions, independent component analysis, and principal component analysis.
We resolve this conjecture up to logarithmic factors by constructing a fitting ellipsoid for some $n = \Omega(d^2/\mathrm{polylog}(d))$.
Our proof demonstrates feasibility of the least squares construction of Saunderson et al. using a convenient decomposition of a certain non-standard random matrix.
arXiv Detail & Related papers (2022-08-19T18:00:34Z)
- Private Matrix Approximation and Geometry of Unitary Orbits [29.072423395363668]
This problem seeks to approximate $A$ by a matrix whose spectrum is the same as $\Lambda$.
We give efficient and private algorithms that come with upper and lower bounds on the approximation error.
arXiv Detail & Related papers (2022-07-06T16:31:44Z)
- Perturbation Analysis of Randomized SVD and its Applications to
High-dimensional Statistics [8.90202564665576]
We study the statistical properties of RSVD under a general "signal-plus-noise" framework.
We derive nearly-optimal performance guarantees for RSVD when applied to three statistical inference problems.
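The RSVD procedure this entry analyzes has a short standard form: sketch the range of $A$ with a random Gaussian test matrix, orthonormalize, and compute an exact SVD of the small projected matrix. A minimal sketch (the oversampling constant is a common default, not taken from the paper):

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=None):
    """Basic randomized SVD: random range finder + small exact SVD,
    truncated to the top k singular triples."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)        # orthonormal basis for the sketched range
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k]

# Example: recover an exactly rank-3 matrix.
gen = np.random.default_rng(3)
U0, _ = np.linalg.qr(gen.standard_normal((40, 3)))
V0, _ = np.linalg.qr(gen.standard_normal((30, 3)))
A = U0 @ np.diag([5.0, 3.0, 1.0]) @ V0.T
U, s, Vt = randomized_svd(A, k=3, seed=4)
```

When $A$ is exactly rank $k$, the sketched range captures the column space and the reconstruction is exact up to floating-point error; the "signal-plus-noise" framework in the paper quantifies what happens when noise is added.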
arXiv Detail & Related papers (2022-03-19T07:26:45Z)
- Largest Eigenvalues of the Conjugate Kernel of Single-Layered Neural
Networks [0.0]
We show that the largest eigenvalue has the same limit (in probability) as that of some well-known linear random matrix ensembles.
This may be of interest for applications to machine learning.
arXiv Detail & Related papers (2022-01-13T00:48:20Z)
- Spectral properties of sample covariance matrices arising from random
matrices with independent non identically distributed columns [50.053491972003656]
It was previously shown that the functionals $\mathrm{tr}(AR(z))$, for $R(z) = (\frac{1}{n}XX^T - zI_p)^{-1}$ and $A \in \mathcal{M}_p$ deterministic, have a standard deviation of order $O(\|A\|_*/\sqrt{n})$.
Here, we show that $\|\mathbb{E}[R(z)] - \tilde{R}(z)\|_F$
arXiv Detail & Related papers (2021-09-06T14:21:43Z)
- Sparse sketches with small inversion bias [79.77110958547695]
Inversion bias arises when averaging estimates of quantities that depend on the inverse covariance.
We develop a framework for analyzing inversion bias, based on our proposed concept of an $(\epsilon,\delta)$-unbiased estimator for random matrices.
We show that when the sketching matrix $S$ is dense and has i.i.d. sub-gaussian entries, the estimator is $(\epsilon,\delta)$-unbiased for $(A^\top A)^{-1}$ with a sketch of size $m = O(d + \sqrt{d}/$
arXiv Detail & Related papers (2020-11-21T01:33:15Z)
- Optimal Robust Linear Regression in Nearly Linear Time [97.11565882347772]
We study the problem of high-dimensional robust linear regression where a learner is given access to $n$ samples from the generative model $Y = \langle X, w^* \rangle + \epsilon$.
We propose estimators for this problem under two settings: (i) $X$ is L4-L2 hypercontractive, $\mathbb{E}[XX^\top]$ has bounded condition number and $\epsilon$ has bounded variance and (ii) $X$ is sub-Gaussian with identity second moment and $\epsilon$ is
arXiv Detail & Related papers (2020-07-16T06:44:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.