Private graphon estimation via sum-of-squares
- URL: http://arxiv.org/abs/2403.12213v2
- Date: Thu, 18 Apr 2024 17:35:16 GMT
- Title: Private graphon estimation via sum-of-squares
- Authors: Hongjie Chen, Jingqiu Ding, Tommaso d'Orsi, Yiding Hua, Chih-Hung Liu, David Steurer
- Abstract summary: We develop the first pure node-differentially private algorithms for learning stochastic block models and for graphon estimation with polynomial running time for any constant number of blocks.
The statistical utility guarantees match those of the previous best information-theoretic (exponential-time) node-private mechanisms for these problems.
- Score: 10.00024942014117
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop the first pure node-differentially-private algorithms for learning stochastic block models and for graphon estimation with polynomial running time for any constant number of blocks. The statistical utility guarantees match those of the previous best information-theoretic (exponential-time) node-private mechanisms for these problems. The algorithm is based on an exponential mechanism for a score function defined in terms of a sum-of-squares relaxation whose level depends on the number of blocks. The key ingredients of our results are (1) a characterization of the distance between the block graphons in terms of a quadratic optimization over the polytope of doubly stochastic matrices, (2) a general sum-of-squares convergence result for polynomial optimization over arbitrary polytopes, and (3) a general approach to perform Lipschitz extensions of score functions as part of the sum-of-squares algorithmic paradigm.
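The paper's mechanism instantiates the classical exponential mechanism with a sum-of-squares score function. As a minimal illustrative sketch of the generic exponential mechanism only (the SoS relaxation and Lipschitz extension are omitted, and all names here are hypothetical), sampling over a finite candidate set looks like:

```python
import numpy as np

def exponential_mechanism(candidates, score_fn, epsilon, sensitivity, rng=None):
    """Sample a candidate with probability proportional to
    exp(epsilon * score / (2 * sensitivity))."""
    rng = np.random.default_rng() if rng is None else rng
    scores = np.array([score_fn(c) for c in candidates], dtype=float)
    logits = epsilon * scores / (2.0 * sensitivity)
    logits -= logits.max()  # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]
```

Higher-scoring candidates are exponentially more likely to be selected; privacy follows because a single node change shifts each score by at most the stated sensitivity.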
Related papers
- Private Edge Density Estimation for Random Graphs: Optimal, Efficient and Robust [5.037313459134419]
We give the first differentially node-private and robust algorithm for estimating the edge density of Erdős–Rényi random graphs.
We prove information-theoretical lower bounds, showing that the error rate of our algorithm is optimal up to logarithmic factors.
arXiv Detail & Related papers (2024-05-26T18:59:44Z) - Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z) - Differentially-Private Hierarchical Clustering with Provable Approximation Guarantees [79.59010418610625]
We study differentially private approximation algorithms for hierarchical clustering.
We show strong lower bounds for the problem: any $\epsilon$-DP algorithm must exhibit $O(|V|^2/\epsilon)$-additive error for an input dataset.
We propose a private $1+o(1)$ approximation algorithm which also recovers the blocks exactly.
arXiv Detail & Related papers (2023-01-31T19:14:30Z) - Polynomial-Time Approximation of Zero-Free Partition Functions [9.153066211741356]
We give a polynomial-time algorithm for estimating classical and quantum partition functions specified by local Hamiltonians.
Our result is based on a new abstract framework that extends and generalizes the approach of Patel and Regts.
arXiv Detail & Related papers (2022-01-30T09:54:45Z) - Estimating leverage scores via rank revealing methods and randomization [50.591267188664666]
We study algorithms for estimating the statistical leverage scores of rectangular dense or sparse matrices of arbitrary rank.
Our approach is based on combining rank revealing methods with compositions of dense and sparse randomized dimensionality reduction transforms.
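For context, the statistical leverage score of a row of $A$ is the squared row norm of an orthonormal basis for $A$'s column space. The sketch below computes exact scores via a rank-aware SVD; it is an illustrative baseline, not the paper's randomized rank-revealing method, and the function name is hypothetical:

```python
import numpy as np

def leverage_scores(A, tol=1e-10):
    """Exact statistical leverage scores of A: squared row norms of
    an orthonormal basis for A's column space (rank-aware via SVD)."""
    U, s, _ = np.linalg.svd(A, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))  # numerical rank
    return np.sum(U[:, :r] ** 2, axis=1)
```

The scores always sum to the rank of $A$; randomized methods like those in the paper approximate them much faster for large matrices.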
arXiv Detail & Related papers (2021-05-23T19:21:55Z) - A Bayesian Approach to Block-Term Tensor Decomposition Model Selection and Computation [10.91885508254207]
The so-called block-term decomposition (BTD) tensor model, especially in its rank-$(L_r,L_r,1)$ version, has been recently receiving increasing attention.
The challenge of estimating the BTD model structure, namely the number of block terms and their individual ranks, has only recently started to attract significant attention.
A Bayesian approach is taken to addressing the problem of rank-$(L_r,L_r,1)$ BTD model selection and computation, based on the idea of imposing column sparsity jointly on the
arXiv Detail & Related papers (2021-01-08T09:37:21Z) - Constructing Segmented Differentiable Quadratics to Determine
Algorithmic Run Times and Model Non-Polynomial Functions [0.0]
We propose an approach to determine the continual progression of algorithmic efficiency.
The proposed method can effectively determine the run time behavior $F$ at any given index $x$.
arXiv Detail & Related papers (2020-12-03T00:22:49Z) - Slice Sampling for General Completely Random Measures [74.24975039689893]
We present a novel Markov chain Monte Carlo algorithm for posterior inference that adaptively sets the truncation level using auxiliary slice variables.
The efficacy of the proposed algorithm is evaluated on several popular nonparametric models.
arXiv Detail & Related papers (2020-06-24T17:53:53Z) - SURF: A Simple, Universal, Robust, Fast Distribution Learning Algorithm [64.13217062232874]
SURF is an algorithm for approximating distributions by piecewise polynomials.
It outperforms state-of-the-art algorithms in experiments.
arXiv Detail & Related papers (2020-02-22T01:03:33Z) - Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
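Randomized least-squares solvers of this kind replace the tall system with a much smaller sketched one. As a minimal sketch-and-solve illustration using a Gaussian embedding (the paper's methods are iterative and also use subsampled Hadamard transforms; the function name is hypothetical):

```python
import numpy as np

def sketched_lstsq(A, b, m, rng=None):
    """Sketch-and-solve least squares: minimize ||S(Ax - b)||
    where S is a random m x n Gaussian embedding, m << n rows of A."""
    rng = np.random.default_rng() if rng is None else rng
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x
```

When $b$ lies in the range of $A$ and the sketched matrix retains full column rank, this recovers the exact solution; in general it gives a constant-factor approximation that iterative preconditioned methods then refine.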
arXiv Detail & Related papers (2020-02-21T17:45:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.