Continual Release of Densest Subgraphs: Privacy Amplification & Sublinear Space via Subsampling
- URL: http://arxiv.org/abs/2510.11640v1
- Date: Mon, 13 Oct 2025 17:20:13 GMT
- Title: Continual Release of Densest Subgraphs: Privacy Amplification & Sublinear Space via Subsampling
- Authors: Felix Zhou
- Abstract summary: We study the sublinear space continual release model for edge-differentially private (DP) graph algorithms. Our main result is the first continual release DSG algorithm that matches the additive error of the best static DP algorithms. We introduce graph densification in the graph DP setting, adding edges to trigger earlier subsampling, which removes the extra logarithmic factors in error and space incurred by prior work.
- Score: 2.0607844057825324
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the sublinear space continual release model for edge-differentially private (DP) graph algorithms, with a focus on the densest subgraph problem (DSG) in the insertion-only setting. Our main result is the first continual release DSG algorithm that matches the additive error of the best static DP algorithms and the space complexity of the best non-private streaming algorithms, up to constants. The key idea is a refined use of subsampling that simultaneously achieves privacy amplification and sparsification, a connection not previously formalized in graph DP. Via a simple black-box reduction to the static setting, we obtain both pure and approximate-DP algorithms with $O(\log n)$ additive error and $O(n\log n)$ space, improving both accuracy and space complexity over the previous state of the art. Along the way, we introduce graph densification in the graph DP setting, adding edges to trigger earlier subsampling, which removes the extra logarithmic factors in error and space incurred by prior work [ELMZ25]. We believe this simple idea may be of independent interest.
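The key idea in the abstract, subsampling that simultaneously sparsifies the graph and amplifies privacy, can be illustrated with a minimal sketch. The function names (`subsample_edges`, `density`) and the specific structure below are illustrative assumptions, not the paper's actual algorithm: each edge of the stream is kept independently with probability p, so the retained subgraph has about p*m edges (sparsification), and standard amplification-by-subsampling results suggest that a DP algorithm run on the subsample enjoys a roughly p-fold smaller effective privacy parameter.

```python
import random

def subsample_edges(edge_stream, p, seed=0):
    """Bernoulli edge subsampling: keep each edge independently w.p. p.

    Sparsification: the subsample has ~p * m edges in expectation.
    Privacy amplification (hedged sketch, not the paper's statement):
    an edge-DP algorithm run on the subsample typically satisfies DP
    with a privacy parameter shrunk by roughly a factor of p.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility in this sketch
    return [e for e in edge_stream if rng.random() < p]

def density(edges, vertices):
    """Density of the subgraph induced by `vertices`: |E(S)| / |S|."""
    s = set(vertices)
    kept = [(u, v) for (u, v) in edges if u in s and v in s]
    return len(kept) / max(len(s), 1)

# Example: a triangle {1,2,3} plus a pendant edge (3,4).
edges = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(density(edges, [1, 2, 3]))  # the triangle has density 3/3 = 1.0
```

In the continual release setting, such a subsample would be maintained as edges arrive, with the static DP densest-subgraph algorithm invoked as a black box on the sparsified graph.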
Related papers
- Spectral Graph Clustering under Differential Privacy: Balancing Privacy, Accuracy, and Efficiency [53.98433419539793]
We study the problem of spectral graph clustering under edge differential privacy (DP). Specifically, we develop three mechanisms: (i) graph perturbation via randomized edge flipping combined with adjacency matrix shuffling, which enforces edge privacy; (ii) private graph projection with additive Gaussian noise in a lower-dimensional space to reduce dimensionality and computational complexity; and (iii) a noisy power iteration method that distributes Gaussian noise across iterations to ensure edge DP while maintaining convergence.
arXiv Detail & Related papers (2025-10-08T15:30:27Z) - On the Price of Differential Privacy for Hierarchical Clustering [21.64775530337936]
We present a novel algorithm in the weight privacy model that shows significantly better approximation than known impossibility results in the edge-level DP setting. Our results show that our algorithm performs well in terms of extra cost and has good scalability to large graphs.
arXiv Detail & Related papers (2025-04-22T04:39:40Z) - Differentially Private Bilevel Optimization [4.07926531936425]
We present differentially private (DP) algorithms for bilevel optimization, a problem class that has received significant attention lately in various machine learning applications. Our analysis covers constrained and unconstrained problems alike, and accounts for mini-batch gradients and population losses.
arXiv Detail & Related papers (2024-09-29T21:52:38Z) - Sublinear Space Graph Algorithms in the Continual Release Model [48.65946341463292]
We make novel use of sparsification techniques from the non-private streaming and static algorithms literature to achieve new results in the sublinear space, continual release setting. This includes algorithms for densest subgraph, maximum matching, and the first continual release $k$-core decomposition algorithm.
arXiv Detail & Related papers (2024-07-24T20:15:07Z) - Optimal Rates for $O(1)$-Smooth DP-SCO with a Single Epoch and Large Batches [12.184984662899868]
We revisit the differentially private stochastic convex optimization (DP-SCO) problem.
We propose an algorithm that achieves the optimal rate for DP-SCO (up to polylog factors) in a single epoch.
arXiv Detail & Related papers (2024-06-04T18:59:42Z) - Anonymized Histograms in Intermediate Privacy Models [54.32252900997422]
We provide an algorithm with a nearly matching error guarantee of $\tilde{O}_\varepsilon(\sqrt{n})$ in the shuffle DP and pan-private models.
Our algorithm is very simple: it just post-processes the discrete Laplace-noised histogram.
arXiv Detail & Related papers (2022-10-27T05:11:00Z) - Private Online Prediction from Experts: Separations and Faster Rates [74.52487417350221]
Online prediction from experts is a fundamental problem in machine learning and several works have studied this problem under privacy constraints.
We propose and analyze new algorithms for this problem that improve over the regret bounds of the best existing algorithms for non-adaptive adversaries.
arXiv Detail & Related papers (2022-10-24T18:40:19Z) - Normalized/Clipped SGD with Perturbation for Differentially Private Non-Convex Optimization [94.06564567766475]
DP-SGD and DP-NSGD mitigate the risk of large models memorizing sensitive training data.
We show that these two algorithms achieve similar best accuracy while DP-NSGD is comparatively easier to tune than DP-SGD.
arXiv Detail & Related papers (2022-06-27T03:45:02Z) - Private Stochastic Non-Convex Optimization: Adaptive Algorithms and Tighter Generalization Bounds [72.63031036770425]
We propose differentially private (DP) algorithms for stochastic non-convex optimization. We demonstrate the empirical advantages of our algorithms over standard gradient methods on two popular deep learning tasks.
arXiv Detail & Related papers (2020-06-24T06:01:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.