Practical and Accurate Local Edge Differentially Private Graph Algorithms
- URL: http://arxiv.org/abs/2506.20828v2
- Date: Fri, 27 Jun 2025 03:23:30 GMT
- Title: Practical and Accurate Local Edge Differentially Private Graph Algorithms
- Authors: Pranay Mundra, Charalampos Papamanthou, Julian Shun, Quanquan C. Liu
- Abstract summary: We introduce novel LDP algorithms for two fundamental graph statistics: k-core decomposition and triangle counting. Our approach leverages input-dependent private graph properties, specifically the degeneracy and maximum degree of the graph, to improve theoretical utility. Our k-core decomposition achieves errors within 3x of exact values, far outperforming the 131x error in the baseline of Dhulipala et al.
- Score: 11.593320678280824
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The rise of massive networks across diverse domains necessitates sophisticated graph analytics, often involving sensitive data and raising privacy concerns. This paper addresses these challenges using local differential privacy (LDP), which enforces privacy at the individual level, where no third-party entity is trusted, unlike centralized models that assume a trusted curator. We introduce novel LDP algorithms for two fundamental graph statistics: k-core decomposition and triangle counting. Our approach leverages input-dependent private graph properties, specifically the degeneracy and maximum degree of the graph, to improve theoretical utility. Unlike prior methods, our error bounds are determined by the maximum degree rather than the total number of edges, resulting in significantly tighter guarantees. For triangle counting, we improve upon the work of Imola, Murakami, and Chaudhury [USENIX Security `21, `22], which bounds error in terms of edge count. Instead, our algorithm achieves bounds based on graph degeneracy by leveraging a private out-degree orientation, a refined variant of Eden et al.'s randomized response technique [ICALP `23], and a novel analysis, yielding stronger guarantees than prior work. Beyond theoretical gains, we are the first to evaluate local DP algorithms in a distributed simulation, unlike prior work tested on a single processor. Experiments on real-world graphs show substantial accuracy gains: our k-core decomposition achieves errors within 3x of exact values, far outperforming the 131x error in the baseline of Dhulipala et al. [FOCS `22]. Our triangle counting algorithm reduces multiplicative approximation errors by up to six orders of magnitude, while maintaining competitive runtime.
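As a rough illustration of the local-model primitive referenced in the abstract, the sketch below applies Warner's randomized response to each user's adjacency bits and debiases the responses at the untrusted aggregator. This is a minimal, generic epsilon-local edge DP sketch with assumed function names and parameters; it is not the authors' refined randomized-response variant, their private out-degree orientation, or their degeneracy-based analysis.

```python
import math
import random


def randomized_response_bit(bit: int, epsilon: float) -> int:
    """Warner's randomized response on a single adjacency bit.

    The true bit is reported with probability p = e^eps / (e^eps + 1),
    which satisfies eps-local edge DP for that edge indicator.
    """
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_keep else 1 - bit


def privatize_adjacency_row(neighbors, n, epsilon):
    """Each vertex perturbs its own adjacency row locally, so the untrusted
    aggregator only ever sees noisy bits (no trusted curator is needed)."""
    return [
        randomized_response_bit(1 if v in neighbors else 0, epsilon)
        for v in range(n)
    ]


def debias_edge_estimate(noisy_bit, epsilon):
    """Unbiased estimate of a true bit from one noisy response:
    E[noisy] = p*bit + (1 - p)*(1 - bit), so bit_hat = (noisy - (1 - p)) / (2p - 1)."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return (noisy_bit - (1.0 - p)) / (2.0 * p - 1.0)


if __name__ == "__main__":
    # Vertex 0 of a 5-vertex graph whose true neighbors are {1, 3}.
    noisy_row = privatize_adjacency_row({1, 3}, n=5, epsilon=1.0)
    estimates = [round(debias_edge_estimate(b, 1.0), 2) for b in noisy_row]
    print(noisy_row, estimates)
```

Roughly speaking, estimating counts such as triangles from such debiased noisy bits accumulates variance over all candidate edges, which is why error bounds tied to the maximum degree or degeneracy, as in the paper above, are tighter than bounds tied to the total number of edges.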
Related papers
- Local Distance Query with Differential Privacy [37.18488513802541]
Distance, a critical factor in graph analysis, is typically handled using curator DP. In many real-world scenarios, such a curator may not be present, posing a significant challenge for implementing differentially private distance queries under Local Differential Privacy (LDP).
arXiv Detail & Related papers (2025-08-07T15:48:35Z) - Benchmarking Fraud Detectors on Private Graph Data [70.4654745317714]
Currently, many types of fraud are managed in part by automated detection algorithms that operate over graphs. We consider the scenario where a data holder wishes to outsource development of fraud detectors to third parties. Third parties submit their fraud detectors to the data holder, who evaluates these algorithms on a private dataset and then publicly communicates the results. We propose a realistic privacy attack on this system that allows an adversary to de-anonymize individuals' data based only on the evaluation results.
arXiv Detail & Related papers (2025-07-30T03:20:15Z) - Pruning Spurious Subgraphs for Graph Out-of-Distribution Generalization [90.32406785945223]
We propose PrunE, the first pruning-based graph OOD method that eliminates spurious edges to improve OOD generalizability. PrunE employs two regularization terms to prune spurious edges: 1) a graph size constraint to exclude uninformative spurious edges, and 2) $\epsilon$-probability alignment to further suppress the occurrence of spurious edges.
arXiv Detail & Related papers (2025-06-06T10:34:48Z) - On the Price of Differential Privacy for Hierarchical Clustering [21.64775530337936]
We present a novel algorithm in the weight privacy model that shows significantly better approximation than known impossibility results in the edge-level DP setting. Our results show that our algorithm performs well in terms of extra cost and has good scalability to large graphs.
arXiv Detail & Related papers (2025-04-22T04:39:40Z) - Graph-Dependent Regret Bounds in Multi-Armed Bandits with Interference [18.101059380671582]
We study multi-armed bandits under network interference. This induces an exponentially large action space. We propose a novel algorithm that uses the local graph structure to minimize regret.
arXiv Detail & Related papers (2025-03-10T17:25:24Z) - Sublinear Space Graph Algorithms in the Continual Release Model [48.65946341463292]
We make novel use of sparsification techniques from the non-private streaming and static algorithms literature to achieve new results in the sublinear-space, continual release setting. This includes algorithms for densest subgraph, maximum matching, and the first continual release $k$-core decomposition algorithm.
arXiv Detail & Related papers (2024-07-24T20:15:07Z) - Private Edge Density Estimation for Random Graphs: Optimal, Efficient and Robust [5.037313459134419]
We give the first polynomial-time, differentially node-private, and robust algorithm for estimating the edge density of Erdős-Rényi random graphs.
We prove information-theoretical lower bounds, showing that the error rate of our algorithm is optimal up to logarithmic factors.
arXiv Detail & Related papers (2024-05-26T18:59:44Z) - Differentially Private Domain Adaptation with Theoretical Guarantees [46.37771025567305]
In many applications, the labeled data at the learner's disposal is subject to privacy constraints and is relatively limited.
This is the modern problem of supervised domain adaptation from a public source to a private target domain.
We make use of a general learner to benefit from favorable theoretical learning guarantees.
arXiv Detail & Related papers (2023-06-15T04:03:06Z) - Normalized/Clipped SGD with Perturbation for Differentially Private Non-Convex Optimization [94.06564567766475]
DP-SGD and DP-NSGD mitigate the risk of large models memorizing sensitive training data.
We show that these two algorithms achieve similar best accuracy while DP-NSGD is comparatively easier to tune than DP-SGD.
arXiv Detail & Related papers (2022-06-27T03:45:02Z) - Large-Scale Methods for Distributionally Robust Optimization [53.98643772533416]
We prove that our algorithms require a number of gradient evaluations independent of the training set size and the number of parameters.
Experiments on MNIST and ImageNet confirm the theoretical scaling of our algorithms, which are 9--36 times more efficient than full-batch methods.
arXiv Detail & Related papers (2020-10-12T17:41:44Z) - Private Stochastic Non-Convex Optimization: Adaptive Algorithms and Tighter Generalization Bounds [72.63031036770425]
We propose differentially private (DP) algorithms for stochastic non-convex optimization.
We demonstrate the empirical advantages of our methods over standard gradient methods on two popular deep learning tasks.
arXiv Detail & Related papers (2020-06-24T06:01:24Z)