Towards Sparse Federated Analytics: Location Heatmaps under Distributed
Differential Privacy with Secure Aggregation
- URL: http://arxiv.org/abs/2111.02356v1
- Date: Wed, 3 Nov 2021 17:19:05 GMT
- Title: Towards Sparse Federated Analytics: Location Heatmaps under Distributed
Differential Privacy with Secure Aggregation
- Authors: Eugene Bagdasaryan, Peter Kairouz, Stefan Mellem, Adrià Gascón,
Kallista Bonawitz, Deborah Estrin and Marco Gruteser
- Abstract summary: We design a scalable algorithm to privately generate location heatmaps over decentralized data from millions of user devices.
It aims to ensure differential privacy before data becomes visible to a service provider while maintaining high data accuracy and minimizing resource consumption on users' devices.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We design a scalable algorithm to privately generate location heatmaps over
decentralized data from millions of user devices. It aims to ensure
differential privacy before data becomes visible to a service provider while
maintaining high data accuracy and minimizing resource consumption on users'
devices. To achieve this, we revisit the distributed differential privacy
concept based on recent results in the secure multiparty computation field and
design a scalable and adaptive distributed differential privacy approach for
location analytics. Evaluation on public location datasets shows that this
approach successfully generates metropolitan-scale heatmaps from millions of
user samples with a worst-case client communication overhead that is
significantly smaller than existing state-of-the-art private protocols of
similar accuracy.
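The core idea the abstract describes, adding a small amount of noise on each device so that only the securely aggregated sum carries the full differential-privacy noise, can be illustrated with a minimal sketch. All names and parameters below are illustrative, not the paper's actual algorithm: real protocols use discrete noise and modular arithmetic inside a cryptographic secure-aggregation protocol, which is simulated here by a plain sum.

```python
import numpy as np

def make_client_report(location_idx, grid_size, sigma_per_client, rng):
    """Hypothetical client-side report: a one-hot location histogram plus
    local Gaussian noise. With n clients each adding noise of standard
    deviation sigma_total / sqrt(n), the *sum* of reports carries noise of
    standard deviation sigma_total -- the distributed-DP idea."""
    report = np.zeros(grid_size)
    report[location_idx] = 1.0
    report += rng.normal(0.0, sigma_per_client, size=grid_size)
    return report

def secure_aggregate(reports):
    """Stand-in for secure aggregation: the server learns only the sum of
    the client reports, never any individual (noisy) report."""
    return np.sum(reports, axis=0)

rng = np.random.default_rng(0)
n_clients, grid_size, sigma_total = 1000, 16, 4.0
sigma_per_client = sigma_total / np.sqrt(n_clients)
locations = rng.integers(0, grid_size, size=n_clients)
reports = [make_client_report(loc, grid_size, sigma_per_client, rng)
           for loc in locations]
heatmap = secure_aggregate(reports)  # noisy count per grid cell
```

Because each client's own noise is far smaller than the aggregate noise, per-device reports stay accurate enough to be useful while the server-visible sum still satisfies a central-DP-style guarantee.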
Related papers
- Learning from End User Data with Shuffled Differential Privacy over Kernel Densities [10.797515094684318]
We study a setting of collecting and learning from private data distributed across end users.
In the shuffled model of differential privacy, the end users partially protect their data locally before sharing it, and their data is also anonymized during its collection to enhance privacy.
Our main technical result is a shuffled DP protocol for privately estimating the kernel density function of a distributed dataset.
arXiv Detail & Related papers (2025-02-19T20:27:01Z) - Characterizing the Accuracy-Communication-Privacy Trade-off in Distributed Stochastic Convex Optimization [30.45012237666837]
We consider the problem of differentially private convex optimization (DP-SCO) in a distributed setting with $M$ clients.
The objective is to design an algorithm to minimize a convex population loss using a collaborative effort across $M$ clients.
We establish matching converse and achievability results using a novel lower bound and a new algorithm for distributed DP-SCO.
arXiv Detail & Related papers (2025-01-06T18:57:05Z) - Enhanced Privacy Bound for Shuffle Model with Personalized Privacy [32.08637708405314]
The shuffle model of Differential Privacy (DP) introduces an intermediate trusted shuffler between local users and a central data curator.
It significantly amplifies the central DP guarantee by anonymizing and shuffling the local randomized data.
This work focuses on deriving the central privacy bound for a more practical setting where personalized local privacy is required by each user.
arXiv Detail & Related papers (2024-07-25T16:11:56Z) - PeFAD: A Parameter-Efficient Federated Framework for Time Series Anomaly Detection [51.20479454379662]
We propose a parameter-efficient Federated Anomaly Detection framework named PeFAD to address increasing privacy concerns.
We conduct extensive evaluations on four real datasets, where PeFAD outperforms existing state-of-the-art baselines by up to 28.74%.
arXiv Detail & Related papers (2024-06-04T13:51:08Z) - Measuring Privacy Loss in Distributed Spatio-Temporal Data [26.891854386652266]
We propose an alternative privacy loss measure against location reconstruction attacks by an informed adversary.
Our experiments on real and synthetic data demonstrate that our privacy loss better reflects our intuitions on individual privacy violation in the distributed setting.
arXiv Detail & Related papers (2024-02-18T09:53:14Z) - Data Analytics with Differential Privacy [0.0]
We develop differentially private algorithms to analyze distributed and streaming data.
In the distributed model, we consider the particular problem of learning -- in a distributed fashion -- a global model of the data.
We offer one of the strongest privacy guarantees for the streaming model, user-level pan-privacy.
arXiv Detail & Related papers (2023-07-20T17:43:29Z) - Private Set Generation with Discriminative Information [63.851085173614]
Differentially private data generation is a promising solution to the data privacy challenge.
Existing private generative models struggle with the utility of synthetic samples.
We introduce a simple yet effective method that greatly improves the sample utility of state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-07T10:02:55Z) - Smooth Anonymity for Sparse Graphs [69.1048938123063]
Differential privacy has emerged as the gold standard of privacy; however, it faces challenges when it comes to sharing sparse datasets.
In this work, we consider a variation of $k$-anonymity, which we call smooth-$k$-anonymity, and design simple large-scale algorithms that efficiently provide smooth-$k$-anonymity.
arXiv Detail & Related papers (2022-07-13T17:09:25Z) - Decentralized Stochastic Optimization with Inherent Privacy Protection [103.62463469366557]
Decentralized optimization is the basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing.
Since the involved data are often sensitive, privacy protection has become an increasingly pressing need in the implementation of decentralized optimization algorithms.
arXiv Detail & Related papers (2022-05-08T14:38:23Z) - Mixed Differential Privacy in Computer Vision [133.68363478737058]
AdaMix is an adaptive differentially private algorithm for training deep neural network classifiers using both private and public image data.
A few-shot or even zero-shot learning baseline that ignores private data can outperform fine-tuning on a large private dataset.
arXiv Detail & Related papers (2022-03-22T06:15:43Z) - Graph-Homomorphic Perturbations for Private Decentralized Learning [64.26238893241322]
Local exchange of estimates allows adversaries to infer private data. Existing schemes add perturbations chosen independently at every agent, resulting in a significant performance loss.
We propose an alternative scheme, which constructs perturbations according to a particular nullspace condition, allowing them to cancel out across the network.
arXiv Detail & Related papers (2020-10-23T10:35:35Z)
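Several of the papers above rely on the shuffle model of DP: each user randomizes locally, a shuffler strips order and identity, and the anonymity amplifies the weak local guarantee into a much stronger central one. A minimal sketch using k-ary randomized response (all names and parameters are illustrative, not taken from any of the listed papers):

```python
import math
import random

def local_randomizer(value, k, eps, rng):
    # k-ary randomized response: keep the true category with probability p,
    # otherwise report a category drawn uniformly from all k categories.
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    return value if rng.random() < p else rng.randrange(k)

def shuffled_histogram(values, k, eps, rng):
    # Each user randomizes locally; the shuffler discards order (amplifying
    # the local eps into a stronger central guarantee); the analyzer then
    # debiases the aggregated counts.
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    reports = [local_randomizer(v, k, eps, rng) for v in values]
    rng.shuffle(reports)  # the shuffler: anonymize by permuting
    n = len(values)
    counts = [0] * k
    for r in reports:
        counts[r] += 1
    # Debias using E[count_j] = p * true_j + (1 - p) * n / k
    return [(c - (1 - p) * n / k) / p for c in counts]

rng = random.Random(0)
values = [0] * 8000 + [1] * 2000   # true histogram: [8000, 2000, 0, 0]
est = shuffled_histogram(values, k=4, eps=2.0, rng=rng)
```

The debiased estimates are unbiased, and their error shrinks relative to the counts as the population grows, which is why shuffle-model protocols are attractive for large-scale distributed analytics.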
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.