Local Graph-homomorphic Processing for Privatized Distributed Systems
- URL: http://arxiv.org/abs/2210.15414v1
- Date: Wed, 26 Oct 2022 10:00:14 GMT
- Title: Local Graph-homomorphic Processing for Privatized Distributed Systems
- Authors: Elsa Rizk, Stefan Vlaski, Ali H. Sayed
- Abstract summary: We show that the added noise does not affect the performance of the learned model.
This is a significant improvement over previous work on differential privacy for distributed algorithms.
- Score: 57.14673504239551
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the generation of dependent random numbers in a distributed fashion
in order to enable privatized distributed learning by networked agents. We
propose a method that we refer to as local graph-homomorphic processing; it
relies on the construction of particular noises over the edges to ensure a
certain level of differential privacy. We show that the added noise does not
affect the performance of the learned model. This is a significant improvement
over previous work on differential privacy for distributed algorithms, where
the noise was added in a less structured manner, without respecting the graph
topology, and often led to performance deterioration. We illustrate the
theoretical results by considering a linear regression problem over a network
of agents.
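The cancellation idea in the abstract lends itself to a short illustration. Below is a minimal sketch, not the paper's actual construction: it assumes each edge's two endpoints share one noise sample and apply it with opposite signs, so every transmitted model is perturbed while the network-wide average, and hence the aggregated model, is untouched. The function name pairwise_edge_noise and the choice of Laplace noise are illustrative assumptions.
```python
import numpy as np

def pairwise_edge_noise(adjacency, dim, scale, rng):
    """Draw one noise vector per edge and assign it with opposite signs to
    the edge's two endpoints, so the noise sums to zero over the network."""
    n = adjacency.shape[0]
    noise = np.zeros((n, dim))
    for k in range(n):
        for l in range(k + 1, n):
            if adjacency[k, l]:
                g = rng.laplace(0.0, scale, size=dim)  # Laplace noise, a common DP choice
                noise[k] += g   # agent k perturbs its outgoing model with +g
                noise[l] -= g   # its neighbor applies the matching -g
    return noise

# Toy illustration: three fully connected agents holding local regression weights.
rng = np.random.default_rng(0)
adjacency = np.array([[0, 1, 1],
                      [1, 0, 1],
                      [1, 1, 0]])
models = rng.normal(size=(3, 2))            # local weight vectors
g = pairwise_edge_noise(adjacency, dim=2, scale=1.0, rng=rng)
# Each shared message models[k] + g[k] is privatized, yet the network
# average is unchanged because the edge noises cancel pairwise:
assert np.allclose((models + g).mean(axis=0), models.mean(axis=0))
```
In this sketch, privacy comes from the fact that any single intercepted message reveals only a noisy model, while aggregation over the full graph recovers the exact average.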
Related papers
- Robust Domain Generalisation with Causal Invariant Bayesian Neural Networks [9.999199798941424]
We propose a Bayesian neural architecture that disentangles the learning of the data distribution from the inference process.
We show theoretically and experimentally that our model approximates reasoning under causal interventions.
arXiv Detail & Related papers (2024-10-08T20:38:05Z)
- Edge-preserving noise for diffusion models [4.435514696080208]
We present a novel edge-preserving diffusion model that is a generalization of denoising diffusion probabilistic models (DDPM).
In particular, we introduce an edge-aware noise scheduler that varies between edge-preserving and isotropic Gaussian noise.
We show that our model's generative process converges faster to results that more closely match the target distribution.
arXiv Detail & Related papers (2024-10-02T13:29:52Z)
- Learning Curves for Noisy Heterogeneous Feature-Subsampled Ridge Ensembles [34.32021888691789]
We develop a theory of feature-bagging in noisy least-squares ridge ensembles.
We demonstrate that subsampling shifts the double-descent peak of a linear predictor.
We compare the performance of a feature-subsampling ensemble to a single linear predictor.
arXiv Detail & Related papers (2023-07-06T17:56:06Z)
- Strategic Distribution Shift of Interacting Agents via Coupled Gradient Flows [6.064702468344376]
We propose a novel framework for analyzing the dynamics of distribution shift in real-world systems.
We show that our approach captures well-documented forms of distribution shifts like polarization and disparate impacts that simpler models cannot capture.
arXiv Detail & Related papers (2023-07-03T17:18:50Z)
- Learning Linear Causal Representations from Interventions under General Nonlinear Mixing [52.66151568785088]
We prove strong identifiability results given unknown single-node interventions without access to the intervention targets.
This is the first instance of causal identifiability from non-paired interventions for deep neural network embeddings.
arXiv Detail & Related papers (2023-06-04T02:32:12Z)
- Enforcing Privacy in Distributed Learning with Performance Guarantees [57.14673504239551]
We study the privatization of distributed learning and optimization strategies.
We show that the popular additive random perturbation scheme degrades performance because it is not well-tuned to the graph structure.
arXiv Detail & Related papers (2023-01-16T13:03:27Z)
- DiGress: Discrete Denoising diffusion for graph generation [79.13904438217592]
DiGress is a discrete denoising diffusion model for generating graphs with categorical node and edge attributes.
It achieves state-of-the-art performance on molecular and non-molecular datasets, with up to 3x validity improvement.
It is also the first model to scale to the large GuacaMol dataset containing 1.3M drug-like molecules.
arXiv Detail & Related papers (2022-09-29T12:55:03Z)
- Invariant Causal Mechanisms through Distribution Matching [86.07327840293894]
In this work we provide a causal perspective and a new algorithm for learning invariant representations.
Empirically, we show that this algorithm works well on a diverse set of tasks; in particular, we observe state-of-the-art performance on domain generalization.
arXiv Detail & Related papers (2022-06-23T12:06:54Z)
- Scaling Structured Inference with Randomization [64.18063627155128]
We propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation, so it can be integrated seamlessly with neural networks.
arXiv Detail & Related papers (2021-12-07T11:26:41Z)
- Learning Node Representations from Noisy Graph Structures [38.32421350245066]
Noise prevails in real-world networks and can compromise them to a large extent.
We propose a novel framework that simultaneously learns noise-free node representations and eliminates noise.
arXiv Detail & Related papers (2020-12-04T07:18:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.