Differentially Private Gradient Flow based on the Sliced Wasserstein
Distance for Non-Parametric Generative Modeling
- URL: http://arxiv.org/abs/2312.08227v1
- Date: Wed, 13 Dec 2023 15:47:30 GMT
- Title: Differentially Private Gradient Flow based on the Sliced Wasserstein
Distance for Non-Parametric Generative Modeling
- Authors: Ilana Sebag, Muni Sreenivas PYDI, Jean-Yves Franceschi, Alain
Rakotomamonjy, Mike Gartrell, Jamal Atif, Alexandre Allauzen
- Abstract summary: We introduce a novel differentially private generative modeling approach based on parameter-free gradient flows in the space of probability measures.
Our experiments show that compared to a generator-based model, our proposed model can generate higher-fidelity data at a low privacy budget.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Safeguarding privacy in sensitive training data is paramount, particularly in
the context of generative modeling. This is typically achieved either through
differentially private stochastic gradient descent (DP-SGD) or through a
differentially private metric for training models or generators. In this paper, we introduce a novel
differentially private generative modeling approach based on parameter-free
gradient flows in the space of probability measures. The proposed algorithm is
a new discretized flow which operates through a particle scheme, utilizing
drift derived from the sliced Wasserstein distance and computed in a private
manner. Our experiments show that compared to a generator-based model, our
proposed model can generate higher-fidelity data at a low privacy budget,
offering a viable alternative to generator-based approaches.
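The paper's exact privatization and discretization scheme is not reproduced here; as a minimal illustrative sketch of the general idea, the following combines a Monte Carlo sliced-Wasserstein drift over particles with per-particle clipping and Gaussian noise. All function names, parameters, and the clipping-based privatization are hypothetical simplifications, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

def sliced_wasserstein_drift(particles, data, n_proj=50):
    # Monte Carlo estimate of the sliced-Wasserstein drift on the particles:
    # average 1D optimal-transport displacements over random projection
    # directions. Assumes len(particles) == len(data).
    d = particles.shape[1]
    drift = np.zeros_like(particles)
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)          # random unit direction
        p, q = particles @ theta, data @ theta  # 1D projections
        order = np.argsort(p)
        # in 1D, optimal transport matches sorted samples to sorted samples
        drift[order] += np.outer(p[order] - np.sort(q), theta)
    return drift / n_proj

def dp_flow_step(particles, data, lr=0.5, clip=1.0, sigma=0.1):
    # One noisy particle update: clip each per-particle drift to bound its
    # magnitude, then add Gaussian noise before the descent step.
    g = sliced_wasserstein_drift(particles, data)
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / clip)
    g = g + rng.normal(scale=sigma * clip, size=g.shape)
    return particles - lr * g

# toy run: particles flow from N(0, I) toward data centered at (3, 3)
data = rng.normal(loc=3.0, size=(200, 2))
particles = rng.normal(size=(200, 2))
for _ in range(100):
    particles = dp_flow_step(particles, data)
```

The sketch relies on the fact that 1D Wasserstein transport reduces to sorting, which is what makes the sliced distance cheap to estimate; a real DP analysis would calibrate the noise to the drift's sensitivity in the data.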
Related papers
- Discrete Flow Matching [74.04153927689313]
We present a novel discrete flow paradigm designed specifically for generating discrete data.
Our approach is capable of generating high-quality discrete data in a non-autoregressive fashion.
arXiv Detail & Related papers (2024-07-22T12:33:27Z)
- On the Computational Complexity of Private High-dimensional Model Selection [18.964255744068122]
We consider the problem of model selection in a high-dimensional sparse linear regression under privacy constraints.
We propose a differentially private best subset selection method with strong utility properties by adopting a well-known exponential model.
arXiv Detail & Related papers (2023-10-11T19:53:15Z)
- Differentially Private Statistical Inference through $\beta$-Divergence One Posterior Sampling [2.8544822698499255]
We propose a posterior sampling scheme from a generalised posterior targeting the minimisation of the $\beta$-divergence between the model and the data generating process.
This provides private estimation that is generally applicable without requiring changes to the underlying model.
We show that $\beta$D-Bayes produces more precise inference estimation for the same privacy guarantees.
arXiv Detail & Related papers (2023-07-11T12:00:15Z)
- Differentiating Metropolis-Hastings to Optimize Intractable Densities [51.16801956665228]
We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers.
We apply gradient-based optimization to objectives expressed as expectations over intractable target densities.
arXiv Detail & Related papers (2023-06-13T17:56:02Z)
- Learning Differentially Private Probabilistic Models for Privacy-Preserving Image Generation [67.47979276739144]
We propose learning differentially private probabilistic models to generate high-resolution images with differential privacy guarantee.
Our approach can generate images up to 256x256 with remarkable visual quality and data utility.
arXiv Detail & Related papers (2023-05-18T02:51:17Z)
- Network Generation with Differential Privacy [4.297070083645049]
We consider the problem of generating private synthetic versions of real-world graphs containing private information.
We propose a generative model that can reproduce the properties of real-world networks while maintaining edge-differential privacy.
arXiv Detail & Related papers (2021-11-17T13:07:09Z)
- Don't Generate Me: Training Differentially Private Generative Models with Sinkhorn Divergence [73.14373832423156]
We propose DP-Sinkhorn, a novel optimal transport-based generative method for learning data distributions from private data with differential privacy.
Unlike existing approaches for training differentially private generative models, we do not rely on adversarial objectives.
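DP-Sinkhorn's training loop and privacy mechanism are not reproduced here; purely as background on the optimal-transport machinery it builds on, here is a minimal sketch of the Sinkhorn fixed-point iterations for the entropic OT cost between two empirical measures (all parameter values are illustrative, and log-domain stabilization is omitted):

```python
import numpy as np

def sinkhorn_cost(x, y, eps=0.5, n_iter=300):
    # Entropic-regularized OT cost between two uniform empirical measures,
    # computed via Sinkhorn fixed-point iterations.
    n, m = len(x), len(y)
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared distances
    K = np.exp(-C / eps)                                       # Gibbs kernel
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):
        u = a / (K @ v)        # rescale to match row marginals
        v = b / (K.T @ u)      # rescale to match column marginals
    P = u[:, None] * K * v[None, :]                            # transport plan
    return float(np.sum(P * C))

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
cost_same = sinkhorn_cost(x, x)
cost_shift = sinkhorn_cost(x, x + np.array([2.0, 0.0]))  # true W2^2 is about 4
```

The Sinkhorn divergence used in such methods debiases this entropic cost; the sketch only shows the core iterations that make the objective differentiable and cheap to evaluate.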
arXiv Detail & Related papers (2021-11-01T18:10:21Z)
- High-Dimensional Differentially-Private EM Algorithm: Methods and Near-Optimal Statistical Guarantees [8.089708900273804]
We develop a general framework to design differentially private expectation-maximization (EM) algorithms in high-dimensional latent variable models.
In each model, we establish the near-optimal rate of convergence with differential privacy constraints.
We propose a near rate-optimal EM algorithm with differential privacy guarantees in this setting.
arXiv Detail & Related papers (2021-04-01T04:08:34Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions that evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
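The conditional generative models above are not reproduced here; as a minimal hedged sketch of directly maximizing an expected reward over discrete outputs, the following implements a score-function (REINFORCE) update for a single categorical distribution. The toy reward and all names are hypothetical, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def reinforce_step(logits, reward_fn, lr=0.1, batch=64):
    # One REINFORCE update for a categorical distribution over tokens:
    # estimate the gradient of E[reward] with the score-function trick,
    # using the batch-mean reward as a variance-reduction baseline.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    samples = rng.choice(len(logits), size=batch, p=probs)
    rewards = np.array([reward_fn(s) for s in samples])
    baseline = rewards.mean()
    grad = np.zeros_like(logits)
    for s, r in zip(samples, rewards):
        score = -probs
        score[s] += 1.0                 # d log p(s) / d logits for softmax
        grad += (r - baseline) * score
    return logits + lr * grad / batch   # gradient *ascent* on the reward

# toy reward: prefer token 2 out of a 5-token vocabulary
logits = np.zeros(5)
for _ in range(200):
    logits = reinforce_step(logits, lambda s: 1.0 if s == 2 else 0.0)
```

The same estimator extends to sequences of tokens by summing per-step log-probability gradients, which is what makes reward maximization possible for discrete structures where the sampling step is not differentiable.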
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.