d3p -- A Python Package for Differentially-Private Probabilistic
Programming
- URL: http://arxiv.org/abs/2103.11648v1
- Date: Mon, 22 Mar 2021 08:15:58 GMT
- Title: d3p -- A Python Package for Differentially-Private Probabilistic
Programming
- Authors: Lukas Prediger, Niki Loppi, Samuel Kaski, Antti Honkela
- Abstract summary: We present d3p, a software package designed to support runtime-efficient Bayesian inference under differential privacy guarantees.
d3p achieves general applicability to a wide range of probabilistic modelling problems by implementing the differentially private variational inference algorithm.
- Score: 16.91407609966877
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We present d3p, a software package designed to support runtime-efficient,
widely applicable Bayesian inference under differential privacy
guarantees. d3p achieves general applicability to a wide range of probabilistic
modelling problems by implementing the differentially private variational
inference algorithm, allowing users to fit any parametric probabilistic model
with a differentiable density function. d3p adopts the probabilistic
programming paradigm as a powerful way for the user to flexibly define such
models. We demonstrate the use of our software on a hierarchical logistic
regression example, showing the expressiveness of the modelling approach as
well as the ease of running the parameter inference. We also perform an
empirical evaluation of the runtime of the private inference on a complex model
and find an approximately 10-fold speed-up compared to an implementation using
TensorFlow Privacy.
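The privacy mechanism at the heart of differentially private variational inference is the same as in DP-SGD: clip each per-example gradient to bound its influence, sum, and perturb with Gaussian noise calibrated to the clipping bound. A minimal illustrative sketch follows; `privatize_gradients` and its signature are hypothetical and are not d3p's actual API.

```python
import math
import random

def privatize_gradients(per_example_grads, clip_norm, noise_multiplier, rng=random):
    """Illustrative core step of DPVI / DP-SGD (not d3p's real API):
    clip each per-example gradient to L2 norm <= clip_norm, sum the
    clipped gradients, then add Gaussian noise with scale
    noise_multiplier * clip_norm to each coordinate."""
    dim = len(per_example_grads[0])
    total = [0.0] * dim
    for g in per_example_grads:
        norm = math.sqrt(sum(x * x for x in g))
        # Scale down only if the gradient exceeds the clipping bound.
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        for i in range(dim):
            total[i] += g[i] * scale
    sigma = noise_multiplier * clip_norm
    return [t + rng.gauss(0.0, sigma) for t in total]
```

Because the noise scale is tied to the clipping bound rather than to the data, the resulting gradient sum satisfies a differential privacy guarantee whose strength is controlled by `noise_multiplier`.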
Related papers
- Discrete Flow Matching [74.04153927689313]
We present a novel discrete flow paradigm designed specifically for generating discrete data.
Our approach is capable of generating high-quality discrete data in a non-autoregressive fashion.
arXiv Detail & Related papers (2024-07-22T12:33:27Z)
- Training Survival Models using Scoring Rules [9.330089124239086]
Survival Analysis provides critical insights for incomplete time-to-event data.
It is also an important example of probabilistic machine learning.
We establish different parametric and non-parametric sub-frameworks that allow different degrees of flexibility.
We show that using our framework, we can recover various parametric models and demonstrate that optimization works equally well when compared to likelihood-based methods.
arXiv Detail & Related papers (2024-03-19T20:58:38Z)
- Efficient Incremental Belief Updates Using Weighted Virtual Observations [2.7195102129095003]
We present an algorithmic solution to the problem of incremental belief updating in the context of Monte Carlo inference.
We implement and apply the solution to a number of didactic examples and case studies, showing efficiency and robustness of our approach.
arXiv Detail & Related papers (2024-02-10T12:48:49Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not make any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- Scaling Structured Inference with Randomization [64.18063627155128]
We propose randomized dynamic programming (RDP), a family of algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation, so it can be integrated seamlessly with neural networks.
arXiv Detail & Related papers (2021-12-07T11:26:41Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present PDC-Net+, an Enhanced Probabilistic Dense Correspondence Network capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- Probabilistic Modeling for Human Mesh Recovery [73.11532990173441]
This paper focuses on the problem of 3D human reconstruction from 2D evidence.
We recast the problem as learning a mapping from the input to a distribution of plausible 3D poses.
arXiv Detail & Related papers (2021-08-26T17:55:11Z)
- Conditional independence by typing [30.194205448457385]
A central goal of probabilistic programming languages (PPLs) is to separate modelling from inference.
Conditional independence (CI) relationships among parameters are a crucial aspect of probabilistic models.
We show that for a well-typed program in our system, the distribution it implements is guaranteed to have certain CI-relationships.
arXiv Detail & Related papers (2020-10-22T17:27:22Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
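The sampling-based estimators mentioned above all build on the plain Monte Carlo form of the ELBO: draw samples from the variational distribution q and average the difference between the model's log-joint density and q's log-density. A minimal illustrative sketch, with hypothetical function names:

```python
import math
import random

def log_normal_pdf(z, mu, sigma):
    """Log-density of a univariate Gaussian N(mu, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (z - mu) ** 2 / (2 * sigma ** 2)

def elbo_estimate(log_joint, log_q, sample_q, num_samples):
    """Monte Carlo ELBO: average of log p(x, z) - log q(z) over z ~ q."""
    total = 0.0
    for _ in range(num_samples):
        z = sample_q()
        total += log_joint(z) - log_q(z)
    return total / num_samples
```

The variance of this estimator is what motivates analytic alternatives: when the ELBO can be computed in closed form, as in the tractable case the paper identifies, the noisy averaging step disappears entirely.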
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Misspecification-robust likelihood-free inference in high dimensions [9.514562526751481]
We introduce an extension of the popular Bayesian optimisation based approach to approximate discrepancy functions in a probabilistic manner.
Our approach achieves computational scalability for higher dimensional parameter spaces by using separate acquisition functions and discrepancies for each parameter.
The method successfully performs computationally efficient inference in a 100-dimensional space on canonical examples and compares favourably to existing modularised ABC methods.
arXiv Detail & Related papers (2020-02-21T16:06:11Z)
- Automatic structured variational inference [12.557212589634112]
We introduce automatic structured variational inference (ASVI).
ASVI is a fully automated method for constructing structured variational families.
We find that ASVI provides a clear improvement in performance when compared with other popular approaches.
arXiv Detail & Related papers (2020-02-03T10:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.