Conformal-DP: Differential Privacy on Riemannian Manifolds via Conformal Transformation
- URL: http://arxiv.org/abs/2504.20941v1
- Date: Tue, 29 Apr 2025 17:05:55 GMT
- Title: Conformal-DP: Differential Privacy on Riemannian Manifolds via Conformal Transformation
- Authors: Peilin He, Liou Tang, M. Amin Rahimian, James Joshi
- Abstract summary: Differential Privacy (DP) has been established as a safeguard for private data sharing by adding perturbations to information release. We propose a novel mechanism, Conformal-DP, which utilizes conformal transformations on the Riemannian manifold to equalize local sample density. Our experimental results validate that the mechanism achieves high utility while providing the $\varepsilon$-DP guarantee for both homogeneous and especially heterogeneous manifold data.
- Score: 0.6981884305287337
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Differential Privacy (DP) has been established as a safeguard for private data sharing by adding perturbations to information release. Prior research on DP has extended beyond data in flat Euclidean space to data on curved manifolds, e.g., diffusion-tensor MRI, social networks, or organ-shape analysis, by adding perturbations along geodesic distances. However, existing manifold-aware DP methods rely on the assumption that samples are uniformly distributed across the manifold. In reality, data densities vary, leading to a noise imbalance across manifold regions and weakening the privacy-utility trade-off. To address this gap, we propose a novel mechanism, Conformal-DP, which utilizes conformal transformations on the Riemannian manifold to equalize local sample density and to redefine geodesic distances accordingly, while preserving the intrinsic geometry of the manifold. Our theoretical analysis yields two main results. First, we prove that the conformal factor computed from local kernel-density estimates is explicitly data-density-aware; second, under the conformal metric, the mechanism satisfies $\varepsilon$-differential privacy on any complete Riemannian manifold and admits a closed-form upper bound on the expected geodesic error that depends only on the maximal density ratio, not on the global curvature of the manifold. Our experimental results validate that the mechanism achieves high utility while providing the $\varepsilon$-DP guarantee for both homogeneous and especially heterogeneous manifold data.
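To make the mechanism concrete, below is a minimal, hypothetical Python sketch of a Conformal-DP-style release on the unit circle $S^1$; it is not the authors' implementation. The conformal exponent ($\phi = \text{density}^{1/2}$), the grid-based sampler, and the helper names `conformal_factor`, `conformal_distance`, and `release` are illustrative assumptions; the paper defines the mechanism on general complete Riemannian manifolds.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical Conformal-DP-style release on the unit circle S^1.
# Assumptions (not from the paper): conformal factor phi = sqrt(density),
# and a Laplace-type output density discretized on a grid of angles.

rng = np.random.default_rng(0)

# Heterogeneous data on S^1: a dense cluster plus a sparse uniform background.
angles = np.concatenate([
    rng.normal(1.0, 0.15, 400) % (2 * np.pi),
    rng.uniform(0.0, 2 * np.pi, 100),
])
kde = gaussian_kde(angles)  # local kernel-density estimate

def conformal_factor(x):
    """Data-density-aware conformal factor (illustrative exponent 1/2)."""
    return float(np.sqrt(kde([x])[0]))

def conformal_distance(a, b, n_steps=32):
    """Length of the shortest arc from a to b under the conformal metric phi^2 * g."""
    arc = (b - a + np.pi) % (2 * np.pi) - np.pi  # signed angular difference
    ts = np.linspace(0.0, 1.0, n_steps)
    phis = [conformal_factor((a + t * arc) % (2 * np.pi)) for t in ts]
    return float(np.mean(phis)) * abs(arc)       # approximates the integral of phi ds

def release(x, eps, sensitivity, grid_n=180):
    """Sample y with density proportional to exp(-eps * d_conf(x, y) / sensitivity).

    `sensitivity` stands in for the maximal conformal-geodesic distance between
    the summaries of neighboring datasets.
    """
    grid = np.linspace(0.0, 2 * np.pi, grid_n, endpoint=False)
    logw = np.array([-eps * conformal_distance(x, y) / sensitivity for y in grid])
    w = np.exp(logw - logw.max())
    return rng.choice(grid, p=w / w.sum())

print(release(x=1.0, eps=1.0, sensitivity=np.pi))
```

The design point mirrors the abstract: distances are measured under the density-aware conformal metric $\phi^2 g$ rather than the raw metric, so the output density adapts to how samples are distributed across the manifold.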
Related papers
- Learning over von Mises-Fisher Distributions via a Wasserstein-like Geometry [0.0]
We introduce a geometry-aware distance metric for the family of von Mises-Fisher (vMF) distributions. Motivated by the theory of optimal transport, we propose a Wasserstein-like distance that decomposes the discrepancy between two vMF distributions into two interpretable components.
arXiv Detail & Related papers (2025-04-19T03:38:15Z)
- Finsler Multi-Dimensional Scaling: Manifold Learning for Asymmetric Dimensionality Reduction and Embedding [41.601022263772535]
Dimensionality reduction aims to simplify complex data by reducing its feature dimensionality while preserving essential patterns, with core applications in data analysis and visualisation. To preserve the underlying data structure, multi-dimensional scaling (MDS) methods focus on preserving pairwise dissimilarities, such as distances.
arXiv Detail & Related papers (2025-03-23T10:03:22Z)
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over meta-analysis-based methods as heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Sampling and estimation on manifolds using the Langevin diffusion [45.57801520690309]
Two estimators of linear functionals of $\mu_\phi$ based on the discretized Markov process are considered. Error bounds are derived for sampling and estimation using a discretization of an intrinsically defined Langevin diffusion (a minimal discretization sketch appears after this list).
arXiv Detail & Related papers (2023-12-22T18:01:11Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides in a manifold, and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts (a split-conformal sketch on the sphere appears after this list).
arXiv Detail & Related papers (2023-10-12T10:56:25Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- General Gaussian Noise Mechanisms and Their Optimality for Unbiased Mean Estimation [58.03500081540042]
A classical approach to private mean estimation is to compute the true mean and add unbiased, but possibly correlated, Gaussian noise to it.
We show that, for every input dataset, an unbiased mean estimator satisfying concentrated differential privacy introduces approximately at least as much error as this Gaussian mechanism.
arXiv Detail & Related papers (2023-01-31T18:47:42Z)
- Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z)
- Combating Mode Collapse in GANs via Manifold Entropy Estimation [70.06639443446545]
Generative Adversarial Networks (GANs) have shown compelling results in various tasks and applications.
We propose a novel training pipeline to address the mode collapse issue of GANs.
arXiv Detail & Related papers (2022-08-25T12:33:31Z)
- Intrinsic dimension estimation for discrete metrics [65.5438227932088]
In this letter we introduce an algorithm to infer the intrinsic dimension (ID) of datasets embedded in discrete spaces.
We demonstrate its accuracy on benchmark datasets, and we apply it to analyze a metagenomic dataset for species fingerprinting.
This suggests that evolutionary pressure acts on a low-dimensional manifold despite the high dimensionality of sequence space.
arXiv Detail & Related papers (2022-07-20T06:38:36Z)
- Cycle Consistent Probability Divergences Across Different Spaces [38.43511529063335]
Discrepancy measures between probability distributions are at the core of statistical inference and machine learning.
This work proposes a novel unbalanced Monge optimal transport formulation for matching, up to isometries, distributions on different spaces.
arXiv Detail & Related papers (2021-11-22T16:35:58Z)
- Uniform Interpolation Constrained Geodesic Learning on Data Manifold [28.509561636926414]
Along the learned geodesic, our method can generate high-quality interpolations between two given data samples.
We provide a theoretical analysis of our model and use image translation as an example to demonstrate the effectiveness of our method.
arXiv Detail & Related papers (2020-02-12T07:47:41Z)
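As noted in the Langevin entry above, here is a hypothetical sketch of one standard discretization of an intrinsically defined Langevin diffusion: a retraction-based Euler-Maruyama scheme on the sphere $S^2$ targeting $\mu_\phi \propto e^{-\phi}$. The potential `phi`, the step size, and the burn-in length are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Hypothetical retraction-based Euler-Maruyama discretization of Langevin
# diffusion on the sphere S^2, targeting mu_phi proportional to exp(-phi).
# The potential below is an illustrative assumption, not the paper's example.

rng = np.random.default_rng(1)

def phi(x):
    """Potential whose Gibbs measure concentrates near the north pole."""
    return -4.0 * x[2]

def grad_phi(x):
    """Riemannian gradient: Euclidean gradient projected onto the tangent space."""
    g = np.array([0.0, 0.0, -4.0])
    return g - np.dot(g, x) * x

def langevin_step(x, h):
    """One Euler-Maruyama step; renormalizing is the retraction back onto S^2."""
    xi = rng.standard_normal(3)
    xi -= np.dot(xi, x) * x                      # tangent-space Gaussian noise
    y = x - h * grad_phi(x) + np.sqrt(2.0 * h) * xi
    return y / np.linalg.norm(y)

x = np.array([1.0, 0.0, 0.0])
trace = []
for _ in range(20000):
    x = langevin_step(x, h=0.01)
    trace.append(x[2])

# Ergodic-average estimator of the linear functional f(x) = x_3 under mu_phi.
print(np.mean(trace[5000:]))
```

Averaging a test function along the discretized Markov process, as in the last line, is the kind of estimator of a linear functional of $\mu_\phi$ that the entry refers to.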
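As noted in the conformal-inference entry above, here is a hypothetical split-conformal sketch for a response on the sphere $S^2$: the nonconformity score is the geodesic distance between $Y$ and a fitted predictor, and the prediction set is a geodesic ball whose radius is the conformal quantile of the calibration scores. The predictor `f_hat`, the noise model, and the miscoverage level $\alpha = 0.1$ are illustrative assumptions.

```python
import numpy as np

# Hypothetical split-conformal prediction set for a response on the sphere S^2.
# The regressor f_hat and the noise model are stand-ins for a fitted estimator.

rng = np.random.default_rng(2)

def geodesic(u, v):
    """Great-circle distance on the unit sphere."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

def f_hat(x):
    """Stand-in fitted regressor mapping a scalar covariate into S^2."""
    p = np.array([np.cos(x), np.sin(x), 0.3])
    return p / np.linalg.norm(p)

def sample_response(x):
    """Simulated manifold-valued response: perturb f_hat(x), then retract to S^2."""
    y = f_hat(x) + 0.1 * rng.standard_normal(3)
    return y / np.linalg.norm(y)

# Nonconformity scores on a held-out calibration set: geodesic residuals.
X_cal = rng.uniform(0.0, 2 * np.pi, 500)
scores = np.sort([geodesic(sample_response(x), f_hat(x)) for x in X_cal])

alpha = 0.1                                        # target miscoverage
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))  # conformal quantile index
q_hat = scores[k - 1]

# Prediction set for a new covariate x_new:
#   {y in S^2 : geodesic(y, f_hat(x_new)) <= q_hat}, a geodesic ball.
x_new = 1.0
print("center:", f_hat(x_new), "radius:", q_hat)
```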
This list is automatically generated from the titles and abstracts of the papers on this site.