Differentially private Riemannian optimization
- URL: http://arxiv.org/abs/2205.09494v1
- Date: Thu, 19 May 2022 12:04:15 GMT
- Title: Differentially private Riemannian optimization
- Authors: Andi Han, Bamdev Mishra, Pratik Jawanpuria, Junbin Gao
- Abstract summary: We introduce a framework of differentially private Riemannian optimization for the empirical risk minimization problem where the parameter is constrained to a Riemannian manifold.
We show the efficacy of the proposed framework in several applications.
- Score: 40.23168342389821
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study the differentially private empirical risk
minimization problem where the parameter is constrained to a Riemannian
manifold. We introduce a framework of differentially private Riemannian
optimization by adding noise to the Riemannian gradient on the tangent space.
The noise follows a Gaussian distribution intrinsically defined with respect to
the Riemannian metric. We adapt the Gaussian mechanism from the Euclidean space
to the tangent space so that it is compatible with this generalized Gaussian
distribution. We show that this strategy admits a simpler analysis than directly
adding noise on the manifold. We further show privacy guarantees of the
proposed differentially private Riemannian (stochastic) gradient descent using
an extension of the moments accountant technique. Additionally, we prove
utility guarantees under geodesically (strongly) convex and general nonconvex
objectives, as well as under the Riemannian Polyak-{\L}ojasiewicz condition. We
show the efficacy of the proposed framework in several applications.
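The mechanism described above (project the gradient onto the tangent space, perturb it there with Gaussian noise, then retract back to the manifold) can be illustrated with a minimal numpy sketch on the unit sphere. The toy objective, step size, and fixed noise scale sigma are illustrative assumptions; calibrating sigma to a concrete (epsilon, delta) budget via the moments accountant is omitted.

```python
import numpy as np

def dp_riemannian_gd(grad_f, x0, steps=200, lr=0.1, sigma=0.01, rng=None):
    """Differentially private Riemannian gradient descent on the unit sphere.

    Each step projects the Euclidean gradient onto the tangent space at x
    (the Riemannian gradient for the sphere's induced metric), adds
    isotropic Gaussian noise *in the tangent space*, and retracts the
    perturbed step back to the sphere by normalization.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_f(x)
        rg = g - np.dot(x, g) * x            # tangent-space projection of g
        z = rng.normal(scale=sigma, size=x.shape)
        z = z - np.dot(x, z) * x             # tangent-space Gaussian noise
        x = x - lr * (rg + z)                # perturbed step in the ambient space
        x = x / np.linalg.norm(x)            # retraction back to the sphere
    return x

# Toy objective f(x) = x^T A x on the sphere; its minimizer is the
# eigenvector of A with the smallest eigenvalue (here the third axis).
A = np.diag([3.0, 2.0, 0.5])
x = dp_riemannian_gd(lambda v: 2 * A @ v, x0=np.ones(3))
```

With a small noise scale the iterate stays close to the non-private trajectory; in the private setting sigma would instead grow with the sensitivity of the per-sample Riemannian gradients.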
Related papers
- Streamlining in the Riemannian Realm: Efficient Riemannian Optimization
with Loopless Variance Reduction [4.578425862931332]
This study focuses on the crucial variance reduction mechanism used in both Euclidean and Riemannian settings.
Motivated by Euclidean methods, we introduce Riemannian loopless methods that replace the outer loop with a full-gradient computation triggered by a coin flip.
Using this loopless approach as a framework, we demonstrate its applicability to various important settings.
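For intuition, the loopless coin-flip device can be sketched in the Euclidean special case as loopless SVRG for least squares; the Riemannian variants additionally use retractions and vector transports, which are omitted here, and the problem data, names, and constants below are illustrative.

```python
import numpy as np

def loopless_svrg(A, b, lr=0.05, p=0.1, steps=2000, rng=None):
    """Loopless SVRG for f(w) = (1/n) * sum_i (a_i . w - b_i)^2.

    There is no fixed-length outer loop: the reference point and its
    full gradient are refreshed with probability p at every iteration,
    i.e. the epoch boundary is decided by a coin flip.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = A.shape
    w = np.zeros(d)
    w_ref = w.copy()
    full_grad = 2 * A.T @ (A @ w_ref - b) / n
    for _ in range(steps):
        i = rng.integers(n)
        gi = 2 * A[i] * (A[i] @ w - b[i])            # stochastic gradient at w
        gi_ref = 2 * A[i] * (A[i] @ w_ref - b[i])    # same sample at w_ref
        w = w - lr * (gi - gi_ref + full_grad)       # variance-reduced step
        if rng.random() < p:                         # coin flip: refresh reference
            w_ref = w.copy()
            full_grad = 2 * A.T @ (A @ w_ref - b) / n
    return w

# Well-conditioned synthetic least-squares instance.
data_rng = np.random.default_rng(1)
w_star = np.array([1.0, -2.0, 0.5])
A = np.vstack([np.eye(3)] * 5) + 0.1 * data_rng.normal(size=(15, 3))
b = A @ w_star
w = loopless_svrg(A, b)
```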
arXiv Detail & Related papers (2024-03-11T12:49:37Z)
- Gaussian Differential Privacy on Riemannian Manifolds [10.296090438643132]
This work marks the first instance of extending the GDP framework to accommodate general Riemannian manifolds.
We provide a simple algorithm to evaluate the privacy budget $\mu$ on any one-dimensional manifold.
We also introduce a versatile Markov Chain Monte Carlo (MCMC)-based algorithm to calculate $\mu$ on any Riemannian manifold with constant curvature.
arXiv Detail & Related papers (2023-11-09T04:46:27Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that the methods under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- First-Order Algorithms for Min-Max Optimization in Geodesic Metric Spaces [93.35384756718868]
Min-max algorithms have mainly been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
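The last-iterate phenomenon is visible already in the Euclidean special case. The sketch below (illustrative step size and iteration count; RCEG itself adds Riemannian correction terms not shown here) runs plain extragradient on the bilinear saddle problem f(x, y) = x*y, where simultaneous gradient descent-ascent spirals outward but the extragradient last iterate contracts linearly to the saddle (0, 0).

```python
def extragradient_bilinear(x, y, lr=0.1, steps=1000):
    """Extragradient for min_x max_y f(x, y) = x * y."""
    for _ in range(steps):
        xh = x - lr * y                   # extrapolation step (grad_x f = y)
        yh = y + lr * x                   #                    (grad_y f = x)
        x, y = x - lr * yh, y + lr * xh   # update using extrapolated gradients
    return x, y

x, y = extragradient_bilinear(1.0, 1.0)
```

Per step the squared distance to the saddle shrinks by the factor (1 - lr**2)**2 + lr**2 < 1, which is the linear last-iterate rate in this toy case.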
arXiv Detail & Related papers (2022-06-04T18:53:44Z)
- A Riemannian smoothing steepest descent method for non-Lipschitz optimization on submanifolds [15.994495285950293]
A smoothing steepest descent method is proposed to minimize a non-Lipschitz function on submanifolds.
We prove that any accumulation point of the sequence generated by the smoothing method is a limiting stationary point of the original non-Lipschitz problem.
Numerical experiments are conducted to demonstrate the efficiency of the proposed method.
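A minimal numpy sketch of the smoothing idea on a concrete submanifold: the unit sphere, with the non-Lipschitz term sum_i sqrt(|x_i|) smoothed as (x_i^2 + mu^2)^(1/4). The schedule, step sizes, and starting point are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def smoothed_obj(x, mu):
    # Smooth surrogate of the non-Lipschitz term sum_i sqrt(|x_i|):
    # sqrt(|t|) is approximated by (t**2 + mu**2)**0.25, differentiable for mu > 0.
    return np.sum((x**2 + mu**2) ** 0.25)

def smoothed_grad(x, mu):
    return 0.5 * x * (x**2 + mu**2) ** (-0.75)

def smoothing_descent_sphere(x0, mu=0.5, shrink=0.5, outer=6, inner=100):
    """Smoothing steepest descent on the unit sphere.

    Runs Riemannian gradient steps on the smoothed surrogate, then
    shrinks mu and repeats; minimizers of sum_i sqrt(|x_i|) on the
    sphere are the coordinate axes, so iterates should concentrate
    on a single coordinate.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(outer):
        lr = mu ** 1.5                    # step size tied to surrogate smoothness
        for _ in range(inner):
            g = smoothed_grad(x, mu)
            rg = g - np.dot(x, g) * x     # tangent-space projection
            x = x - lr * rg
            x = x / np.linalg.norm(x)     # retraction
        mu *= shrink
    return x

x = smoothing_descent_sphere(np.array([0.9, 0.35, 0.25]))
```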
arXiv Detail & Related papers (2021-04-09T05:38:28Z)
- Stochastic Zeroth-order Riemannian Derivative Estimation and Optimization [15.78743548731191]
We propose an oracle version of the Gaussian smoothing function to overcome the difficulty of the non-linearity of the manifold.
We demonstrate the applicability of our algorithms with numerical results on real-world applications: black-box stiffness control for robotics and black-box attacks on neural networks.
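A sketch of the Gaussian-smoothing idea on the unit sphere (names, sample count, and smoothing parameter are illustrative): sample Gaussian directions in the tangent space, take finite differences along the retraction, and average; the estimate should approach the projection of the Euclidean gradient onto the tangent space.

```python
import numpy as np

def zo_riemannian_grad(f, x, mu=1e-4, samples=2000, rng=None):
    """Zeroth-order Riemannian gradient estimator on the unit sphere.

    Uses only function evaluations: for tangent Gaussian directions u,
    the estimator averages ((f(Retr_x(mu*u)) - f(x)) / mu) * u, a
    Riemannian analogue of Euclidean Gaussian smoothing.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    g = np.zeros_like(x)
    fx = f(x)
    for _ in range(samples):
        u = rng.normal(size=x.shape)
        u = u - np.dot(x, u) * x        # project direction onto the tangent space
        y = x + mu * u
        y = y / np.linalg.norm(y)       # retraction back to the sphere
        g += (f(y) - fx) / mu * u
    return g / samples

A = np.diag([2.0, 1.0, 0.5])
f = lambda v: v @ A @ v
x = np.ones(3) / np.sqrt(3.0)
g_est = zo_riemannian_grad(f, x)
# Reference: the true Riemannian gradient is the tangent projection of 2*A*x.
g = 2 * A @ x
g_true = g - np.dot(x, g) * x
```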
arXiv Detail & Related papers (2020-03-25T06:58:19Z)
- From Nesterov's Estimate Sequence to Riemannian Acceleration [52.99237600875452]
We develop an alternative analysis for Nesterov's estimate sequence technique that may also be of independent interest.
Then, we extend this analysis to the Riemannian setting, localizing the key difficulty due to non-Euclidean structure into a certain "metric distortion".
arXiv Detail & Related papers (2020-01-24T04:17:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.