Differentially Private Sliced Inverse Regression: Minimax Optimality and
Algorithm
- URL: http://arxiv.org/abs/2401.08150v1
- Date: Tue, 16 Jan 2024 06:47:43 GMT
- Title: Differentially Private Sliced Inverse Regression: Minimax Optimality and
Algorithm
- Authors: Xintao Xia, Linjun Zhang, Zhanrui Cai
- Abstract summary: We propose optimally differentially private algorithms designed to address privacy concerns in the context of sufficient dimension reduction.
We develop differentially private algorithms that achieve the minimax lower bounds up to logarithmic factors.
As a natural extension, we can readily offer analogous lower and upper bounds for differentially private sparse principal component analysis.
- Score: 16.14032140601778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Privacy preservation has become a critical concern in high-dimensional data
analysis due to the growing prevalence of data-driven applications. Proposed by
Li (1991), sliced inverse regression has emerged as a widely utilized
statistical technique for reducing covariate dimensionality while maintaining
sufficient statistical information. In this paper, we propose optimally
differentially private algorithms specifically designed to address privacy
concerns in the context of sufficient dimension reduction. We proceed to
establish lower bounds for differentially private sliced inverse regression in
both the low and high-dimensional settings. Moreover, we develop differentially
private algorithms that achieve the minimax lower bounds up to logarithmic
factors. Through a combination of simulations and real data analysis, we
illustrate the efficacy of these differentially private algorithms in
safeguarding privacy while preserving vital information within the reduced
dimension space. As a natural extension, we can readily offer analogous lower
and upper bounds for differentially private sparse principal component
analysis, a topic that may also be of potential interest to the statistical and
machine learning community.
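To make the setting concrete, the following is a minimal sketch of sliced inverse regression with Gaussian-mechanism noise added to the slice-mean covariance matrix. This is an illustrative simplification under assumed sensitivity bounds, not the paper's minimax-optimal algorithm; the function name `dp_sir` and all parameters are hypothetical.

```python
import numpy as np

def dp_sir(X, y, n_slices=10, n_dirs=1, epsilon=1.0, delta=1e-5,
           clip=1.0, rng=None):
    """Sketch of differentially private sliced inverse regression.

    Perturbs the slice-mean covariance with symmetric Gaussian noise
    before the eigendecomposition (an illustrative simplification).
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    # Standardize covariates; SIR operates on the standardized scale.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    # Clip each row's L2 norm to bound one individual's influence
    # (this controls the sensitivity of the statistic below).
    norms = np.linalg.norm(Xc, axis=1, keepdims=True)
    Xc = Xc * np.minimum(1.0, clip / norms)
    # Partition observations into slices by the order statistics of y.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Covariance of the slice means: M = sum_h (n_h / n) m_h m_h^T.
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)
    # Gaussian mechanism: noise scaled to a rough O(clip^2 / n)
    # L2-sensitivity of M under a one-record change.
    sigma = (2 * clip**2 / n) * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    noise = rng.normal(scale=sigma, size=(p, p))
    M_priv = M + (noise + noise.T) / 2  # keep the perturbed matrix symmetric
    # Leading eigenvectors span the estimated central subspace.
    _, eigvecs = np.linalg.eigh(M_priv)
    return eigvecs[:, ::-1][:, :n_dirs]
```

On a simple single-index model, such as y = X @ beta plus noise, the returned direction should align closely with beta when n is large relative to the injected noise scale.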
Related papers
- The Data Minimization Principle in Machine Learning [61.17813282782266]
Data minimization aims to reduce the amount of data collected, processed or retained.
It has been endorsed by various global data protection regulations.
However, its practical implementation remains a challenge due to the lack of a rigorous formulation.
arXiv Detail & Related papers (2024-05-29T19:40:27Z)
- Initialization Matters: Privacy-Utility Analysis of Overparameterized Neural Networks [72.51255282371805]
We prove a privacy bound for the KL divergence between model distributions on worst-case neighboring datasets.
We find that this KL privacy bound is largely determined by the expected squared gradient norm relative to model parameters during training.
arXiv Detail & Related papers (2023-10-31T16:13:22Z)
- Differentially private sliced inverse regression in the federated paradigm [3.539008590223188]
We extend sliced inverse regression (SIR) to address the challenges of decentralized data, prioritizing privacy and communication efficiency.
Our approach, federated sliced inverse regression (FSIR), facilitates collaborative estimation of the sufficient dimension reduction subspace among multiple clients.
arXiv Detail & Related papers (2023-06-10T00:32:39Z)
- Score Attack: A Lower Bound Technique for Optimal Differentially Private Learning [8.760651633031342]
We propose a novel approach called the score attack, which provides a lower bound on the differential-privacy-constrained minimax risk of parameter estimation.
It can optimally lower bound the minimax risk of estimating unknown model parameters, up to a logarithmic factor, while ensuring differential privacy for a range of statistical problems.
arXiv Detail & Related papers (2023-03-13T14:26:27Z)
- On Differential Privacy and Adaptive Data Analysis with Bounded Space [76.10334958368618]
We study the space complexity of the two related fields of differential privacy and adaptive data analysis.
We show that there exists a problem P that requires exponentially more space to be solved efficiently with differential privacy.
The line of work on adaptive data analysis focuses on understanding the number of samples needed for answering a sequence of adaptive queries.
arXiv Detail & Related papers (2023-02-11T14:45:31Z)
- Differentially Private Stochastic Gradient Descent with Low-Noise [49.981789906200035]
Modern machine learning algorithms aim to extract fine-grained information from data to provide accurate predictions, which often conflicts with the goal of privacy protection.
This paper addresses the practical and theoretical importance of developing privacy-preserving machine learning algorithms that ensure good performance while preserving privacy.
arXiv Detail & Related papers (2022-09-09T08:54:13Z)
- Decentralized Stochastic Optimization with Inherent Privacy Protection [103.62463469366557]
Decentralized optimization is the basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing.
Because sensitive data are involved, privacy protection has become an increasingly pressing need in the implementation of decentralized optimization algorithms.
arXiv Detail & Related papers (2022-05-08T14:38:23Z)
- Graph-Homomorphic Perturbations for Private Decentralized Learning [64.26238893241322]
Local exchange of estimates allows inference of the underlying private data.
Perturbations chosen independently at every agent result in a significant performance loss.
We propose an alternative scheme that constructs perturbations according to a particular nullspace condition, allowing them to remain invisible.
arXiv Detail & Related papers (2020-10-23T10:35:35Z)
- Differentially Private Simple Linear Regression [2.614403183902121]
We study algorithms for simple linear regression that satisfy differential privacy.
We consider the design of differentially private algorithms for simple linear regression for small datasets.
We study the performance of a spectrum of algorithms we adapt to the setting.
arXiv Detail & Related papers (2020-07-10T04:28:43Z)
- Designing Differentially Private Estimators in High Dimensions [0.0]
We study differentially private mean estimation in a high-dimensional setting.
Recent work in high-dimensional robust statistics has identified computationally tractable mean estimation algorithms.
arXiv Detail & Related papers (2020-06-02T21:17:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.