Efficient Hyperparameter Optimization for Differentially Private Deep
Learning
- URL: http://arxiv.org/abs/2108.03888v1
- Date: Mon, 9 Aug 2021 09:18:22 GMT
- Title: Efficient Hyperparameter Optimization for Differentially Private Deep
Learning
- Authors: Aman Priyanshu, Rakshit Naidu, Fatemehsadat Mireshghallah, Mohammad
Malekzadeh
- Abstract summary: We formulate a general optimization framework for establishing a desirable privacy-utility tradeoff.
We study three cost-effective algorithms for use in the proposed framework: evolutionary, Bayesian, and reinforcement learning.
Since we believe our work can be applied in private deep learning pipelines, we open-source our code at https://github.com/AmanPriyanshu/DP-HyperparamTuning.
- Score: 1.7205106391379026
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tuning the hyperparameters in differentially private stochastic gradient
descent (DPSGD) is a fundamental challenge. Unlike standard SGD, private
datasets cannot be queried many times for hyperparameter search in DPSGD,
e.g., via a grid search, because every access consumes privacy budget.
Therefore, there is an essential need for algorithms that, within a given
search space, can efficiently find near-optimal hyperparameters for the best
achievable privacy-utility tradeoffs. We formulate this problem as a general
optimization framework for establishing a desirable privacy-utility tradeoff,
and systematically study three cost-effective algorithms for use in the
proposed framework: evolutionary, Bayesian, and reinforcement learning. Our
experiments on hyperparameter tuning in DPSGD, conducted on the MNIST and
CIFAR-10 datasets, show that all three algorithms significantly outperform
the widely used grid search baseline. As this paper offers a first-of-its-kind
framework for hyperparameter tuning in DPSGD, we discuss existing challenges
and open directions for future studies. Since we believe our work can be
applied in private deep learning pipelines, we open-source our code at
https://github.com/AmanPriyanshu/DP-HyperparamTuning.
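One way to read the proposed framework, although the abstract does not spell out the exact objective, is as a scalar reward that trades validation accuracy against the spent privacy budget epsilon, optimized by a cheap search procedure over the DPSGD hyperparameters (learning rate, noise multiplier, clipping norm). The sketch below is an illustrative assumption rather than the authors' implementation: train_dpsgd, the reward weighting, and the mutation ranges are hypothetical placeholders; the actual evolutionary, Bayesian, and reinforcement-learning variants are in the linked repository.

import random
from dataclasses import dataclass


@dataclass
class DPHyperparams:
    learning_rate: float
    noise_multiplier: float
    clip_norm: float


def reward(accuracy: float, epsilon: float, weight: float = 0.5) -> float:
    # Hypothetical scalarization of the privacy-utility tradeoff:
    # reward higher accuracy, penalize a larger (weaker-privacy) epsilon.
    return (1.0 - weight) * accuracy - weight * epsilon


def mutate(hp: DPHyperparams) -> DPHyperparams:
    # Perturb each hyperparameter multiplicatively to stay near a plausible range.
    def jitter(x: float) -> float:
        return x * random.uniform(0.8, 1.25)
    return DPHyperparams(jitter(hp.learning_rate),
                         jitter(hp.noise_multiplier),
                         jitter(hp.clip_norm))


def evolutionary_search(train_dpsgd, budget: int = 20) -> DPHyperparams:
    # Greedy (1+1)-style evolutionary search; each candidate costs one private
    # DPSGD training run, so `budget` bounds the total number of runs.
    best = DPHyperparams(learning_rate=0.1, noise_multiplier=1.0, clip_norm=1.0)
    accuracy, epsilon = train_dpsgd(best)  # hypothetical: returns (accuracy, epsilon)
    best_score = reward(accuracy, epsilon)
    for _ in range(budget):
        candidate = mutate(best)
        accuracy, epsilon = train_dpsgd(candidate)
        score = reward(accuracy, epsilon)
        if score > best_score:
            best, best_score = candidate, score
    return best

A Bayesian or reinforcement-learning variant would replace mutate with a surrogate-model or policy-driven proposal step while keeping the same per-run reward, so the comparison is essentially over how sample-efficiently each method spends its limited number of private training runs.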
Related papers
- Deep Learning to Predict Late-Onset Breast Cancer Metastasis: the Single Hyperparameter Grid Search (SHGS) Strategy for Meta Tuning Concerning Deep Feed-forward Neural Network [7.332652485849632]
We construct a deep feed-forward neural network (DFNN) model to predict breast cancer metastasis n years in advance.
The challenge lies in efficiently identifying optimal hyperparameter values through grid search, given the constraints of time and resources.
arXiv Detail & Related papers (2024-08-28T03:00:43Z) - Towards Fair and Rigorous Evaluations: Hyperparameter Optimization for Top-N Recommendation Task with Implicit Feedback [23.551171147404766]
We investigate the Top-N implicit recommendation problem and focus on optimizing the benchmark recommendation algorithm.
We propose a research methodology that follows the principles of a fair comparison.
arXiv Detail & Related papers (2024-08-14T15:56:27Z) - Differentially Private SGD Without Clipping Bias: An Error-Feedback Approach [62.000948039914135]
Using Differentially Private Stochastic Gradient Descent with Gradient Clipping (DPSGD-GC) to ensure Differential Privacy (DP) comes at the cost of model performance degradation.
We propose a new error-feedback (EF) DP algorithm as an alternative to DPSGD-GC.
We establish an algorithm-specific DP analysis for our proposed algorithm, providing privacy guarantees based on Rényi DP.
arXiv Detail & Related papers (2023-11-24T17:56:44Z) - PriorBand: Practical Hyperparameter Optimization in the Age of Deep
Learning [49.92394599459274]
We propose PriorBand, an HPO algorithm tailored to Deep Learning (DL) pipelines.
We show its robustness across a range of DL benchmarks, its gains under informative expert input, and its resilience to poor expert beliefs.
arXiv Detail & Related papers (2023-06-21T16:26:14Z) - DP-HyPO: An Adaptive Private Hyperparameter Optimization Framework [31.628466186344582]
We introduce DP-HyPO, a pioneering framework for "adaptive" private hyperparameter optimization.
We provide a comprehensive differential privacy analysis of our framework.
We empirically demonstrate the effectiveness of DP-HyPO on a diverse set of real-world datasets.
arXiv Detail & Related papers (2023-06-09T07:55:46Z) - Theoretically Principled Federated Learning for Balancing Privacy and
Utility [61.03993520243198]
We propose a general learning framework for protection mechanisms that protect privacy by distorting model parameters.
It can achieve personalized utility-privacy trade-off for each model parameter, on each client, at each communication round in federated learning.
arXiv Detail & Related papers (2023-05-24T13:44:02Z) - Practical Differentially Private Hyperparameter Tuning with Subsampling [8.022555128083026]
We propose a new class of differentially private (DP) machine learning (ML) algorithms, where the number of random search samples is randomized itself.
We focus on lowering both the DP bounds and the computational cost of these methods by using only a random subset of the sensitive data.
We provide a Rényi differential privacy analysis for the proposed method and experimentally show that it consistently leads to a better privacy-utility trade-off.
arXiv Detail & Related papers (2023-01-27T21:01:58Z) - AUTOMATA: Gradient Based Data Subset Selection for Compute-Efficient
Hyper-parameter Tuning [72.54359545547904]
We propose a gradient-based subset selection framework for hyperparameter tuning.
We show that using gradient-based data subsets for hyperparameter tuning achieves significantly faster turnaround times and speedups of 3×-30×.
arXiv Detail & Related papers (2022-03-15T19:25:01Z) - Differentially Private Federated Bayesian Optimization with Distributed
Exploration [48.9049546219643]
We introduce differential privacy (DP) into the training of deep neural networks through a general framework for adding DP to iterative algorithms.
We show that DP-FTS-DE achieves high utility (competitive performance) with a strong privacy guarantee.
We also use real-world experiments to show that DP-FTS-DE induces a trade-off between privacy and utility.
arXiv Detail & Related papers (2021-10-27T04:11:06Z) - An Asymptotically Optimal Multi-Armed Bandit Algorithm and
Hyperparameter Optimization [48.5614138038673]
We propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) for the scenario of hyperparameter search evaluation.
We also develop a novel hyperparameter optimization algorithm called BOSS.
Empirical studies validate our theoretical arguments of SS and demonstrate the superior performance of BOSS on a number of applications.
arXiv Detail & Related papers (2020-07-11T03:15:21Z)