Inference and Interference: The Role of Clipping, Pruning and Loss
Landscapes in Differentially Private Stochastic Gradient Descent
- URL: http://arxiv.org/abs/2311.06839v1
- Date: Sun, 12 Nov 2023 13:31:35 GMT
- Title: Inference and Interference: The Role of Clipping, Pruning and Loss
Landscapes in Differentially Private Stochastic Gradient Descent
- Authors: Lauren Watson, Eric Gan, Mohan Dantam, Baharan Mirzasoleiman, Rik
Sarkar
- Abstract summary: Differentially private stochastic gradient descent (DP-SGD) is known to have poorer training and test performance on large neural networks.
We compare the behavior of the two processes separately in early and late epochs.
We find that while DP-SGD makes slower progress in early stages, it is the behavior in the later stages that determines the end result.
- Score: 13.27004430044574
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Differentially private stochastic gradient descent (DP-SGD) is known to have
poorer training and test performance on large neural networks, compared to
ordinary stochastic gradient descent (SGD). In this paper, we perform a
detailed study and comparison of the two processes and unveil several new
insights. By comparing the behavior of the two processes separately in early
and late epochs, we find that while DP-SGD makes slower progress in early
stages, it is the behavior in the later stages that determines the end result.
A separate analysis of the clipping and noise addition steps of DP-SGD shows
that while noise introduces errors to the process, gradient descent can recover
from these errors when it is not clipped; clipping appears to have a larger
impact than noise. These effects are amplified in higher dimensions (large
neural networks), where the loss basin occupies a lower-dimensional space. We
argue theoretically, and show through extensive experiments, that magnitude
pruning can be a suitable dimension-reduction technique in this regard, and
find that heavy pruning can improve the test accuracy of DP-SGD.
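
For concreteness, here is a minimal NumPy sketch of the two DP-SGD steps the abstract analyzes separately: per-sample gradient clipping followed by Gaussian noise addition. The function name `dp_sgd_step`, the stand-in gradients, and the parameter values are illustrative assumptions, not the paper's experimental configuration.

```python
import numpy as np

def dp_sgd_step(per_sample_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD update direction: clip each per-sample gradient to
    clip_norm, average, then add Gaussian noise calibrated to clip_norm."""
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Noise std for the sum is noise_multiplier * clip_norm;
    # dividing by the batch size gives the std for the mean.
    noise = rng.normal(0.0,
                       noise_multiplier * clip_norm / len(per_sample_grads),
                       size=mean_grad.shape)
    return mean_grad + noise

rng = np.random.default_rng(0)
grads = [rng.normal(size=10) for _ in range(32)]  # stand-in per-sample gradients
update = dp_sgd_step(grads, clip_norm=1.0, noise_multiplier=1.1, rng=rng)
```

Separating the two steps this way makes the abstract's comparison testable in isolation: clipping alone, noise alone, or both together.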
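Likewise, a minimal sketch of magnitude pruning as the dimension-reduction step the abstract proposes: the smallest-magnitude weights are zeroed, shrinking the effective dimension over which clipping and noise act. The function `magnitude_prune` and the 80% sparsity level are hypothetical illustrations, not the paper's pruning schedule.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out (at least) the fraction `sparsity` of entries with the
    smallest magnitudes; ties at the threshold are also pruned."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.random.default_rng(1).normal(size=(4, 5))
w_pruned = magnitude_prune(w, sparsity=0.8)  # "heavy pruning" in this sense
```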