SSGD: A safe and efficient method of gradient descent
- URL: http://arxiv.org/abs/2012.02076v2
- Date: Mon, 26 Apr 2021 04:33:08 GMT
- Title: SSGD: A safe and efficient method of gradient descent
- Authors: Jinhuan Duan, Xianxian Li, Shiqi Gao, Jinyan Wang and Zili Zhong
- Abstract summary: The gradient descent method plays an important role in solving various optimization problems.
The super stochastic gradient descent approach updates parameters while concealing the modulus length of the gradient.
Our algorithm can defend against attacks on the gradient.
- Score: 0.5099811144731619
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the rapid development of artificial intelligence technology, a wide
range of engineering applications have been deployed. The gradient descent
method plays an important role in solving various optimization problems, due
to its simple structure, good stability, and easy implementation. In
multi-node machine learning systems, the gradients usually need to be shared,
and sharing gradients is generally unsafe: attackers can reconstruct training
data simply from the gradient information. In this paper, to prevent gradient
leakage while preserving model accuracy, we propose the super stochastic
gradient descent approach, which updates parameters by concealing the modulus
length of gradient vectors and converting them into unit vectors. Furthermore,
we analyze the security of the super stochastic gradient descent approach and
show that our algorithm can defend against attacks on the gradient. Experiment
results show that our approach is clearly superior to prevalent gradient
descent approaches in terms of accuracy, robustness, and adaptability to
large-scale batches.
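
The abstract describes the core update but gives no pseudocode. The sketch below, in Python with NumPy, illustrates one reading of it: normalize the stochastic gradient to a unit vector so its modulus length is never shared, then step along the direction alone. The function name `ssgd_step` and the hyperparameters `lr` and `eps` are assumptions for illustration; the paper's full algorithm may differ.

```python
import numpy as np

# Hypothetical sketch: the names (ssgd_step, lr, eps) and the toy
# objective are illustrative assumptions, not taken from the paper.

def ssgd_step(params, grad, lr=0.1, eps=1e-12):
    """Update parameters using only the gradient's direction.

    The modulus length (L2 norm) of the gradient is concealed by
    converting the gradient into a unit vector, so a party observing
    the shared update learns the direction but not the magnitude.
    """
    unit_grad = grad / (np.linalg.norm(grad) + eps)
    return params - lr * unit_grad

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([3.0, -4.0])
for _ in range(100):
    w = ssgd_step(w, 2 * w)
print(w)  # approaches the origin
```

Because every shared update has norm 1 regardless of the true gradient, an attacker who reconstructs data from gradient magnitudes (as in gradient-leakage attacks) loses that signal, which is the intuition behind the security claim above.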