A Statistical Framework for Low-bitwidth Training of Deep Neural
Networks
- URL: http://arxiv.org/abs/2010.14298v1
- Date: Tue, 27 Oct 2020 13:57:33 GMT
- Title: A Statistical Framework for Low-bitwidth Training of Deep Neural
Networks
- Authors: Jianfei Chen, Yu Gai, Zhewei Yao, Michael W. Mahoney, Joseph E.
Gonzalez
- Abstract summary: Fully quantized training (FQT) accelerates training on low-bitwidth hardware by quantizing the activations, weights, and gradients of a neural network model.
One major challenge with FQT is the lack of theoretical understanding, in particular of how gradient quantization impacts convergence properties.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fully quantized training (FQT), which uses low-bitwidth hardware by
quantizing the activations, weights, and gradients of a neural network model,
is a promising approach to accelerate the training of deep neural networks. One
major challenge with FQT is the lack of theoretical understanding, in
particular of how gradient quantization impacts convergence properties. In this
paper, we address this problem by presenting a statistical framework for
analyzing FQT algorithms. We view the quantized gradient of FQT as a stochastic
estimator of its full precision counterpart, a procedure known as
quantization-aware training (QAT). We show that the FQT gradient is an unbiased
estimator of the QAT gradient, and we discuss the impact of gradient
quantization on its variance. Inspired by these theoretical results, we develop
two novel gradient quantizers, and we show that these have smaller variance
than the existing per-tensor quantizer. For training ResNet-50 on ImageNet, our
5-bit block Householder quantizer achieves only 0.5% validation accuracy loss
relative to QAT, comparable to the existing INT8 baseline.
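The analysis hinges on viewing the quantized gradient as an unbiased stochastic estimator of its full-precision (QAT) counterpart. The sketch below is a minimal illustration of that idea for a plain per-tensor quantizer with stochastic rounding, the existing baseline the abstract contrasts with the proposed quantizers; it is not the paper's block Householder quantizer, and the function and parameter names (per_tensor_quantize, num_bits) are illustrative assumptions.

```python
import numpy as np

def per_tensor_quantize(grad, num_bits=8, rng=None):
    """Per-tensor gradient quantizer with stochastic rounding.

    Stochastic rounding makes the dequantized gradient an unbiased
    estimator of the input gradient (E[q(g)] == g), the property the
    paper's statistical framework builds on. Its variance grows with
    the dynamic range of the whole tensor, which is the weakness the
    proposed quantizers aim to reduce.
    """
    rng = rng or np.random.default_rng()
    levels = 2 ** num_bits - 1
    lo, hi = grad.min(), grad.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    # Map to [0, levels]; round up with probability equal to the fractional part.
    x = (grad - lo) / scale
    floor = np.floor(x)
    quantized = floor + (rng.random(grad.shape) < (x - floor))
    # Dequantize back to the original range.
    return quantized * scale + lo

# Quick empirical check of (approximate) unbiasedness on a random gradient tensor.
g = np.random.randn(4, 1024).astype(np.float32)
samples = np.stack([per_tensor_quantize(g, num_bits=5) for _ in range(200)])
print("max |E[q(g)] - g|:", np.abs(samples.mean(0) - g).max())
```

Averaging many quantized draws recovers the original gradient up to sampling noise, which is the unbiasedness property; the remaining gap between FQT and QAT accuracy is then governed by the quantizer's variance.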