HPTQ: Hardware-Friendly Post Training Quantization
- URL: http://arxiv.org/abs/2109.09113v1
- Date: Sun, 19 Sep 2021 12:45:01 GMT
- Title: HPTQ: Hardware-Friendly Post Training Quantization
- Authors: Hai Victor Habi, Reuven Peretz, Elad Cohen, Lior Dikstein, Oranit
Dror, Idit Diamant, Roy H. Jennings and Arnon Netzer
- Abstract summary: We introduce a hardware-friendly post training quantization (HPTQ) framework.
We perform a large-scale study on four tasks: classification, object detection, semantic segmentation and pose estimation.
Our experiments show that competitive results can be obtained under hardware-friendly constraints.
- Score: 6.515659231669797
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural network quantization enables the deployment of models on edge devices.
An essential requirement for their hardware efficiency is that the quantizers
are hardware-friendly: uniform, symmetric, and with power-of-two thresholds. To
the best of our knowledge, current post-training quantization methods do not
support all of these constraints simultaneously. In this work, we introduce a
hardware-friendly post training quantization (HPTQ) framework, which addresses
this problem by synergistically combining several known quantization methods.
We perform a large-scale study on four tasks: classification, object detection,
semantic segmentation and pose estimation over a wide variety of network
architectures. Our extensive experiments show that competitive results can be
obtained under hardware-friendly constraints.
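The hardware-friendly constraints named in the abstract (uniform, symmetric quantization with power-of-two thresholds) can be sketched as follows. This is a minimal illustration of the general idea, not the HPTQ framework itself; the function names and the 8-bit default are assumptions for the example.

```python
import numpy as np

def power_of_two_threshold(x):
    # Smallest power-of-two threshold that covers the tensor's dynamic
    # range, so the scale can be realized with cheap shift operations.
    max_abs = np.max(np.abs(x))
    return 2.0 ** np.ceil(np.log2(max_abs))

def quantize_symmetric_uniform(x, n_bits=8):
    # Uniform, symmetric quantizer: one scale, zero-point fixed at 0,
    # integer grid [-2^(b-1), 2^(b-1) - 1].
    t = power_of_two_threshold(x)
    scale = t / 2 ** (n_bits - 1)
    q_min, q_max = -2 ** (n_bits - 1), 2 ** (n_bits - 1) - 1
    q = np.clip(np.round(x / scale), q_min, q_max)
    return q * scale, t  # dequantized values and the chosen threshold
```

For example, a tensor whose largest magnitude is 3.0 gets the threshold 4.0 (the next power of two), and every value is rounded to the nearest point on a uniform 8-bit grid symmetric around zero.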