RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging
- URL: http://arxiv.org/abs/2110.01397v1
- Date: Thu, 30 Sep 2021 09:31:11 GMT
- Title: RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging
- Authors: Edouard Yvinec, Arnaud Dapogny, Matthieu Cord and Kevin Bailly
- Abstract summary: Pruning Deep Neural Networks (DNNs) is a prominent field of study aimed at inference runtime acceleration.
We introduce a novel data-free pruning protocol RED++.
We study the theoretical and empirical guarantees on the preservation of accuracy under the hashing.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Pruning Deep Neural Networks (DNNs) is a prominent field of study
aimed at inference runtime acceleration. In this paper, we introduce a novel
data-free pruning protocol, RED++. Requiring only a trained neural network,
and not being specific to any DNN architecture, we exploit an adaptive,
data-free scalar hashing that exposes redundancies among neuron weight values.
We study the theoretical and empirical guarantees on the preservation of
accuracy under this hashing, as well as the expected pruning ratio resulting
from the exploitation of said redundancies. We propose a novel data-free
pruning technique for DNN layers that removes input-wise redundant operations.
This algorithm is straightforward, parallelizable, and offers a novel
perspective on DNN pruning by shifting the burden of large computation to
efficient memory access and allocation. We provide theoretical guarantees on
RED++ performance and empirically demonstrate its superiority over other
data-free pruning methods, as well as its competitiveness with data-driven
ones, on ResNets, MobileNets and EfficientNets.
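
The core idea of hashing weight values to expose input-wise redundancies can be
illustrated with a minimal sketch. The code below is not the authors' RED++
algorithm: it assumes a plain dense layer and a fixed rounding grid (the
hypothetical `step` parameter) in place of the adaptive hashing described in
the abstract, and the function names `hash_weights` and
`merge_redundant_inputs` are illustrative only.

```python
import numpy as np

def hash_weights(w, step=0.05):
    """Snap each scalar weight to a coarse grid so that near-identical
    values become exactly equal (a crude stand-in for the adaptive
    data-free scalar hashing mentioned in the abstract)."""
    return np.round(w / step) * step

def merge_redundant_inputs(w, x):
    """Toy example of removing input-wise redundant operations in a dense
    layer y = W @ x: if two columns of W are identical after hashing, the
    corresponding inputs are summed once and the shared column is applied
    a single time, trading multiplications for cheap accumulations."""
    w = hash_weights(w)
    # Group input indices by their (hashed) weight column.
    groups = {}
    for j in range(w.shape[1]):
        groups.setdefault(tuple(w[:, j]), []).append(j)
    # One representative column per group, and the pre-summed inputs.
    merged_cols = np.stack([w[:, idxs[0]] for idxs in groups.values()], axis=1)
    merged_x = np.array([x[idxs].sum() for idxs in groups.values()])
    return merged_cols @ merged_x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(4, 6))
    w[:, 3] = w[:, 0] + 0.01   # make two input columns nearly identical
    x = rng.normal(size=6)
    print(merge_redundant_inputs(w, x))  # matches the hashed dense layer below
    print(hash_weights(w) @ x)
```

In this sketch the layer output is unchanged by construction, since summing the
inputs that share a hashed column is algebraically equal to multiplying each of
them by that column separately; the paper's contribution lies in the adaptive
hashing and the guarantees on accuracy and pruning ratio, which the toy example
does not attempt to reproduce.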