Channel Pruning via Automatic Structure Search
- URL: http://arxiv.org/abs/2001.08565v3
- Date: Mon, 29 Jun 2020 01:52:28 GMT
- Title: Channel Pruning via Automatic Structure Search
- Authors: Mingbao Lin, Rongrong Ji, Yuxin Zhang, Baochang Zhang, Yongjian Wu,
Yonghong Tian
- Abstract summary: We propose a new channel pruning method based on the artificial bee colony (ABC) algorithm, dubbed ABCPruner.
ABCPruner is demonstrated to be more effective than prior methods, and it also enables fine-tuning to be conducted efficiently in an end-to-end manner.
- Score: 109.83642249625098
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Channel pruning is among the predominant approaches to compressing deep
neural networks. Most existing pruning methods select channels (filters) by
importance, optimization, or regularization criteria built on rule-of-thumb
designs, which leads to sub-optimal pruning. In this paper, we propose a new
channel pruning method based on the artificial bee colony (ABC) algorithm,
dubbed ABCPruner, which aims to efficiently find an optimal pruned structure,
i.e., the number of channels in each layer, rather than selecting "important"
channels as previous works did. To cope with the intractably large number of
candidate pruned structures for deep networks, we first shrink the search
space by limiting the preserved channels to a restricted set, which
significantly reduces the number of combinations. We then formulate the search
for the optimal pruned structure as an optimization problem and apply the ABC
algorithm to solve it automatically, with little human interference. ABCPruner
is demonstrated to be more effective than prior methods, and it also enables
fine-tuning to be conducted efficiently in an end-to-end manner. The source
code is available at https://github.com/lmbxmu/ABCPruner.
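As a concrete illustration of the search loop, here is a minimal sketch of an ABC-style structure search. It assumes a hypothetical `evaluate(structure)` fitness that builds the pruned network from per-layer keep ratios, fine-tunes it briefly, and returns validation accuracy; the keep-ratio set, network depth, and neighborhood move are placeholder choices, not the paper's implementation.

```python
import random

# Shrunk search space: each layer preserves one of a small set of channel
# fractions. ABCPruner restricts preserved channels to a limited space to
# cut the combinatorial explosion; the exact set here is a placeholder.
KEEP_RATIOS = [round(0.1 * k, 1) for k in range(1, 11)]
NUM_LAYERS = 16  # hypothetical depth

def random_structure():
    return [random.choice(KEEP_RATIOS) for _ in range(NUM_LAYERS)]

def abc_search(evaluate, n_sources=10, n_cycles=30, limit=5):
    """evaluate(structure) -> fitness: hypothetically, build the pruned
    network with the given per-layer keep ratios, fine-tune it briefly,
    and return validation accuracy."""
    sources = [random_structure() for _ in range(n_sources)]
    fitness = [evaluate(s) for s in sources]
    trials = [0] * n_sources

    def try_neighbor(i):
        # Move one randomly chosen layer of source i toward a random peer.
        cand = list(sources[i])
        j = random.randrange(NUM_LAYERS)
        peer = sources[random.choice([k for k in range(n_sources) if k != i])]
        cand[j] = peer[j]
        f = evaluate(cand)
        if f > fitness[i]:                        # greedy selection
            sources[i], fitness[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1

    for _ in range(n_cycles):
        for i in range(n_sources):                # employed bees
            try_neighbor(i)
        for _ in range(n_sources):                # onlookers: fitness-biased
            i = random.choices(range(n_sources),
                               weights=[f + 1e-9 for f in fitness])[0]
            try_neighbor(i)
        for i in range(n_sources):                # scouts replace stale sources
            if trials[i] > limit:
                sources[i] = random_structure()
                fitness[i], trials[i] = evaluate(sources[i]), 0

    best = max(range(n_sources), key=fitness.__getitem__)
    return sources[best], fitness[best]

# Usage with a dummy fitness just so the sketch runs end to end:
best_structure, best_fit = abc_search(lambda s: 1.0 - abs(sum(s)/len(s) - 0.5))
```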
Related papers
- Dynamic Structure Pruning for Compressing CNNs [13.73717878732162]
We introduce a novel structure pruning method, termed dynamic structure pruning, that identifies optimal pruning granularities for intra-channel pruning.
Experimental results show that dynamic structure pruning achieves state-of-the-art pruning performance and better realistic acceleration on a GPU than channel pruning.
arXiv Detail & Related papers (2023-03-17T02:38:53Z)
- Revisiting Random Channel Pruning for Neural Network Compression [159.99002793644163]
Channel (or 3D filter) pruning serves as an effective way to accelerate the inference of neural networks.
In this paper, we try to determine the channel configuration of the pruned models by random search.
We show that this simple strategy works quite well compared with other channel pruning methods.
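The strategy is simple enough to sketch. Assuming a hypothetical `evaluate(cfg)` callback (prune to the given keep ratios, briefly train, return validation accuracy) and a rough FLOPs proxy, a budgeted random search over per-layer channel configurations might look like:

```python
import random

def flops(keep_ratios, base_layer_flops):
    # Rough proxy: a conv layer's FLOPs scale with the keep ratios of its
    # input and output channels.
    total, prev = 0.0, 1.0
    for r, f in zip(keep_ratios, base_layer_flops):
        total += f * prev * r
        prev = r
    return total

def random_channel_search(evaluate, base_layer_flops, budget, n_trials=100):
    """evaluate(cfg) -> accuracy is a hypothetical callback: prune to the
    given per-layer keep ratios, briefly train, return validation accuracy."""
    best_cfg, best_acc = None, float("-inf")
    for _ in range(n_trials):
        cfg = [random.uniform(0.1, 1.0) for _ in base_layer_flops]
        if flops(cfg, base_layer_flops) > budget:
            continue  # skip configurations that violate the FLOPs budget
        acc = evaluate(cfg)
        if acc > best_acc:
            best_cfg, best_acc = cfg, acc
    return best_cfg, best_acc

# Usage with toy per-layer FLOPs and a dummy evaluator:
layers = [1e8, 2e8, 2e8, 1e8]
cfg, acc = random_channel_search(lambda c: sum(c), layers, budget=3e8)
```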
arXiv Detail & Related papers (2022-05-11T17:59:04Z)
- AdaPruner: Adaptive Channel Pruning and Effective Weights Inheritance [9.3421559369389]
We propose a pruning framework that adaptively determines the number of channels in each layer as well as the weight-inheritance criteria for the sub-network.
AdaPruner obtains the pruned network quickly, accurately, and efficiently.
On ImageNet, we reduce the FLOPs of MobileNetV2 by 32.8% with only a 0.62% drop in top-1 accuracy, exceeding all previous state-of-the-art channel pruning methods.
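As a rough illustration of the weight-inheritance step, the sketch below copies the top-L1-norm filters from a parent conv layer into a slimmer child; the L1 criterion is an assumption for illustration, since AdaPruner adaptively determines the inheritance criteria rather than fixing one.

```python
import torch

def inherit_conv_weights(parent_weight, n_keep):
    """Copy the n_keep filters with the largest L1 norm from a parent conv
    layer (shape: [out_ch, in_ch, k, k]) into a slimmer child layer.
    The L1-norm criterion is one common choice, used here for illustration."""
    norms = parent_weight.abs().sum(dim=(1, 2, 3))   # per-filter L1 norm
    keep = torch.topk(norms, n_keep).indices
    return parent_weight[keep].clone(), keep

# Usage: shrink a 64-filter layer to 40 filters.
w = torch.randn(64, 32, 3, 3)
child_w, kept = inherit_conv_weights(w, 40)
assert child_w.shape == (40, 32, 3, 3)
```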
arXiv Detail & Related papers (2021-09-14T01:52:05Z)
- Group Fisher Pruning for Practical Network Compression [58.25776612812883]
We present a general channel pruning approach that can be applied to various complicated structures.
We derive a unified metric based on Fisher information to evaluate the importance of a single channel and coupled channels.
Our method can be used to prune any structure, including those with coupled channels.
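A simplified reading of such a Fisher-based importance score: attach an all-ones mask to each (possibly coupled) channel group and accumulate the squared gradient of the loss with respect to the mask. The `forward_with_masks` callback below is hypothetical, not the paper's API.

```python
import torch

def fisher_channel_importance(forward_with_masks, masks, batches, loss_fn):
    """Fisher-style importance: accumulate the squared gradient of the loss
    w.r.t. an all-ones per-channel mask over batches. `forward_with_masks`
    is a hypothetical callback that multiplies each layer's activation by
    its mask; coupled channels can share one mask so their scores add up.
    A simplified reading of the unified metric, not the paper's code."""
    scores = [torch.zeros_like(m) for m in masks]
    for x, y in batches:
        loss = loss_fn(forward_with_masks(x, masks), y)
        grads = torch.autograd.grad(loss, masks)
        for s, g in zip(scores, grads):
            s += g.detach() ** 2          # Fisher information approximation
    return scores  # prune the channels with the smallest scores
```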
arXiv Detail & Related papers (2021-08-02T08:21:44Z)
- BWCP: Probabilistic Learning-to-Prune Channels for ConvNets via Batch Whitening [63.081808698068365]
This work presents a probabilistic channel pruning method to accelerate Convolutional Neural Networks (CNNs).
Previous pruning methods often zero out unimportant channels during training in a deterministic manner, which reduces the CNN's learning capacity and results in suboptimal performance.
We develop a probability-based pruning algorithm, called batch whitening channel pruning (BWCP), which can stochastically discard unimportant channels by modeling the probability of a channel being activated.
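The gist of stochastic rather than deterministic masking can be sketched as sampling a Bernoulli keep/drop mask from a learned per-channel activation probability; the whitening machinery BWCP uses to model that probability is omitted here, and the straight-through gradient below is an assumption.

```python
import torch

def stochastic_channel_mask(activation_prob, training=True):
    """Sample a per-channel keep/drop mask from the modeled probability of
    each channel being activated (shape: [channels]). At test time, keep
    channels whose probability exceeds 0.5. A sketch of the idea only."""
    if training:
        mask = torch.bernoulli(activation_prob)
        # Straight-through estimator: forward value is the hard mask, but
        # gradients flow to the probabilities (an assumption; BWCP derives
        # its own gradients through batch whitening).
        return mask + activation_prob - activation_prob.detach()
    return (activation_prob > 0.5).float()
```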
arXiv Detail & Related papers (2021-05-13T17:00:05Z)
- Operation-Aware Soft Channel Pruning using Differentiable Masks [51.04085547997066]
We propose a data-driven algorithm, which compresses deep neural networks in a differentiable way by exploiting the characteristics of operations.
We perform extensive experiments and achieve outstanding accuracy in the resulting pruned networks.
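A generic sketch of such a differentiable mask: a learnable per-channel score passed through a sigmoid scales each channel, and a sparsity penalty pushes gates toward zero. This illustrates mask-based soft pruning in general, not the paper's operation-aware formulation.

```python
import torch
import torch.nn as nn

class SoftChannelGate(nn.Module):
    """Differentiable per-channel mask: a learnable score, squashed by a
    sigmoid, scales each channel, so gates near zero mark channels to prune."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Parameter(torch.zeros(channels))

    def forward(self, x):                        # x: [N, C, H, W]
        gate = torch.sigmoid(self.score)
        return x * gate.view(1, -1, 1, 1)

    def sparsity_loss(self):
        return torch.sigmoid(self.score).sum()   # pushes gates toward zero
```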
arXiv Detail & Related papers (2020-07-08T07:44:00Z)
- DMCP: Differentiable Markov Channel Pruning for Neural Networks [67.51334229530273]
We propose a novel differentiable method for channel pruning, named Differentiable Markov Channel Pruning (DMCP).
Our method is differentiable and can be directly optimized by gradient descent with respect to the standard task loss and a budget regularization.
To validate its effectiveness, we perform extensive experiments on ImageNet with ResNet and MobileNetV2.
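The overall objective, task loss plus a differentiable budget regularizer over the channel-retention parameters, might be written as below; the quadratic penalty and the way the expected FLOPs is obtained are assumptions, not DMCP's exact formulation.

```python
def dmcp_style_loss(task_loss, expected_flops, target_flops, lam=0.1):
    """Budget regularization sketch: penalize the differentiable expected
    FLOPs (assumed computed from channel-retention probabilities, so
    gradients reach them) for deviating from the target budget."""
    budget_term = (expected_flops / target_flops - 1.0) ** 2
    return task_loss + lam * budget_term
```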
arXiv Detail & Related papers (2020-05-07T09:39:55Z)