WHC: Weighted Hybrid Criterion for Filter Pruning on Convolutional
Neural Networks
- URL: http://arxiv.org/abs/2302.08185v1
- Date: Thu, 16 Feb 2023 10:10:40 GMT
- Title: WHC: Weighted Hybrid Criterion for Filter Pruning on Convolutional
Neural Networks
- Authors: Shaowu Chen, Weize Sun, Lei Huang
- Abstract summary: Filter pruning has attracted increasing attention in recent years for its capacity to compress and accelerate convolutional neural networks.
Various data-independent criteria, including norm-based and relationship-based ones, have been proposed to prune the least important filters.
We introduce a new data-independent criterion, the Weighted Hybrid Criterion (WHC), to tackle the problems of both norm-based and relationship-based criteria.
- Score: 6.741182160506872
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Filter pruning has attracted increasing attention in recent years for its
capacity to compress and accelerate convolutional neural networks. Various
data-independent criteria, including norm-based and relationship-based ones,
have been proposed to prune the least important filters. However, these
state-of-the-art criteria fail to fully consider the dissimilarity of filters
and thus might lead to performance degradation. In this paper, we first analyze
the limitations of relationship-based criteria with examples, and then
introduce a new data-independent criterion, the Weighted Hybrid Criterion
(WHC), to tackle the problems of both norm-based and relationship-based
criteria. By taking both the magnitude of each filter and the linear dependence
between filters into consideration, WHC can robustly recognize the most
redundant filters, which can be safely pruned without severe performance
degradation. Extensive pruning experiments in a simple one-shot manner
demonstrate the effectiveness of the proposed WHC. In particular, WHC can prune
ResNet-50 on ImageNet, reducing floating-point operations by more than 42%
without any loss in top-5 accuracy.
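
The abstract describes WHC only qualitatively: a filter's importance combines its norm with how linearly independent it is from the other filters in the same layer. The snippet below is a minimal PyTorch sketch of that idea, not the authors' exact formula; the function name `whc_like_scores`, the use of absolute cosine similarity as the dependence measure, and the 75% keep ratio in the example are illustrative assumptions.

```python
# Sketch of a WHC-style, data-independent score: filter norm weighted by
# average dissimilarity to the other filters in the same conv layer.
# This is an assumed reading of the abstract, not the paper's exact criterion.
import torch


def whc_like_scores(weight: torch.Tensor) -> torch.Tensor:
    """weight: conv kernel of shape (out_channels, in_channels, k, k)."""
    filters = weight.flatten(start_dim=1)                 # (N, in*k*k)
    norms = filters.norm(dim=1)                           # magnitude term
    unit = filters / norms.clamp_min(1e-12).unsqueeze(1)  # row-normalize
    cos = unit @ unit.t()                                 # pairwise cosine similarity
    n = filters.shape[0]
    dissim = (1.0 - cos.abs()).sum(dim=1) / (n - 1)       # avg dissimilarity to the others
    return norms * dissim                                 # low score -> more redundant


# Example: keep the 75% highest-scoring filters of a random 64-filter layer.
w = torch.randn(64, 32, 3, 3)
scores = whc_like_scores(w)
keep = scores.argsort(descending=True)[: int(0.75 * w.shape[0])]
```

Under this reading, a filter with a small norm that is nearly a linear combination of its neighbours receives a low score and is pruned first, matching the abstract's notion of "the most redundant filters".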
Related papers
- Structured Network Pruning by Measuring Filter-wise Interactions [6.037167142826297]
We propose a structured network pruning approach, SNPFI (Structured Network Pruning by measuring Filter-wise Interaction).
During pruning, SNPFI automatically assigns the proper sparsity based on filter utilization strength.
We empirically demonstrate the effectiveness of SNPFI with several commonly used CNN models.
arXiv Detail & Related papers (2023-07-03T05:26:05Z)
- Filter Pruning via Filters Similarity in Consecutive Layers [20.29555787754269]
Filter pruning is widely adopted to compress and accelerate Convolutional Neural Networks (CNNs).
We propose a novel pruning method that explicitly leverages the Filters Similarity in Consecutive Layers (FSCL).
Experiments demonstrate the effectiveness of FSCL, which yields remarkable improvements over the state of the art in accuracy, FLOPs, and parameter reduction.
arXiv Detail & Related papers (2023-04-26T09:18:38Z)
- Asymptotic Soft Cluster Pruning for Deep Neural Networks [5.311178623385279]
Filter pruning introduces structural sparsity by removing selected filters.
We propose a novel filter pruning method called Asymptotic Soft Cluster Pruning.
Our method can achieve competitive results compared with many state-of-the-art algorithms.
arXiv Detail & Related papers (2022-06-16T13:58:58Z)
- Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters [151.2423480789271]
A novel pruning method, termed CLR-RNF, is proposed for filter-level network pruning.
We conduct image classification on CIFAR-10 and ImageNet to demonstrate the superiority of our CLR-RNF over state-of-the-art methods.
arXiv Detail & Related papers (2022-02-15T04:53:24Z)
- Batch Normalization Tells You Which Filter is Important [49.903610684578716]
We propose a simple yet effective filter pruning method that evaluates the importance of each filter based on the BN parameters of pre-trained CNNs (a minimal sketch of this idea appears after this list).
The experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method achieves outstanding performance.
arXiv Detail & Related papers (2021-12-02T12:04:59Z)
- Training Compact CNNs for Image Classification using Dynamic-coded Filter Fusion [139.71852076031962]
We present a novel filter pruning method, dubbed dynamic-coded filter fusion (DCFF).
It derives compact CNNs in a computation-economical and regularization-free manner for efficient image classification.
DCFF derives a compact VGGNet-16 with only 72.77M FLOPs and 1.06M parameters while reaching a top-1 accuracy of 93.47%.
arXiv Detail & Related papers (2021-07-14T18:07:38Z)
- Blending Pruning Criteria for Convolutional Neural Networks [13.259106518678474]
Network pruning has recently become a popular and effective method to reduce the redundancy of models.
One filter could be important according to a certain criterion yet unnecessary according to another, which indicates that each criterion offers only a partial view of the comprehensive "importance".
We propose a novel framework that integrates existing filter pruning criteria by exploring the diversity among them.
arXiv Detail & Related papers (2021-07-11T12:34:19Z)
- Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient.
arXiv Detail & Related papers (2021-06-02T19:15:34Z)
- Dependency Aware Filter Pruning [74.69495455411987]
Pruning a proportion of unimportant filters is an efficient way to mitigate the inference cost.
Previous work prunes filters according to their weight norms or the corresponding batch-norm scaling factors.
We propose a novel mechanism to dynamically control the sparsity-inducing regularization so as to achieve the desired sparsity.
arXiv Detail & Related papers (2020-05-06T07:41:22Z)
- Convolution-Weight-Distribution Assumption: Rethinking the Criteria of Channel Pruning [90.2947802490534]
We find two blind spots in the study of pruning criteria.
The ranks of the filters' Importance Scores are almost identical, resulting in similar pruned structures.
The filters' Importance Scores measured by some pruning criteria are too close to distinguish the network redundancy well.
arXiv Detail & Related papers (2020-04-24T09:54:21Z)
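
One entry above (Batch Normalization Tells You Which Filter is Important) ranks filters using the BN parameters of a pre-trained network. The snippet below is a minimal sketch of the common form of that idea, ordering channels by the magnitude of the BN scale gamma; the function name, the 30% pruning ratio, and the use of |gamma| as the importance measure are illustrative assumptions rather than details taken from that paper.

```python
# Sketch: rank output channels of a pre-trained BatchNorm layer by |gamma|.
# An assumed, simplified form of BN-based filter importance, for illustration.
import torch
import torch.nn as nn


def bn_channel_order(bn: nn.BatchNorm2d) -> torch.Tensor:
    """Return channel indices sorted from least to most important."""
    importance = bn.weight.detach().abs()  # |gamma| per output channel
    return importance.argsort()            # ascending: the front of the list is pruned first


bn = nn.BatchNorm2d(64)   # stands in for a BN layer of a pre-trained CNN
prune_ratio = 0.3         # assumed ratio, not from the paper
order = bn_channel_order(bn)
to_prune = order[: int(prune_ratio * bn.num_features)]
```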
This list is automatically generated from the titles and abstracts of the papers on this site.