Blending Pruning Criteria for Convolutional Neural Networks
- URL: http://arxiv.org/abs/2107.05033v1
- Date: Sun, 11 Jul 2021 12:34:19 GMT
- Title: Blending Pruning Criteria for Convolutional Neural Networks
- Authors: Wei He, Zhongzhan Huang, Mingfu Liang, Senwei Liang, Haizhao Yang
- Abstract summary: Network pruning has recently become a popular and effective method to reduce model redundancy.
One filter may be important under one criterion yet unnecessary under another, which indicates that each criterion captures only a partial view of a filter's comprehensive "importance".
We propose a novel framework that integrates existing filter pruning criteria by exploring their diversity.
- Score: 13.259106518678474
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The advancement of convolutional neural networks (CNNs) on various vision
applications has attracted considerable attention. Yet the majority of CNNs
cannot satisfy the strict requirements of real-world deployment. To overcome
this, network pruning, which has recently become popular, offers an effective
way to reduce model redundancy. However, the ranking of filters according to their
"importance" on different pruning criteria may be inconsistent. One filter
could be important according to a certain criterion, while it is unnecessary
according to another one, which indicates that each criterion is only a partial
view of the comprehensive "importance". From this motivation, we propose a
novel framework to integrate the existing filter pruning criteria by exploring
the criteria diversity. The proposed framework contains two stages: Criteria
Clustering and Filters Importance Calibration. First, we condense the pruning
criteria via layerwise clustering based on the rank of "importance" score.
Second, within each cluster, we propose a calibration factor to adjust the
significance of each selected blending candidate and search for the optimal
blending criterion via Evolutionary Algorithm. Quantitative results on the
CIFAR-100 and ImageNet benchmarks show that our framework outperforms the
state-of-the-art baselines with regard to the compact model's performance
after pruning.
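The two-stage pipeline in the abstract can be sketched roughly as follows. The criterion functions, the rank-correlation clustering test, and the hand-picked calibration factors below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def l1_criterion(filters):
    # Norm-based criterion: importance = L1 norm of each filter's weights.
    return np.abs(filters).sum(axis=(1, 2, 3))

def l2_criterion(filters):
    # A second norm-based criterion for comparison.
    return np.sqrt((filters ** 2).sum(axis=(1, 2, 3)))

def spearman(a, b):
    # Rank correlation between two importance scorings of the same layer;
    # criteria whose rankings correlate strongly can share a cluster.
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra ** 2).sum() * (rb ** 2).sum()))

def blend(scores, factors):
    # Stage 2 (sketch): calibration factors reweight rank-normalized scores.
    ranked = [np.argsort(np.argsort(s)).astype(float) for s in scores]
    return sum(f * r for f, r in zip(factors, ranked))

rng = np.random.default_rng(0)
filters = rng.normal(size=(16, 3, 3, 3))  # 16 filters of one conv layer
s1, s2 = l1_criterion(filters), l2_criterion(filters)

# Stage 1 (sketch): cluster criteria layerwise by ranking similarity.
rho = spearman(s1, s2)

# Blend the clustered criteria and prune the least important filters.
blended = blend([s1, s2], factors=[0.6, 0.4])
prune_idx = np.argsort(blended)[:4]
```

In the paper the calibration factors are found with an Evolutionary Algorithm rather than fixed by hand as above.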
Related papers
- Fine-grained Recognition with Learnable Semantic Data Augmentation [68.48892326854494]
Fine-grained image recognition is a longstanding computer vision challenge.
We propose diversifying the training data at the feature-level to alleviate the discriminative region loss problem.
Our method significantly improves the generalization performance on several popular classification networks.
arXiv Detail & Related papers (2023-09-01T11:15:50Z) - WHC: Weighted Hybrid Criterion for Filter Pruning on Convolutional
Neural Networks [6.741182160506872]
Filter pruning has attracted increasing attention in recent years for its capacity in compressing and accelerating convolutional neural networks.
Various data-independent criteria, including norm-based and relationship-based ones, were proposed to prune the most unimportant filters.
We introduce a new data-independent criterion, Weighted Hybrid Criterion, to tackle the problems of both norm-based and relationship-based criteria.
arXiv Detail & Related papers (2023-02-16T10:10:40Z) - Improve Convolutional Neural Network Pruning by Maximizing Filter
Variety [0.0]
Neural network pruning is a widely used strategy for reducing model storage and computing requirements.
Common pruning criteria, such as l1-norm or movement, usually do not consider the individual utility of filters.
We present a technique that addresses these issues and can be appended to any pruning criterion.
arXiv Detail & Related papers (2022-03-11T09:00:59Z) - Rethinking Counting and Localization in Crowds: A Purely Point-Based
Framework [59.578339075658995]
We propose a purely point-based framework for joint crowd counting and individual localization.
We design an intuitive solution under this framework, called the Point to Point Network (P2PNet).
arXiv Detail & Related papers (2021-07-27T11:41:50Z) - CRACT: Cascaded Regression-Align-Classification for Robust Visual
Tracking [97.84109669027225]
We introduce an improved proposal refinement module, Cascaded Regression-Align-Classification (CRAC).
CRAC yields new state-of-the-art performances on many benchmarks.
In experiments on seven benchmarks including OTB-2015, UAV123, NfS, VOT-2018, TrackingNet, GOT-10k and LaSOT, our CRACT exhibits very promising results in comparison with state-of-the-art competitors.
arXiv Detail & Related papers (2020-11-25T02:18:33Z) - Discretization-Aware Architecture Search [81.35557425784026]
This paper presents discretization-aware architecture search (DA²S).
The core idea is to push the super-network towards the configuration of desired topology, so that the accuracy loss brought by discretization is largely alleviated.
Experiments on standard image classification benchmarks demonstrate the superiority of our approach.
arXiv Detail & Related papers (2020-07-07T01:18:58Z) - Slimming Neural Networks using Adaptive Connectivity Scores [28.872080203221934]
We propose a new single-shot, fully automated pruning algorithm called Slimming Neural networks using Adaptive Connectivity Scores (SNACS).
Our proposed approach combines a probabilistic pruning framework with constraints on the underlying weight matrices.
SNACS is over 17x faster than the nearest comparable method.
arXiv Detail & Related papers (2020-06-22T17:45:16Z) - Dependency Aware Filter Pruning [74.69495455411987]
Pruning a proportion of unimportant filters is an efficient way to mitigate the inference cost.
Previous work prunes filters according to their weight norms or the corresponding batch-norm scaling factors.
We propose a novel mechanism to dynamically control the sparsity-inducing regularization so as to achieve the desired sparsity.
arXiv Detail & Related papers (2020-05-06T07:41:22Z) - Convolution-Weight-Distribution Assumption: Rethinking the Criteria of
Channel Pruning [90.2947802490534]
We find two blind spots in the study of pruning criteria.
The ranks of filters' Importance Scores under different criteria are almost identical, resulting in similar pruned structures.
The filters' Importance Scores measured by some pruning criteria are too close together to distinguish the network redundancy well.
arXiv Detail & Related papers (2020-04-24T09:54:21Z)
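The two blind spots above can be illustrated with a quick, hypothetical diagnostic; the layer shape, criteria, and keep ratio below are invented for the sketch, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=(32, 16, 3, 3))          # one conv layer's filters
l1 = np.abs(w).sum(axis=(1, 2, 3))           # norm-based criterion A
l2 = np.sqrt((w ** 2).sum(axis=(1, 2, 3)))   # norm-based criterion B

# Blind spot 1: do two criteria keep (almost) the same filters?
k = 24                                        # filters kept after pruning
kept_a = set(np.argsort(l1)[-k:])
kept_b = set(np.argsort(l2)[-k:])
overlap = len(kept_a & kept_b) / k            # near 1.0 -> similar structures

# Blind spot 2: are one criterion's scores too close to tell filters apart?
spread = float(l1.std() / l1.mean())          # small -> weak separation
print(overlap, spread)
```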
This list is automatically generated from the titles and abstracts of the papers in this site.