Batch Normalization Tells You Which Filter is Important
- URL: http://arxiv.org/abs/2112.01155v1
- Date: Thu, 2 Dec 2021 12:04:59 GMT
- Title: Batch Normalization Tells You Which Filter is Important
- Authors: Junghun Oh, Heewon Kim, Sungyong Baik, Cheeun Hong and Kyoung Mu Lee
- Abstract summary: We propose a simple yet effective filter pruning method by evaluating the importance of each filter based on the BN parameters of pre-trained CNNs.
The experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method can achieve outstanding performance.
- Score: 49.903610684578716
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The goal of filter pruning is to search for unimportant filters to remove in
order to make convolutional neural networks (CNNs) efficient without
sacrificing the performance in the process. The challenge lies in finding
information that can help determine how important or relevant each filter is
with respect to the final output of neural networks. In this work, we share our
observation that the batch normalization (BN) parameters of pre-trained CNNs
can be used to estimate the feature distribution of activation outputs, without
processing any training data. Based on this observation, we propose a simple yet
effective filter pruning method by evaluating the importance of each filter
based on the BN parameters of pre-trained CNNs. The experimental results on
CIFAR-10 and ImageNet demonstrate that the proposed method can achieve
outstanding performance with and without fine-tuning in terms of the trade-off
between the accuracy drop and the reduction in computational complexity and
number of parameters of pruned networks.
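The abstract states that the BN parameters of a pre-trained CNN can estimate the feature distribution of activation outputs without touching training data. Below is a minimal PyTorch sketch of one plausible reading of that idea: each channel's post-BN activation is modeled as N(beta, gamma^2), and each filter is scored by its expected post-ReLU activation. The scoring formula, the function names, and the `keep_ratio` knob are illustrative assumptions, not the authors' exact criterion.

```python
import torch
import torch.nn as nn
from torch.distributions import Normal


def bn_filter_scores(bn: nn.BatchNorm2d) -> torch.Tensor:
    """Score the filters feeding this BN layer from its (gamma, beta) parameters.

    Assumes the post-BN activation of channel c is roughly N(beta_c, gamma_c^2)
    and that a ReLU follows; the score is the expected post-ReLU activation,
    E[ReLU(X)] = beta * Phi(beta/|gamma|) + |gamma| * phi(beta/|gamma|).
    Illustrative reading of the abstract, not the authors' exact criterion.
    """
    gamma = bn.weight.detach().abs()   # std of the assumed post-BN distribution
    beta = bn.bias.detach()            # mean of the assumed post-BN distribution
    z = beta / gamma.clamp_min(1e-12)
    std_normal = Normal(0.0, 1.0)
    return beta * std_normal.cdf(z) + gamma * torch.exp(std_normal.log_prob(z))


def pruning_plan(model: nn.Module, keep_ratio: float = 0.7) -> dict:
    """For every BN layer, return indices of the filters to keep (highest scores).

    keep_ratio is a hypothetical knob, not a value taken from the paper.
    """
    plan = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.BatchNorm2d):
            scores = bn_filter_scores(module)
            k = max(1, int(keep_ratio * scores.numel()))
            plan[name] = torch.topk(scores, k).indices.sort().values
    return plan
```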
Related papers
- Structured Network Pruning by Measuring Filter-wise Interactions [6.037167142826297]
We propose SNPFI (Structured Network Pruning by measuring Filter-wise Interaction), a structured network pruning approach.
During pruning, SNPFI automatically assigns the proper sparsity based on the filter utilization strength.
We empirically demonstrate the effectiveness of the SNPFI with several commonly used CNN models.
arXiv Detail & Related papers (2023-07-03T05:26:05Z)
- Efficient CNNs via Passive Filter Pruning [23.661189257759535]
Convolutional neural networks (CNNs) have shown state-of-the-art performance in various applications.
However, CNNs are resource-hungry due to their high computational complexity and memory requirements.
Recent efforts toward achieving computational efficiency in CNNs involve filter pruning methods.
arXiv Detail & Related papers (2023-04-05T09:19:19Z)
- Asymptotic Soft Cluster Pruning for Deep Neural Networks [5.311178623385279]
Filter pruning introduces structural sparsity by removing selected filters.
We propose a novel filter pruning method called Asymptotic Soft Cluster Pruning.
Our method can achieve competitive results compared with many state-of-the-art algorithms.
arXiv Detail & Related papers (2022-06-16T13:58:58Z)
- A Passive Similarity based CNN Filter Pruning for Efficient Acoustic Scene Classification [23.661189257759535]
We present a method to develop low-complexity convolutional neural networks (CNNs) for acoustic scene classification (ASC)
We propose a passive filter pruning framework, where a few convolutional filters from the CNNs are eliminated to yield compressed CNNs.
The proposed method is simple: it reduces computations per inference by 27% and parameters by 25%, with less than a 1% drop in accuracy.
arXiv Detail & Related papers (2022-03-29T17:00:06Z)
- Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient.
arXiv Detail & Related papers (2021-06-02T19:15:34Z)
- Data Agnostic Filter Gating for Efficient Deep Networks [72.4615632234314]
Current filter pruning methods mainly leverage feature maps to generate importance scores for filters and prune those with smaller scores.
In this paper, we propose a data-agnostic filter pruning method that uses an auxiliary network named Dagger module to induce pruning.
In addition, to help prune filters with certain FLOPs constraints, we leverage an explicit FLOPs-aware regularization to directly promote pruning filters toward target FLOPs.
arXiv Detail & Related papers (2020-10-28T15:26:40Z)
- Dependency Aware Filter Pruning [74.69495455411987]
Pruning a proportion of unimportant filters is an efficient way to mitigate the inference cost.
Previous work prunes filters according to their weight norms or the corresponding batch-norm scaling factors (two baseline criteria sketched in code after this list).
We propose a novel mechanism to dynamically control the sparsity-inducing regularization so as to achieve the desired sparsity.
arXiv Detail & Related papers (2020-05-06T07:41:22Z)
- Filter Grafting for Deep Neural Networks: Reason, Method, and Cultivation [86.91324735966766]
Filters are the key components of modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting to improve the representation capability of CNNs.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z)
- Filter Grafting for Deep Neural Networks [71.39169475500324]
Filter grafting aims to improve the representation capability of Deep Neural Networks (DNNs)
We develop an entropy-based criterion to measure the information of filters and an adaptive weighting strategy for balancing the grafted information among networks.
For example, the grafted MobileNetV2 outperforms the non-grafted MobileNetV2 by about 7 percent on the CIFAR-100 dataset.
arXiv Detail & Related papers (2020-01-15T03:18:57Z)
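The Dependency Aware Filter Pruning entry above notes that previous work prunes filters by their weight norms or by the corresponding batch-norm scaling factors. A minimal sketch of those two baseline criteria, assuming a PyTorch model in which each Conv2d is followed by a BatchNorm2d (the function names are hypothetical):

```python
import torch
import torch.nn as nn


def l1_filter_norms(conv: nn.Conv2d) -> torch.Tensor:
    """Weight-norm criterion: per-filter L1 norm of the convolution kernels."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))


def bn_scaling_factors(bn: nn.BatchNorm2d) -> torch.Tensor:
    """BN scaling-factor criterion: |gamma| of the BN layer following the conv."""
    return bn.weight.detach().abs()


# Under either criterion, filters with the smallest scores are pruned first.
```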