Convolutional Neural Network Pruning Using Filter Attenuation
- URL: http://arxiv.org/abs/2002.03299v1
- Date: Sun, 9 Feb 2020 06:31:24 GMT
- Title: Convolutional Neural Network Pruning Using Filter Attenuation
- Authors: Morteza Mousa-Pasandi, Mohsen Hajabdollahi, Nader Karimi, Shadrokh
Samavi, Shahram Shirani
- Abstract summary: Filters are essential elements in convolutional neural networks (CNNs).
In filter pruning methods, a filter with all of its components, including channels and connections, is removed.
We propose a CNN pruning method based on filter attenuation in which weak filters are not directly removed.
- Score: 10.282782377635106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Filters are the essential elements in convolutional neural networks (CNNs).
Filters correspond to the feature maps and account for the main part of the
computational and memory requirements of CNN processing. In filter pruning
methods, a filter with all of its components, including channels and
connections, is removed. The removal of a filter can cause a drastic change in
the network's performance, and removed filters cannot return to the network
structure. We address these problems in this paper. We propose a CNN pruning
method based on filter attenuation in which weak filters are not directly
removed. Instead, weak filters are attenuated and gradually removed. In the
proposed attenuation approach, weak filters are not abruptly removed, and there
is a chance for these filters to return to the network. The filter attenuation
method is assessed using the VGG model on the CIFAR-10 image classification
task. Simulation results show that filter attenuation works with different
pruning criteria and obtains better results than conventional pruning methods.
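The attenuation idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the L1-norm criterion, the weak-filter fraction `frac`, and the factor `gamma` are assumed choices (the paper states only that the method works with different pruning criteria).

```python
import numpy as np

def attenuate_weak_filters(weights, frac=0.25, gamma=0.5):
    """Scale down (rather than remove) the weakest filters.

    weights: conv weight tensor of shape (out_filters, in_ch, k, k).
    frac:    fraction of filters treated as weak (illustrative parameter).
    gamma:   attenuation factor applied to weak filters each call.
    Returns the updated weights and the indices that were attenuated.
    """
    out = weights.copy()
    # L1 norm per filter as the pruning criterion (one common choice).
    norms = np.abs(out).sum(axis=(1, 2, 3))
    n_weak = int(len(norms) * frac)
    weak = np.argsort(norms)[:n_weak]
    # Attenuation instead of hard removal: weak filters shrink gradually,
    # so a filter whose magnitude recovers during training can still return.
    out[weak] *= gamma
    return out, weak

w = np.random.randn(8, 3, 3, 3)          # 8 filters, 3 input channels, 3x3
w2, weak_idx = attenuate_weak_filters(w)
```

Repeating this step across training epochs shrinks persistently weak filters toward zero, at which point they can be removed with little effect on the output, while filters that regain magnitude escape removal.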
Related papers
- Structured Network Pruning by Measuring Filter-wise Interactions [6.037167142826297]
We propose a structured network pruning approach, SNPFI (Structured Network Pruning by measuring Filter-wise Interaction).
During the pruning, the SNPFI can automatically assign the proper sparsity based on the filter utilization strength.
We empirically demonstrate the effectiveness of the SNPFI with several commonly used CNN models.
arXiv Detail & Related papers (2023-07-03T05:26:05Z) - Efficient CNNs via Passive Filter Pruning [23.661189257759535]
Convolutional neural networks (CNNs) have shown state-of-the-art performance in various applications.
CNNs are resource-hungry due to their requirement of high computational complexity and memory storage.
Recent efforts toward achieving computational efficiency in CNNs involve filter pruning methods.
arXiv Detail & Related papers (2023-04-05T09:19:19Z) - IterativePFN: True Iterative Point Cloud Filtering [18.51768749680731]
A fundamental 3D vision task is the removal of noise, known as point cloud filtering or denoising.
We propose IterativePFN (iterative point cloud filtering network), which consists of multiple iterations that model the true iterative filtering process internally.
Our method is able to obtain better performance compared to state-of-the-art methods.
arXiv Detail & Related papers (2023-04-04T04:47:44Z) - Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters [151.2423480789271]
A novel pruning method, termed CLR-RNF, is proposed for filter-level network pruning.
We conduct image classification on CIFAR-10 and ImageNet to demonstrate the superiority of our CLR-RNF over state-of-the-art methods.
arXiv Detail & Related papers (2022-02-15T04:53:24Z) - Batch Normalization Tells You Which Filter is Important [49.903610684578716]
We propose a simple yet effective filter pruning method by evaluating the importance of each filter based on the BN parameters of pre-trained CNNs.
The experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method can achieve outstanding performance.
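The BN-based scoring above can be sketched as ranking filters by the absolute batch-normalization scale parameter. This is an assumed reading of the one-line summary; the function name, the `prune_ratio` parameter, and the use of |gamma| alone (rather than other BN statistics) are illustrative, not taken from the paper.

```python
import numpy as np

def bn_filter_ranking(bn_gamma, prune_ratio=0.5):
    """Rank filters by |gamma| from a pre-trained BN layer and mark the
    least important ones for pruning (illustrative sketch)."""
    scores = np.abs(bn_gamma)              # importance proxy: |BN scale|
    n_prune = int(len(scores) * prune_ratio)
    order = np.argsort(scores)             # ascending: least important first
    pruned = set(order[:n_prune])
    return np.array([i not in pruned for i in range(len(scores))])

gamma = np.array([0.9, 0.01, 0.5, 0.02, 1.3, 0.7])  # hypothetical BN scales
keep_mask = bn_filter_ranking(gamma)
```

The intuition is that a filter whose BN scale is near zero contributes little to the layer output, so it is a cheap importance signal that needs no extra data passes.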
arXiv Detail & Related papers (2021-12-02T12:04:59Z) - Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient.
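For context, classic unsharp masking, which the formulation above builds on, computes output = input + lambda * (input - lowpass(input)). The 1-D sketch below shows this baseline with a box low-pass filter; the paper's guided-filter variant estimates the coefficient (fixed here as `lam`) rather than hand-tuning it, and the kernel size and padding mode here are assumptions.

```python
import numpy as np

def box_lowpass(x, k=3):
    """Simple 1-D box (moving-average) low-pass filter with edge padding."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    return np.convolve(xp, np.ones(k) / k, mode="valid")

def unsharp_mask(x, lam=1.0, k=3):
    """Classic unsharp masking: amplify the high-frequency residual.
    output = x + lam * (x - lowpass(x))"""
    return x + lam * (x - box_lowpass(x, k))

step = np.array([0., 0., 0., 1., 1., 1.])   # an edge in a 1-D signal
sharp = unsharp_mask(step)                  # overshoots around the edge
```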
arXiv Detail & Related papers (2021-06-02T19:15:34Z) - Data Agnostic Filter Gating for Efficient Deep Networks [72.4615632234314]
Current filter pruning methods mainly leverage feature maps to generate important scores for filters and prune those with smaller scores.
In this paper, we propose a data-agnostic filter pruning method that uses an auxiliary network named Dagger module to induce pruning.
In addition, to help prune filters with certain FLOPs constraints, we leverage an explicit FLOPs-aware regularization to directly promote pruning filters toward target FLOPs.
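A FLOPs-aware regularization of the kind mentioned above can be sketched as a penalty on the gap between the masked layer's FLOPs and a target budget. The FLOPs formula (multiply-accumulates of a plain convolution) and all parameter names here are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def conv_flops(keep_mask, in_ch, k, h, w):
    """Approximate FLOPs of one conv layer given a binary filter-keep mask:
    2 * kept_filters * in_channels * k^2 * output_height * output_width."""
    kept = int(np.sum(keep_mask))
    return 2 * kept * in_ch * k * k * h * w

def flops_penalty(keep_mask, target_flops, in_ch=64, k=3, h=32, w=32,
                  lam=1e-9):
    """Regularizer pushing the pruned model toward a target FLOPs budget."""
    return lam * abs(conv_flops(keep_mask, in_ch, k, h, w) - target_flops)
```

Adding such a term to the task loss lets the optimizer trade accuracy against a hard compute budget instead of pruning a fixed fraction of filters per layer.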
arXiv Detail & Related papers (2020-10-28T15:26:40Z) - Filter Grafting for Deep Neural Networks: Reason, Method, and
Cultivation [86.91324735966766]
The filter is the key component in modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting to achieve this goal.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z) - Filter Grafting for Deep Neural Networks [71.39169475500324]
Filter grafting aims to improve the representation capability of Deep Neural Networks (DNNs).
We develop an entropy-based criterion to measure the information of filters and an adaptive weighting strategy for balancing the grafted information among networks.
For example, the grafted MobileNetV2 outperforms the non-grafted MobileNetV2 by about 7 percent on the CIFAR-100 dataset.
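The entropy-based criterion mentioned above can be sketched as the Shannon entropy of a filter's weight distribution: a filter whose weights are nearly constant carries little information. The binning scheme and base-2 logarithm below are illustrative choices; the paper's exact criterion may differ.

```python
import numpy as np

def filter_entropy(filt, bins=10):
    """Shannon entropy (bits) of a filter's weight histogram, used as a
    proxy for how much information the filter carries (illustrative)."""
    hist, _ = np.histogram(filt.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

dead = filter_entropy(np.zeros((3, 3, 3)))       # constant filter: 0 bits
alive = filter_entropy(np.random.randn(3, 3, 3)) # spread weights: > 0 bits
```

Under this criterion, low-entropy filters are candidates to receive grafted information from other networks rather than being discarded.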
arXiv Detail & Related papers (2020-01-15T03:18:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.