ClassPruning: Speed Up Image Restoration Networks by Dynamic N:M Pruning
- URL: http://arxiv.org/abs/2211.05488v1
- Date: Thu, 10 Nov 2022 11:14:15 GMT
- Title: ClassPruning: Speed Up Image Restoration Networks by Dynamic N:M Pruning
- Authors: Yang Zhou, Yuda Song, Hui Qian, Xin Du
- Abstract summary: ClassPruning can help existing methods save approximately 40% FLOPs while maintaining performance.
We propose a novel training strategy along with two additional loss terms to stabilize training and improve performance.
- Score: 25.371802581339576
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image restoration tasks have achieved tremendous performance improvements
with the rapid advancement of deep neural networks. However, most prevalent
deep learning models perform inference statically, ignoring that different
images have varying restoration difficulties and lightly degraded images can be
well restored by slimmer subnetworks. To this end, we propose a new solution
pipeline dubbed ClassPruning that utilizes networks with different capabilities
to process images with varying restoration difficulties. In particular, we use
a lightweight classifier to identify the image restoration difficulty, and then
the sparse subnetworks with different capabilities can be sampled based on
predicted difficulty by performing dynamic N:M fine-grained structured pruning
on base restoration networks. We further propose a novel training strategy
along with two additional loss terms to stabilize training and improve
performance. Experiments demonstrate that ClassPruning can help existing
methods save approximately 40% FLOPs while maintaining performance.
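The pipeline above can be sketched in a few lines: a predicted restoration difficulty selects an N:M ratio, and pruning keeps only the N largest-magnitude weights in every group of M consecutive weights. This is a minimal illustration of the idea, not the authors' implementation; the difficulty-to-ratio mapping and the function names are assumptions.

```python
def nm_prune(weights, n, m):
    """N:M fine-grained structured pruning: in every group of M consecutive
    weights, keep the N largest-magnitude entries and zero out the rest."""
    assert len(weights) % m == 0, "weight count must be divisible by M"
    pruned = []
    for g in range(0, len(weights), m):
        group = weights[g:g + m]
        # indices of the N largest-magnitude weights in this group
        keep = set(sorted(range(m), key=lambda i: -abs(group[i]))[:n])
        pruned.extend(w if i in keep else 0.0 for i, w in enumerate(group))
    return pruned

def select_subnetwork(difficulty, weights, m=4):
    """Sample a sparser subnetwork for easier images (hypothetical mapping:
    difficulty 0 = lightly degraded, 2 = heavily degraded)."""
    n = {0: 1, 1: 2, 2: 4}[difficulty]  # easier inputs keep fewer weights
    return nm_prune(weights, n, m)

# Example: 2:4 sparsity, the pattern accelerated by sparse tensor cores
w = [0.9, -0.1, 0.3, -0.7, 0.2, 0.5, -0.6, 0.05]
print(nm_prune(w, n=2, m=4))  # → [0.9, 0.0, 0.0, -0.7, 0.0, 0.5, -0.6, 0.0]
```

In the full method, the lightweight classifier would supply `difficulty`, and the mask would be applied per layer of the base restoration network rather than to a flat weight list.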
Related papers
- Multi-Scale Representation Learning for Image Restoration with State-Space Model [13.622411683295686]
We propose a novel Multi-Scale State-Space Model-based network (MS-Mamba) for efficient image restoration.
Our proposed method achieves new state-of-the-art performance while maintaining low computational complexity.
arXiv Detail & Related papers (2024-08-19T16:42:58Z)
- Unified-Width Adaptive Dynamic Network for All-In-One Image Restoration [50.81374327480445]
We introduce a novel concept positing that intricate image degradation can be represented in terms of elementary degradations.
We propose the Unified-Width Adaptive Dynamic Network (U-WADN), consisting of two pivotal components: a Width Adaptive Backbone (WAB) and a Width Selector (WS)
The proposed U-WADN achieves better performance while simultaneously reducing up to 32.3% of FLOPs and providing approximately 15.7% real-time acceleration.
arXiv Detail & Related papers (2024-01-24T04:25:12Z)
- DGNet: Dynamic Gradient-Guided Network for Water-Related Optics Image Enhancement [77.0360085530701]
Underwater image enhancement (UIE) is a challenging task due to the complex degradation caused by underwater environments.
Previous methods often idealize the degradation process, and neglect the impact of medium noise and object motion on the distribution of image features.
Our approach utilizes predicted images to dynamically update pseudo-labels, adding a dynamic gradient to optimize the network's gradient space.
arXiv Detail & Related papers (2023-12-12T06:07:21Z) - Wide & deep learning for spatial & intensity adaptive image restoration [16.340992967330603]
We propose an ingenious and efficient multi-frame image restoration network (DparNet) with wide & deep architecture.
The degradation prior is directly learned from degraded images in the form of a key degradation parameter matrix.
The wide & deep architecture in DparNet enables the learned parameters to directly modulate the final restoring results.
arXiv Detail & Related papers (2023-05-30T03:24:09Z) - Random Weights Networks Work as Loss Prior Constraint for Image
Restoration [50.80507007507757]
We present our belief that "Random Weights Networks can act as a Loss Prior Constraint for Image Restoration".
Our belief can be directly inserted into existing networks without any additional training or testing computational cost.
To emphasize, our main focus is to spark interest in the realm of loss functions and remedy their currently neglected status.
arXiv Detail & Related papers (2022-07-03T16:32:15Z)
- Variational Deep Image Restoration [20.195082841065947]
This paper presents a new variational inference framework for image restoration and a convolutional neural network (CNN) structure that can solve the restoration problems described by the proposed framework.
Specifically, our method delivers state-of-the-art performance on Gaussian denoising, real-world noise reduction, blind image super-resolution, and JPEG compression artifacts reduction.
arXiv Detail & Related papers (2022-04-26T12:44:55Z)
- Attentive Fine-Grained Structured Sparsity for Image Restoration [63.35887911506264]
N:M structured pruning has emerged as one of the effective and practical pruning approaches for making models efficient under accuracy constraints.
We propose a novel pruning method that determines the pruning ratio for N:M structured sparsity at each layer.
arXiv Detail & Related papers (2021-11-23T15:08:26Z)
- Is Deep Image Prior in Need of a Good Education? [57.3399060347311]
Deep image prior was introduced as an effective prior for image reconstruction.
Despite its impressive reconstructive properties, the approach is slow when compared to learned or traditional reconstruction techniques.
We develop a two-stage learning paradigm to address the computational challenge.
arXiv Detail & Related papers (2021-05-20T10:24:29Z)
- Content-adaptive Representation Learning for Fast Image Super-resolution [6.5468866820512215]
We address the efficiency issue in image SR by incorporating a patch-wise rolling network to content-adaptively recover images according to difficulty levels.
In contrast to existing studies that ignore difficulty diversity, we adopt different stages of a neural network to perform image restoration.
Our model not only shows significant acceleration but also maintains state-of-the-art performance.
arXiv Detail & Related papers (2021-01-23T23:47:03Z)
- Learning degraded image classification with restoration data fidelity [0.0]
We investigate the influence of degradation types and levels on four widely-used classification networks.
We propose a novel method leveraging a fidelity map to calibrate the image features obtained by pre-trained networks.
Our results reveal that the proposed method is a promising solution to mitigate the effect caused by image degradation.
arXiv Detail & Related papers (2020-03-11T17:09:12Z)
- BP-DIP: A Backprojection based Deep Image Prior [49.375539602228415]
We combine two image restoration approaches: (i) Deep Image Prior (DIP), which trains a convolutional neural network (CNN) from scratch at test time using the degraded image; and (ii) a backprojection (BP) fidelity term, which is an alternative to the standard least squares loss usually used in previous DIP works.
We demonstrate the performance of the proposed method, termed BP-DIP, on the deblurring task and show its advantages over the plain DIP, with both higher PSNR values and better inference run-time.
arXiv Detail & Related papers (2020-03-11T17:09:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.