DDR: Exploiting Deep Degradation Response as Flexible Image Descriptor
- URL: http://arxiv.org/abs/2406.08377v1
- Date: Wed, 12 Jun 2024 16:26:56 GMT
- Title: DDR: Exploiting Deep Degradation Response as Flexible Image Descriptor
- Authors: Juncheng Wu, Zhangkai Ni, Hanli Wang, Wenhan Yang, Yuyin Zhou, Shiqi Wang
- Abstract summary: We present Deep Degradation Response (DDR), a method to quantify changes in image deep features under varying degradation conditions.
Extensive evaluations demonstrate the versatility of DDR as an image descriptor, with strong correlations observed with key image attributes.
DDR serves as an effective unsupervised learning objective in image restoration tasks, yielding notable advancements in image deblurring and single-image super-resolution.
- Score: 54.90762380689283
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image deep features extracted by pre-trained networks are known to contain rich and informative representations. In this paper, we present Deep Degradation Response (DDR), a method to quantify changes in image deep features under varying degradation conditions. Specifically, our approach facilitates flexible and adaptive degradation, enabling the controlled synthesis of image degradation through text-driven prompts. Extensive evaluations demonstrate the versatility of DDR as an image descriptor, with strong correlations observed with key image attributes such as complexity, colorfulness, sharpness, and overall quality. Moreover, we demonstrate the efficacy of DDR across a spectrum of applications. It excels as a blind image quality assessment metric, outperforming existing methodologies across multiple datasets. Additionally, DDR serves as an effective unsupervised learning objective in image restoration tasks, yielding notable advancements in image deblurring and single-image super-resolution. Our code will be made available.
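The core idea of DDR — measuring how much an image's deep features shift when the image is degraded — can be sketched in a few lines. The paper's actual pretrained feature extractor and text-driven degradation synthesis are not specified here, so this toy substitutes a fixed random linear projection for deep features and additive Gaussian noise for the degradation; the names `extract_features`, `degrade`, and `ddr_score` are illustrative, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained deep feature extractor: a fixed random
# linear projection of the flattened 32x32 image (illustrative only).
W = rng.standard_normal((64, 32 * 32))

def extract_features(img: np.ndarray) -> np.ndarray:
    return W @ img.reshape(-1)

def degrade(img: np.ndarray, sigma: float) -> np.ndarray:
    # Toy degradation: additive Gaussian noise. The paper instead
    # synthesizes controlled degradations via text-driven prompts.
    return img + rng.standard_normal(img.shape) * sigma

def ddr_score(img: np.ndarray, sigma: float = 0.1) -> float:
    """Cosine distance between features of the clean and degraded image."""
    f0 = extract_features(img)
    f1 = extract_features(degrade(img, sigma))
    cos = f0 @ f1 / (np.linalg.norm(f0) * np.linalg.norm(f1))
    return 1.0 - float(cos)

img = rng.random((32, 32))
print(ddr_score(img))  # larger values = features respond more strongly to degradation
```

In this reading, an image whose features barely move under degradation yields a small response, which is what allows the response magnitude to correlate with attributes such as sharpness and overall quality.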
Related papers
- Preserving Full Degradation Details for Blind Image Super-Resolution [40.152015542099704]
We propose an alternative to learn degradation representations through reproducing degraded low-resolution (LR) images.
By guiding the degrader to reconstruct input LR images, full degradation information can be encoded into the representations.
Experiments show that our representations can extract accurate and highly robust degradation information.
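The objective described above — encode degradation information by making a degrader reproduce the observed LR image — can be illustrated with a toy reconstruction loss. This is a minimal sketch under strong assumptions: the "representation" is taken to be a blur kernel itself and the degrader is a 1-D convolution, whereas the paper uses learned networks; all function names are hypothetical.

```python
import numpy as np

def degrader(hr: np.ndarray, rep: np.ndarray) -> np.ndarray:
    # Toy degrader: the representation is interpreted directly as a blur
    # kernel (in the paper, the representation conditions a learned network).
    return np.convolve(hr, rep, mode="same")

def reconstruction_loss(hr: np.ndarray, lr: np.ndarray, rep: np.ndarray) -> float:
    # Guiding signal: how well the degrader reproduces the observed LR image.
    return float(np.mean((degrader(hr, rep) - lr) ** 2))

rng = np.random.default_rng(1)
hr = rng.random(64)
true_kernel = np.array([0.25, 0.5, 0.25])
lr = degrader(hr, true_kernel)

# The loss vanishes exactly when the representation matches the true
# degradation, so minimizing it forces full degradation information
# into the representation.
print(reconstruction_loss(hr, lr, true_kernel))      # -> 0.0
print(reconstruction_loss(hr, lr, np.array([1.0])))  # > 0 (wrong degradation)
```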
arXiv Detail & Related papers (2024-07-01T13:54:59Z)
- Unified-Width Adaptive Dynamic Network for All-In-One Image Restoration [50.81374327480445]
We introduce a novel concept positing that intricate image degradation can be represented in terms of elementary degradation.
We propose the Unified-Width Adaptive Dynamic Network (U-WADN), consisting of two pivotal components: a Width Adaptive Backbone (WAB) and a Width Selector (WS).
The proposed U-WADN achieves better performance while simultaneously reducing up to 32.3% of FLOPs and providing approximately 15.7% real-time acceleration.
arXiv Detail & Related papers (2024-01-24T04:25:12Z)
- Prompt-based Ingredient-Oriented All-in-One Image Restoration [0.0]
We propose a novel data ingredient-oriented approach to tackle multiple image degradation tasks.
Specifically, we utilize an encoder to capture features and introduce prompts with degradation-specific information to guide the decoder.
Our method performs competitively to the state-of-the-art.
arXiv Detail & Related papers (2023-09-06T15:05:04Z)
- All-in-one Multi-degradation Image Restoration Network via Hierarchical Degradation Representation [47.00239809958627]
We propose a novel All-in-one Multi-degradation Image Restoration Network (AMIRNet)
AMIRNet learns a degradation representation for unknown degraded images by progressively constructing a tree structure through clustering.
This tree-structured representation explicitly reflects the consistency and discrepancy of various distortions, providing a specific clue for image restoration.
arXiv Detail & Related papers (2023-08-06T04:51:41Z)
- RBSR: Efficient and Flexible Recurrent Network for Burst Super-Resolution [57.98314517861539]
Burst super-resolution (BurstSR) aims at reconstructing a high-resolution (HR) image from a sequence of low-resolution (LR) and noisy images.
In this paper, we suggest fusing cues frame-by-frame with an efficient and flexible recurrent network.
arXiv Detail & Related papers (2023-06-30T12:14:13Z)
- Rethinking Super-Resolution as Text-Guided Details Generation [21.695227836312835]
We propose a Text-Guided Super-Resolution (TGSR) framework, which can effectively utilize the information from the text and image modalities.
The proposed TGSR could generate HR image details that match the text descriptions through a coarse-to-fine process.
arXiv Detail & Related papers (2022-07-14T01:46:38Z)
- Hierarchical Similarity Learning for Aliasing Suppression Image Super-Resolution [64.15915577164894]
A hierarchical image super-resolution network (HSRNet) is proposed to suppress the influence of aliasing.
HSRNet achieves better quantitative and visual performance than other works, and remits the aliasing more effectively.
arXiv Detail & Related papers (2022-06-07T14:55:32Z)
- Efficient and Degradation-Adaptive Network for Real-World Image Super-Resolution [28.00231586840797]
Real-world image super-resolution (Real-ISR) is a challenging task due to the unknown complex degradation of real-world images.
Recent research on Real-ISR has achieved significant progress by modeling the image degradation space.
We propose an efficient degradation-adaptive super-resolution (DASR) network, whose parameters are adaptively specified by estimating the degradation of each input image.
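Specifying network parameters per input from an estimated degradation, as DASR's summary describes, can be sketched as a convex combination of a small bank of "expert" parameters. This is a hedged toy, not DASR's architecture: the expert bank, the statistics-based degradation estimator, and the names `estimate_degradation` and `adaptive_params` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# A small bank of "expert" parameter vectors; DASR blends the parameters
# of several convolutional experts (this flat bank is a stand-in).
experts = rng.standard_normal((3, 5))  # 3 experts, 5 parameters each

def estimate_degradation(lr: np.ndarray) -> np.ndarray:
    # Hypothetical degradation estimator: softmax over simple image
    # statistics (a real estimator would be a learned network).
    stats = np.array([lr.mean(), lr.std(), np.abs(np.diff(lr)).mean()])
    e = np.exp(stats - stats.max())
    return e / e.sum()

def adaptive_params(lr: np.ndarray) -> np.ndarray:
    # Parameters specified per input as a convex combination of the
    # experts, weighted by the estimated degradation.
    w = estimate_degradation(lr)
    return w @ experts

lr = rng.random(32)
print(adaptive_params(lr).shape)  # (5,)
```

The design point is that inference cost stays near a single expert while the effective parameters still track each input's degradation.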
arXiv Detail & Related papers (2022-03-27T05:59:13Z)
- Towards Unsupervised Deep Image Enhancement with Generative Adversarial Network [92.01145655155374]
We present an unsupervised image enhancement generative network (UEGAN).
It learns the corresponding image-to-image mapping from a set of images with desired characteristics in an unsupervised manner.
Results show that the proposed model effectively improves the aesthetic quality of images.
arXiv Detail & Related papers (2020-12-30T03:22:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.