Bidirectional Multi-Scale Implicit Neural Representations for Image Deraining
- URL: http://arxiv.org/abs/2404.01547v1
- Date: Tue, 2 Apr 2024 01:18:16 GMT
- Title: Bidirectional Multi-Scale Implicit Neural Representations for Image Deraining
- Authors: Xiang Chen, Jinshan Pan, Jiangxin Dong
- Abstract summary: We develop an end-to-end multi-scale Transformer to facilitate high-quality image reconstruction.
We incorporate intra-scale implicit neural representations based on pixel coordinates with the degraded inputs in a closed-loop design.
Our approach, named NeRD-Rain, performs favorably against state-of-the-art methods on both synthetic and real-world benchmark datasets.
- Score: 47.15857899099733
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: How to effectively explore multi-scale representations of rain streaks is important for image deraining. In contrast to existing Transformer-based methods that depend mostly on single-scale rain appearance, we develop an end-to-end multi-scale Transformer that leverages the potentially useful features at various scales to facilitate high-quality image reconstruction. To better explore the common degradation representations from spatially-varying rain streaks, we incorporate intra-scale implicit neural representations based on pixel coordinates with the degraded inputs in a closed-loop design, enabling the learned features to facilitate rain removal and improve the robustness of the model in complex scenarios. To ensure richer collaborative representation from different scales, we embed a simple yet effective inter-scale bidirectional feedback operation into our multi-scale Transformer by performing coarse-to-fine and fine-to-coarse information communication. Extensive experiments demonstrate that our approach, named NeRD-Rain, performs favorably against state-of-the-art methods on both synthetic and real-world benchmark datasets. The source code and trained models are available at https://github.com/cschenxiang/NeRD-Rain.
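A minimal illustrative sketch of the two ideas named in the abstract: an intra-scale implicit neural representation driven by pixel coordinates, and a bidirectional coarse-to-fine and fine-to-coarse exchange between two scales. All module names, sizes, and layer choices below are assumptions for illustration and are not taken from the released NeRD-Rain code.

```python
# Hedged PyTorch-style sketch: intra-scale coordinate-based INR plus inter-scale
# bidirectional feedback. Assumes even spatial sizes so the 2x down/upsample matches.
import torch
import torch.nn as nn
import torch.nn.functional as F

def pixel_coords(h, w, device):
    """Normalized (x, y) coordinates in [-1, 1], shape (1, 2, h, w)."""
    ys = torch.linspace(-1, 1, h, device=device)
    xs = torch.linspace(-1, 1, w, device=device)
    gy, gx = torch.meshgrid(ys, xs, indexing="ij")
    return torch.stack((gx, gy), dim=0).unsqueeze(0)

class IntraScaleINR(nn.Module):
    """Coordinate MLP conditioned on features of the degraded input (closed-loop idea)."""
    def __init__(self, feat_dim=32, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(2 + feat_dim, hidden, 1), nn.GELU(),
            nn.Conv2d(hidden, feat_dim, 1),
        )

    def forward(self, feat):
        b, _, h, w = feat.shape
        coords = pixel_coords(h, w, feat.device).expand(b, -1, -1, -1)
        return feat + self.mlp(torch.cat((coords, feat), dim=1))  # residual refinement

class BidirectionalMultiScale(nn.Module):
    """Coarse-to-fine then fine-to-coarse information exchange across two scales."""
    def __init__(self, feat_dim=32):
        super().__init__()
        self.inr_coarse = IntraScaleINR(feat_dim)
        self.inr_fine = IntraScaleINR(feat_dim)
        self.fuse_fine = nn.Conv2d(2 * feat_dim, feat_dim, 3, padding=1)
        self.fuse_coarse = nn.Conv2d(2 * feat_dim, feat_dim, 3, padding=1)

    def forward(self, feat_fine):
        feat_coarse = F.avg_pool2d(feat_fine, 2)
        feat_coarse = self.inr_coarse(feat_coarse)
        up = F.interpolate(feat_coarse, scale_factor=2, mode="bilinear", align_corners=False)
        feat_fine = self.inr_fine(self.fuse_fine(torch.cat((feat_fine, up), dim=1)))   # coarse-to-fine
        down = F.avg_pool2d(feat_fine, 2)
        feat_coarse = self.fuse_coarse(torch.cat((feat_coarse, down), dim=1))          # fine-to-coarse
        return feat_fine, feat_coarse
```

A real implementation would repeat this exchange across more scales and inside the multi-scale Transformer blocks described in the abstract; the sketch only shows the coordinate conditioning and the two directions of feedback.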
Related papers
- Dual-Path Multi-Scale Transformer for High-Quality Image Deraining [1.7104836047593197]
We propose a dual-path multi-scale Transformer (DPMformer) for high-quality image reconstruction.
This method consists of a backbone path and two branch paths from two different multi-scale approaches.
Our method achieves promising performance compared to other state-of-the-art methods.
arXiv Detail & Related papers (2024-05-28T12:31:23Z)
- RainyScape: Unsupervised Rainy Scene Reconstruction using Decoupled Neural Rendering [50.14860376758962]
We propose RainyScape, an unsupervised framework for reconstructing clean scenes from a collection of multi-view rainy images.
Based on the spectral bias property of neural networks, we first optimize the neural rendering pipeline to obtain a low-frequency scene representation.
We jointly optimize the two modules, driven by the proposed adaptive direction-sensitive gradient-based reconstruction loss.
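A hedged sketch of what a direction-sensitive, gradient-based reconstruction loss can look like; the finite-difference directions and the adaptive weighting below are assumptions for illustration and not RainyScape's exact formulation.

```python
# Illustrative direction-sensitive gradient loss (assumed form, not the paper's code).
import torch
import torch.nn.functional as F

def directional_gradients(img):
    """Finite differences along horizontal, vertical and both diagonals.
    img: (B, C, H, W) -> (B, 4, C, H-2, W-2)."""
    c = img[:, :, 1:-1, 1:-1]
    return torch.stack([
        img[:, :, 1:-1, 2:] - c,   # horizontal
        img[:, :, 2:, 1:-1] - c,   # vertical
        img[:, :, 2:, 2:] - c,     # diagonal, lower-right
        img[:, :, 2:, :-2] - c,    # diagonal, lower-left
    ], dim=1)

def direction_sensitive_loss(pred, target, eps=1e-6):
    gp, gt = directional_gradients(pred), directional_gradients(target)
    per_dir_err = (gp - gt).abs().mean(dim=(2, 3, 4))        # (B, 4) error per direction
    # Assumed adaptive weighting: emphasize directions where the reference has weaker
    # gradients, on the guess that strong directional gradients come from rain streaks.
    weights = 1.0 / (gt.abs().mean(dim=(2, 3, 4)) + eps)
    weights = weights / weights.sum(dim=1, keepdim=True)
    return F.l1_loss(pred, target) + (weights * per_dir_err).sum(dim=1).mean()
```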
arXiv Detail & Related papers (2024-04-17T14:07:22Z)
- Look-Around Before You Leap: High-Frequency Injected Transformer for Image Restoration [46.96362010335177]
In this paper, we propose HIT, a simple yet effective High-frequency Injected Transformer for image restoration.
Specifically, we design a window-wise injection module (WIM), which incorporates abundant high-frequency details into the feature map, to provide reliable references for restoring high-quality images.
In addition, we introduce a spatial enhancement unit (SEU) to preserve essential spatial relationships that may be lost due to the computations carried out across channel dimensions in the BIM.
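A rough sketch of injecting high-frequency content from the degraded input into a feature map, in the spirit of the injection module described above; the box-blur high-pass and the fusion layer are assumptions, and the window-wise partitioning of the actual WIM is omitted for brevity.

```python
# Illustrative high-frequency injection (assumed design, not HIT's WIM).
import torch
import torch.nn as nn
import torch.nn.functional as F

class HighFreqInjection(nn.Module):
    def __init__(self, feat_dim=48, blur_ksize=5):
        super().__init__()
        self.blur_ksize = blur_ksize
        self.embed = nn.Conv2d(3, feat_dim, 3, padding=1)
        self.fuse = nn.Conv2d(2 * feat_dim, feat_dim, 1)

    def forward(self, feat, image):
        # High-pass residue: image minus a simple box blur (stand-in for a proper filter).
        k = self.blur_ksize
        low = F.avg_pool2d(image, k, stride=1, padding=k // 2)
        high = image - low
        high = F.interpolate(high, size=feat.shape[-2:], mode="bilinear", align_corners=False)
        return self.fuse(torch.cat((feat, self.embed(high)), dim=1))
```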
arXiv Detail & Related papers (2024-03-30T08:05:00Z)
- Contrastive Learning Based Recursive Dynamic Multi-Scale Network for Image Deraining [47.764883957379745]
Rain streaks significantly decrease the visibility of captured images.
Existing deep learning-based image deraining methods employ manually crafted networks and learn a straightforward projection from rainy images to clear images.
We propose a contrastive learning-based image deraining method that investigates the correlation between rainy and clear images.
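A minimal sketch of how a contrastive term can be attached to a deraining network: pull the derained result toward the clean image and push it away from the rainy input in a feature space. The frozen encoder and the InfoNCE-style form are assumptions, not the paper's exact loss.

```python
# Illustrative contrastive term for deraining (assumed form).
import torch
import torch.nn.functional as F

def contrastive_derain_loss(encoder, derained, clean, rainy, tau=0.1):
    """encoder: any frozen feature extractor mapping images to (B, D) embeddings."""
    with torch.no_grad():
        z_clean = F.normalize(encoder(clean), dim=1)   # positive anchor
        z_rainy = F.normalize(encoder(rainy), dim=1)   # negative anchor
    z_out = F.normalize(encoder(derained), dim=1)
    pos = (z_out * z_clean).sum(dim=1) / tau
    neg = (z_out * z_rainy).sum(dim=1) / tau
    logits = torch.stack((pos, neg), dim=1)            # (B, 2)
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)             # InfoNCE with one negative per sample
```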
arXiv Detail & Related papers (2023-05-29T13:51:41Z)
- Online-updated High-order Collaborative Networks for Single Image Deraining [51.22694467126883]
Single image deraining is an important task for some downstream artificial intelligence applications such as video surveillance and self-driving systems.
We propose a high-order collaborative network with multi-scale compact constraints and a bidirectional scale-content similarity mining module.
Our proposed method performs favorably against eleven state-of-the-art methods on five public synthetic datasets and one real-world dataset.
arXiv Detail & Related papers (2022-02-14T09:09:08Z)
- CSformer: Bridging Convolution and Transformer for Compressive Sensing [65.22377493627687]
This paper proposes a hybrid framework that integrates the detailed spatial information captured by CNNs with the global context provided by Transformers for enhanced representation learning.
The proposed approach is an end-to-end compressive image sensing method, composed of adaptive sampling and recovery.
The experimental results demonstrate the effectiveness of the dedicated transformer-based architecture for compressive sensing.
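An illustrative sketch of block-based compressive sampling with a learned measurement operator, a linear initial reconstruction, and a deep refinement stage; the layer choices are placeholders rather than CSformer's architecture.

```python
# Illustrative block-based compressive sensing pipeline (assumed layers).
import torch
import torch.nn as nn

class BlockCompressiveSensing(nn.Module):
    def __init__(self, block=32, ratio=0.1, ch=1):
        super().__init__()
        m = max(1, int(ratio * block * block * ch))      # measurements per block
        self.sample = nn.Conv2d(ch, m, block, stride=block, bias=False)          # learned sampling
        self.init_recover = nn.ConvTranspose2d(m, ch, block, stride=block, bias=False)
        self.refine = nn.Sequential(                     # stand-in for the CNN + Transformer stage
            nn.Conv2d(ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, ch, 3, padding=1),
        )

    def forward(self, x):
        y = self.sample(x)                 # adaptive (learned) block-wise measurements
        x0 = self.init_recover(y)          # initial linear reconstruction
        return x0 + self.refine(x0)        # deep refinement of the initial estimate
```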
arXiv Detail & Related papers (2021-12-31T04:37:11Z)
- Multi-Scale Hourglass Hierarchical Fusion Network for Single Image Deraining [8.964751500091005]
Rain streaks bring serious blurring and visual quality degradation, which often vary in size, direction and density.
Current CNN-based methods achieve encouraging performance but are limited in depicting rain characteristics and recovering image details in poor-visibility conditions.
We present a Multi-scale Hourglass Hierarchical Fusion Network (MH2F-Net) in an end-to-end manner to accurately capture rain streak features through multi-scale extraction, hierarchical distillation, and information aggregation.
arXiv Detail & Related papers (2021-04-25T08:27:01Z)
- Single Image Deraining via Scale-space Invariant Attention Neural Network [58.5284246878277]
We tackle the notion of scale, which deals with visual changes in the appearance of rain streaks with respect to the camera.
We propose to represent the multi-scale correlation in convolutional feature domain, which is more compact and robust than that in pixel domain.
In this way, we summarize the most activated responses of the feature maps as the salient features.
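One hedged reading of summarizing the most activated responses across scales: extract features of the input at several resolutions with shared weights and keep the per-position maximum as a saliency map. The layer choices below are assumptions for illustration.

```python
# Illustrative max-over-scales saliency (assumed design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleMaxSaliency(nn.Module):
    def __init__(self, ch=32, scales=(1.0, 0.5, 0.25)):
        super().__init__()
        self.scales = scales
        self.extract = nn.Sequential(nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True))
        self.to_attn = nn.Conv2d(ch, ch, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        feats = []
        for s in self.scales:                          # shared extractor across scales
            xi = x if s == 1.0 else F.interpolate(x, scale_factor=s, mode="bilinear",
                                                  align_corners=False)
            fi = self.extract(xi)
            feats.append(F.interpolate(fi, size=(h, w), mode="bilinear", align_corners=False))
        salient = torch.stack(feats, dim=0).max(dim=0).values   # most activated across scales
        return torch.sigmoid(self.to_attn(salient))             # attention map in [0, 1]
```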
arXiv Detail & Related papers (2020-06-09T04:59:26Z)
- Multi-Scale Progressive Fusion Network for Single Image Deraining [84.0466298828417]
Rain streaks appear with varying degrees of blur and at varying resolutions due to their different distances from the camera.
Similar rain patterns are visible in a rain image as well as its multi-scale (or multi-resolution) versions.
In this work, we explore the multi-scale collaborative representation for rain streaks from the perspective of input image scales and hierarchical deep features.
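A hedged sketch of progressive coarse-to-fine fusion over an input image pyramid; the layers are placeholders and not the paper's exact network.

```python
# Illustrative progressive pyramid fusion for deraining (assumed layers).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProgressivePyramidFusion(nn.Module):
    def __init__(self, ch=32, levels=3):
        super().__init__()
        self.levels = levels
        # Coarsest level sees only the image; finer levels also take the upsampled features.
        self.encode = nn.ModuleList([
            nn.Conv2d(3 + (ch if i < levels - 1 else 0), ch, 3, padding=1)
            for i in range(levels)
        ])
        self.to_rain = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, x):
        # Image pyramid from fine (index 0) to coarse (index levels - 1).
        pyramid = [x]
        for _ in range(self.levels - 1):
            pyramid.append(F.avg_pool2d(pyramid[-1], 2))
        feat = None
        for i in reversed(range(self.levels)):         # coarse-to-fine, fusing previous features
            inp = pyramid[i]
            if feat is not None:
                feat = F.interpolate(feat, size=inp.shape[-2:], mode="bilinear", align_corners=False)
                inp = torch.cat((inp, feat), dim=1)
            feat = torch.relu(self.encode[i](inp))
        return x - self.to_rain(feat)                  # subtract the predicted rain layer
```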
arXiv Detail & Related papers (2020-03-24T17:22:37Z)