Iterative Network for Image Super-Resolution
- URL: http://arxiv.org/abs/2005.09964v3
- Date: Wed, 5 Jan 2022 01:56:44 GMT
- Title: Iterative Network for Image Super-Resolution
- Authors: Yuqing Liu, Shiqi Wang, Jian Zhang, Shanshe Wang, Siwei Ma and Wen Gao
- Abstract summary: Single image super-resolution (SISR) has been greatly revitalized by the recent development of convolutional neural networks (CNN).
This paper offers a new perspective on the conventional SISR algorithm and proposes a substantially different approach relying on iterative optimization.
A novel iterative super-resolution network (ISRN) is built on top of this iterative optimization.
- Score: 69.07361550998318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Single image super-resolution (SISR), as a traditional ill-conditioned
inverse problem, has been greatly revitalized by the recent development of
convolutional neural networks (CNN). These CNN-based methods generally map a
low-resolution image to its corresponding high-resolution version with
sophisticated network structures and loss functions, showing impressive
performance. This paper offers a new perspective on the conventional SISR
algorithm and proposes a substantially different approach relying on iterative
optimization. A novel iterative super-resolution network (ISRN) is built on
top of this optimization scheme. We first analyze the observation model of the
image SR problem, inspiring a feasible solution by mimicking and fusing each
iteration in a more general and efficient manner. Considering the drawbacks of
batch normalization, we propose a feature normalization (F-Norm, FN) method to
regulate the features in the network. Furthermore, a novel block with FN,
termed FNB, is developed to improve the network representation. A
residual-in-residual structure is proposed to form a very deep network, which
groups FNBs with a long skip connection for better information delivery and
for stabilizing the training phase. Extensive experimental results on testing
benchmarks with bicubic (BI) degradation show that our ISRN not only recovers
more structural information but also achieves competitive or better PSNR/SSIM
results with far fewer parameters than other works. Besides BI, we
simulate the real-world degradation with blur-downscale (BD) and
downscale-noise (DN). ISRN and its extension ISRN+ both achieve better
performance than others with BD and DN degradation models.
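The iterative-optimization view of the SR observation model can be illustrated with a classic iterative back-projection sketch. This is not the paper's actual ISRN; the degradation operator here is a toy average-pool downscale, assumed purely for illustration:

```python
import numpy as np

def degrade(x, s=2):
    """Toy observation model D: average-pool downscale by factor s."""
    h, w = x.shape
    return x.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def upsample(y, s=2):
    """Simple back-projection operator U: nearest-neighbour upscale."""
    return np.repeat(np.repeat(y, s, axis=0), s, axis=1)

def iterative_sr(y, s=2, steps=20, lr=1.0):
    """Iterative refinement: x_{k+1} = x_k + lr * U(y - D(x_k)).
    ISRN mimics and fuses such iterations with learned operators instead."""
    x = upsample(y, s)                      # initial HR guess
    for _ in range(steps):
        residual = y - degrade(x, s)        # data-consistency error in LR space
        x = x + lr * upsample(residual, s)  # back-project the error
    return x

y = degrade(np.arange(16.0).reshape(4, 4))  # simulated 2x2 LR observation
x_hat = iterative_sr(y)                     # degrade(x_hat) matches y
```

Each iteration enforces data consistency with the observed LR image; a learned network replaces the hand-crafted operators in the paper's approach.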
Related papers
- Iterative Soft Shrinkage Learning for Efficient Image Super-Resolution [91.3781512926942]
Image super-resolution (SR) has witnessed extensive neural network designs from CNN to transformer architectures.
This work investigates the potential of network pruning for super-resolution to take advantage of off-the-shelf network designs and reduce the underlying computational overhead.
We propose a novel Iterative Soft Shrinkage-Percentage (ISS-P) method, optimizing the sparse structure of a randomly initialized network at each iteration and tweaking unimportant weights on-the-fly by a small amount proportional to the magnitude scale.
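The soft-shrinkage idea can be sketched generically; this is a simplified stand-in, not the exact ISS-P algorithm, and the percentage `p` and `shrink` factor are made-up illustration values:

```python
import numpy as np

def soft_shrink_step(w, p=0.2, shrink=0.1):
    """One soft-shrinkage step: instead of hard-zeroing the smallest
    p-fraction of weights, decay them gradually toward zero."""
    thresh = np.quantile(np.abs(w), p)  # magnitude percentile threshold
    w = w.copy()
    small = np.abs(w) < thresh          # unimportant weights this step
    w[small] *= (1.0 - shrink)          # small proportional tweak, on-the-fly
    return w

rng = np.random.default_rng(0)
w = rng.normal(size=1000)               # stand-in for a layer's weights
for _ in range(100):                    # interleaved with training in practice
    w = soft_shrink_step(w)
```

Unlike hard pruning, weights shrink smoothly, so a weight judged unimportant early can still recover through the interleaved training updates if later iterations rank it higher.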
arXiv Detail & Related papers (2023-03-16T21:06:13Z)
- RDRN: Recursively Defined Residual Network for Image Super-Resolution [58.64907136562178]
Deep convolutional neural networks (CNNs) have obtained remarkable performance in single image super-resolution.
We propose a novel network architecture which utilizes attention blocks efficiently.
arXiv Detail & Related papers (2022-11-17T11:06:29Z)
- Robust Deep Compressive Sensing with Recurrent-Residual Structural Constraints [0.0]
Existing deep compressive sensing (CS) methods either ignore adaptive online optimization or depend on costly iterative reconstruction.
This work explores a novel image CS framework with recurrent-residual structural constraint, termed as R$2$CS-NET.
As the first deep CS framework to efficiently bridge adaptive online optimization and deep learning, the R$2$CS-NET integrates the robustness of online optimization with the efficiency and nonlinear capacity of deep learning methods.
arXiv Detail & Related papers (2022-07-15T05:56:13Z)
- Image Superresolution using Scale-Recurrent Dense Network [30.75380029218373]
Recent advances in the design of convolutional neural network (CNN) have yielded significant improvements in the performance of image super-resolution (SR)
We propose a scale-recurrent SR architecture built upon units containing a series of dense connections within a residual block (Residual Dense Blocks (RDBs))
Our scale recurrent design delivers competitive performance for higher scale factors while being parametrically more efficient as compared to current state-of-the-art approaches.
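A Residual Dense Block can be sketched as dense concatenation plus a local residual connection. The per-channel linear map below is a toy stand-in for the convolutions, and the channel counts and growth rate are illustrative assumptions:

```python
import numpy as np

def dense_layer(x, w):
    """Stand-in for a conv layer: linear map + ReLU (illustration only)."""
    return np.maximum(0.0, x @ w)

def residual_dense_block(x, weights):
    """RDB sketch: each layer sees the concatenation of all previous
    feature maps; a final fusion layer is added back to the input."""
    feats = [x]
    for w in weights[:-1]:
        inp = np.concatenate(feats, axis=-1)              # dense connectivity
        feats.append(dense_layer(inp, w))
    fused = np.concatenate(feats, axis=-1) @ weights[-1]  # local feature fusion
    return x + fused                                      # local residual learning

rng = np.random.default_rng(1)
c, g = 4, 2                                  # channels and growth rate (toy values)
x = rng.normal(size=(8, c))
weights = [rng.normal(size=(c, g)),          # layer 1
           rng.normal(size=(c + g, g)),      # layer 2 sees x + layer 1
           rng.normal(size=(c + 2 * g, c))]  # fusion back to c channels
y = residual_dense_block(x, weights)         # same shape as x
```

The dense concatenation lets later layers reuse all earlier features, while the residual connection keeps the block easy to stack into deep, scale-recurrent architectures.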
arXiv Detail & Related papers (2022-01-28T09:18:43Z)
- DeepRLS: A Recurrent Network Architecture with Least Squares Implicit Layers for Non-blind Image Deconvolution [15.986942312624]
We study the problem of non-blind image deconvolution.
We propose a novel recurrent network architecture that leads to very competitive restoration results of high image quality.
arXiv Detail & Related papers (2021-12-10T13:16:51Z)
- Over-and-Under Complete Convolutional RNN for MRI Reconstruction [57.95363471940937]
Recent deep learning-based methods for MR image reconstruction usually leverage a generic auto-encoder architecture.
We propose an Over-and-Under Complete Convolutional Recurrent Neural Network (OUCR), which consists of an overcomplete and an undercomplete Convolutional Recurrent Neural Network (CRNN)
The proposed method achieves significant improvements over compressed sensing and popular deep learning-based methods with fewer trainable parameters.
arXiv Detail & Related papers (2021-06-16T15:56:34Z)
- Feedback Pyramid Attention Networks for Single Image Super-Resolution [37.58180059860872]
We propose feedback pyramid attention networks (FPAN) to fully exploit the mutual dependencies of features.
In our method, the output of each layer in the first stage is also used as the input of the corresponding layer in the next stage to re-update the previous low-level filters.
We introduce a pyramid non-local structure to model global contextual information in different scales and improve the discriminative representation of the network.
arXiv Detail & Related papers (2021-06-13T11:32:53Z)
- Image Super-Resolution with Cross-Scale Non-Local Attention and Exhaustive Self-Exemplars Mining [66.82470461139376]
We propose the first Cross-Scale Non-Local (CS-NL) attention module with integration into a recurrent neural network.
By combining the new CS-NL prior with local and in-scale non-local priors in a powerful recurrent fusion cell, we can find more cross-scale feature correlations within a single low-resolution image.
arXiv Detail & Related papers (2020-06-02T07:08:58Z)
- Deep Adaptive Inference Networks for Single Image Super-Resolution [72.7304455761067]
Single image super-resolution (SISR) has witnessed tremendous progress in recent years owing to the deployment of deep convolutional neural networks (CNNs)
In this paper, we take a step forward to address this issue by leveraging the adaptive inference networks for deep SISR (AdaDSR)
Our AdaDSR involves an SISR model as backbone and a lightweight adapter module which takes image features and resource constraint as input and predicts a map of local network depth.
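The depth-map idea can be sketched as masked residual updates. This is a loose illustration, not AdaDSR itself: the depth map is given directly here (in the paper it is predicted by the adapter module), and the "blocks" are toy functions:

```python
import numpy as np

def adaptive_depth_forward(x, blocks, depth_map):
    """Per-pixel adaptive depth: the depth map gates how many residual
    blocks contribute at each spatial location."""
    out = x.copy()
    for d, block in enumerate(blocks):
        mask = (depth_map > d).astype(x.dtype)  # 1 where block d is still active
        out = out + mask * block(out)           # masked residual update
    return out

blocks = [lambda t: 0.1 * np.ones_like(t)] * 3  # toy residual blocks
x = np.ones((2, 2))
depth = np.array([[0, 3], [1, 2]])              # desired local depth per pixel
out = adaptive_depth_forward(x, blocks, depth)
# pixels with depth 0 pass through unchanged; deeper pixels get more updates
```

Locations assigned a small depth skip most blocks, which is how such designs trade reconstruction quality against the compute budget.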
arXiv Detail & Related papers (2020-04-08T10:08:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.