Region-Adaptive Deformable Network for Image Quality Assessment
- URL: http://arxiv.org/abs/2104.11599v1
- Date: Fri, 23 Apr 2021 13:47:20 GMT
- Title: Region-Adaptive Deformable Network for Image Quality Assessment
- Authors: Shuwei Shi, Qingyan Bai, Mingdeng Cao, Weihao Xia, Jiahao Wang, Yifan
Chen, Yujiu Yang
- Abstract summary: In image restoration and enhancement tasks, images generated by generative adversarial networks (GANs) can achieve better visual performance than traditional CNN-generated images.
We propose the reference-oriented deformable convolution, which can improve the performance of an IQA network on GAN-based distortion.
Experiment results on the NTIRE 2021 Perceptual Image Quality Assessment Challenge dataset show the superior performance of RADN.
- Score: 16.03642709194366
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image quality assessment (IQA) aims to assess the perceptual quality of
images. The outputs of the IQA algorithms are expected to be consistent with
human subjective perception. In image restoration and enhancement tasks, images
generated by generative adversarial networks (GANs) can achieve better visual
performance than traditional CNN-generated images, although they exhibit spatial
shifts and texture noise. Unfortunately, existing IQA methods perform
unsatisfactorily on GAN-based distortions, partially because of
their low tolerance to spatial misalignment. To this end, we propose the
reference-oriented deformable convolution, which can improve the performance of
an IQA network on GAN-based distortion by adaptively considering this
misalignment. We further propose a patch-level attention module to enhance the
interaction among different patch regions, which are processed independently in
previous patch-based methods. We also apply modifications to the classic
residual block to construct a patch-region-based baseline called WResNet.
Equipping this baseline with the two proposed modules, we further propose the
Region-Adaptive Deformable Network (RADN). The experimental results on the
NTIRE 2021 Perceptual Image Quality
Assessment Challenge dataset show the superior performance of RADN, and the
ensemble approach won fourth place in the final testing phase of the challenge.
Code is available at https://github.com/IIGROUP/RADN.
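To make the idea of the reference-oriented deformable convolution concrete, here is a minimal sketch (not the released implementation): it assumes the sampling offsets are predicted from the concatenation of reference and distorted feature maps, and that the deformable convolution is then applied to the distorted features; the exact design should be checked against the paper and the linked repository.

# Hedged sketch of a reference-oriented deformable convolution block.
# Assumptions (not taken from the paper): offsets come from ref+dist features,
# and the deformable convolution resamples the distorted features so that
# sampling can adapt to the spatial misalignment of GAN-restored images.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class RefOrientedDeformBlock(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # Predicts 2 offsets (dx, dy) per kernel location from concatenated features.
        self.offset_pred = nn.Conv2d(
            2 * channels, 2 * kernel_size * kernel_size, kernel_size, padding=pad
        )
        # Deformable convolution applied to the distorted features.
        self.deform = DeformConv2d(channels, channels, kernel_size, padding=pad)

    def forward(self, ref_feat: torch.Tensor, dist_feat: torch.Tensor) -> torch.Tensor:
        offsets = self.offset_pred(torch.cat([ref_feat, dist_feat], dim=1))
        return self.deform(dist_feat, offsets)

# Toy usage with hypothetical feature maps of shape (B, C, H, W).
block = RefOrientedDeformBlock(channels=64)
ref = torch.randn(2, 64, 28, 28)
dist = torch.randn(2, 64, 28, 28)
aligned = block(ref, dist)  # (2, 64, 28, 28)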
Related papers
- Contrastive Pre-Training with Multi-View Fusion for No-Reference Point Cloud Quality Assessment [49.36799270585947]
No-reference point cloud quality assessment (NR-PCQA) aims to automatically evaluate the perceptual quality of distorted point clouds without available reference.
We propose a novel contrastive pre-training framework tailored for PCQA (CoPA).
Our method outperforms the state-of-the-art PCQA methods on popular benchmarks.
arXiv Detail & Related papers (2024-03-15T07:16:07Z)
- DGNet: Dynamic Gradient-Guided Network for Water-Related Optics Image Enhancement [77.0360085530701]
Underwater image enhancement (UIE) is a challenging task due to the complex degradation caused by underwater environments.
Previous methods often idealize the degradation process, and neglect the impact of medium noise and object motion on the distribution of image features.
Our approach utilizes predicted images to dynamically update pseudo-labels, adding a dynamic gradient to optimize the network's gradient space.
arXiv Detail & Related papers (2023-12-12T06:07:21Z)
- Transformer-based No-Reference Image Quality Assessment via Supervised Contrastive Learning [36.695247860715874]
We propose SaTQA, a novel NR-IQA model based on Supervised Contrastive Learning (SCL) and Transformers.
We first train a model on a large-scale synthetic dataset via SCL to extract degradation features of images with various distortion types and levels.
To further extract distortion information from images, we propose a backbone network incorporating the Multi-Stream Block (MSB) by combining the CNN inductive bias and Transformer long-term dependence modeling capability.
Experimental results on seven standard IQA datasets show that SaTQA outperforms state-of-the-art methods on both synthetic and authentic datasets.
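As a rough illustration of the supervised contrastive pre-training idea (a generic SupCon-style loss, not necessarily the exact objective used by SaTQA), distortion type/level indices can serve as the labels that define positive pairs:

# Generic supervised contrastive (SupCon-style) loss sketch; the batch
# construction, projection head, and exact loss in SaTQA are assumptions here.
import torch
import torch.nn.functional as F

def supcon_loss(features: torch.Tensor, labels: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """features: (N, D) embeddings; labels: (N,) assumed distortion type/level ids."""
    features = F.normalize(features, dim=1)
    sim = features @ features.t() / tau                      # (N, N) scaled similarities
    mask_pos = (labels[:, None] == labels[None, :]).float()  # positives share a label
    mask_self = torch.eye(len(labels), device=features.device)
    mask_pos = mask_pos - mask_self                          # exclude self-pairs
    logits = sim - 1e9 * mask_self                           # mask out self-similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_count = mask_pos.sum(dim=1).clamp(min=1)
    loss = -(mask_pos * log_prob).sum(dim=1) / pos_count     # per-anchor positive log-likelihood
    return loss[mask_pos.sum(dim=1) > 0].mean()              # anchors without positives are skipped

# e.g. feats = encoder(images); loss = supcon_loss(feats, distortion_ids)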
arXiv Detail & Related papers (2023-12-12T06:01:41Z)
- Attentions Help CNNs See Better: Attention-based Hybrid Image Quality Assessment Network [20.835800149919145]
Image quality assessment (IQA) algorithms aim to quantify the human perception of image quality.
There is a performance drop when assessing distorted images with seemingly realistic textures generated by generative adversarial networks (GANs).
We propose an Attention-based Hybrid Image Quality Assessment Network (AHIQ) to deal with the challenge and get better performance on the GAN-based IQA task.
arXiv Detail & Related papers (2022-04-22T03:59:18Z)
- MANIQA: Multi-dimension Attention Network for No-Reference Image Quality Assessment [18.637040004248796]
No-Reference Image Quality Assessment (NR-IQA) aims to assess the perceptual quality of images in accordance with human subjective perception.
Existing NR-IQA methods are far from meeting the need for accurate quality prediction on images with GAN-based distortions.
We propose the Multi-dimension Attention Network for no-reference Image Quality Assessment (MANIQA) to improve performance on GAN-based distortion.
arXiv Detail & Related papers (2022-04-19T15:56:43Z)
- Learning Transformer Features for Image Quality Assessment [53.51379676690971]
We propose a unified IQA framework that utilizes CNN backbone and transformer encoder to extract features.
The proposed framework is compatible with both FR and NR modes and allows for a joint training scheme.
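A minimal sketch of this general pattern, assuming a no-reference setup with arbitrary module sizes (the authors' architecture, and its full-reference mode that also consumes reference features, will differ):

# Assumed CNN-backbone + transformer-encoder IQA skeleton, not the paper's model.
import torch
import torch.nn as nn
import torchvision

class CNNTransformerIQA(nn.Module):
    def __init__(self, d_model: int = 256, n_layers: int = 2, n_heads: int = 4):
        super().__init__()
        backbone = torchvision.models.resnet18(weights=None)
        self.cnn = nn.Sequential(*list(backbone.children())[:-2])  # keep the spatial feature map
        self.proj = nn.Conv2d(512, d_model, kernel_size=1)
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        feat = self.proj(self.cnn(img))            # (B, d_model, H', W')
        tokens = feat.flatten(2).transpose(1, 2)   # (B, H'*W', d_model) spatial tokens
        tokens = self.encoder(tokens)              # refined by the transformer encoder
        return self.head(tokens.mean(dim=1))       # (B, 1) predicted quality score

score = CNNTransformerIQA()(torch.randn(1, 3, 224, 224))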
arXiv Detail & Related papers (2021-12-01T13:23:00Z)
- Image Quality Assessment using Contrastive Learning [50.265638572116984]
We train a deep Convolutional Neural Network (CNN) using a contrastive pairwise objective to solve an auxiliary task.
We show through extensive experiments that CONTRIQUE achieves competitive performance when compared to state-of-the-art NR image quality models.
Our results suggest that powerful quality representations with perceptual relevance can be obtained without requiring large labeled subjective image quality datasets.
arXiv Detail & Related papers (2021-10-25T21:01:00Z)
- (ASNA) An Attention-based Siamese-Difference Neural Network with Surrogate Ranking Loss function for Perceptual Image Quality Assessment [0.0]
Deep convolutional neural networks (DCNNs) that leverage the adversarial training framework for image restoration and enhancement have significantly improved the sharpness of the processed images.
It is necessary to develop a quantitative metric that reflects their performance and is well aligned with the perceived quality of an image.
This paper proposes a convolutional neural network that extends the traditional Siamese architecture.
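A bare-bones Siamese-difference arrangement of this kind might look as follows; the attention modules and surrogate ranking loss from the paper are omitted, and all layer choices here are assumptions:

# Minimal Siamese-difference IQA sketch: a weight-sharing CNN encodes the
# reference and distorted images, and their feature difference is regressed
# to a quality score. Details are illustrative, not from the paper.
import torch
import torch.nn as nn

class SiameseDifferenceIQA(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(          # shared weights for both branches
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.regressor = nn.Linear(64, 1)

    def forward(self, ref: torch.Tensor, dist: torch.Tensor) -> torch.Tensor:
        diff = self.encoder(ref) - self.encoder(dist)   # Siamese feature difference
        return self.regressor(diff)                     # predicted quality score

score = SiameseDifferenceIQA()(torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64))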
arXiv Detail & Related papers (2021-05-06T09:04:21Z)
- Unpaired Image Enhancement with Quality-Attention Generative Adversarial Network [92.01145655155374]
We propose a quality attention generative adversarial network (QAGAN) trained on unpaired data.
The key novelty of the proposed QAGAN lies in the quality attention module (QAM) injected into the generator.
Our proposed method achieves better performance in both objective and subjective evaluations.
arXiv Detail & Related papers (2020-12-30T05:57:20Z)
- Towards Unsupervised Deep Image Enhancement with Generative Adversarial Network [92.01145655155374]
We present an unsupervised image enhancement generative network (UEGAN).
It learns the corresponding image-to-image mapping from a set of images with desired characteristics in an unsupervised manner.
Results show that the proposed model effectively improves the aesthetic quality of images.
arXiv Detail & Related papers (2020-12-30T03:22:46Z)
- Image Quality Assessment for Perceptual Image Restoration: A New Dataset, Benchmark and Metric [19.855042248822738]
Image quality assessment (IQA) is the key factor for the fast development of image restoration (IR) algorithms.
Recent IR algorithms based on generative adversarial networks (GANs) have brought in significant improvement on visual performance.
We present two questions: Can existing IQA methods objectively evaluate recent IR algorithms?
arXiv Detail & Related papers (2020-11-30T17:06:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of the information and is not responsible for any consequences arising from its use.