BAND-2k: Banding Artifact Noticeable Database for Banding Detection and
Quality Assessment
- URL: http://arxiv.org/abs/2311.17752v1
- Date: Wed, 29 Nov 2023 15:56:31 GMT
- Title: BAND-2k: Banding Artifact Noticeable Database for Banding Detection and
Quality Assessment
- Authors: Zijian Chen, Wei Sun, Jun Jia, Fangfang Lu, Zicheng Zhang, Jing Liu,
Ru Huang, Xiongkuo Min, Guangtao Zhai
- Abstract summary: Banding, also known as staircase-like contouring, frequently occurs in flat areas of images and videos processed by compression or quantization algorithms.
We build the largest banding IQA database so far, named Banding Artifact Noticeable Database (BAND-2k), which consists of 2,000 banding images.
A dual convolutional neural network is employed to concurrently learn the feature representation from the high-frequency and low-frequency maps.
- Score: 52.1640725073183
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Banding, also known as staircase-like contouring, frequently occurs
in flat areas of images and videos processed by compression or quantization
algorithms. As an undesirable artifact, banding destroys the original image
structure, thus degrading users' quality of experience (QoE). In this paper, we systematically
investigate the banding image quality assessment (IQA) problem, aiming to
detect the image banding artifacts and evaluate their perceptual visual
quality. Considering that existing image banding databases contain only limited
content sources and banding generation methods, and lack perceptual quality
labels (i.e., mean opinion scores), we first build the largest banding IQA
database to date, named the Banding Artifact Noticeable Database (BAND-2k),
which consists of 2,000 banding images generated by 15 compression and
quantization schemes. A total of 23 workers participated in the subjective IQA
experiment, yielding over 214,000 patch-level banding class labels and 44,371
reliable image-level quality ratings. Subsequently, we develop an effective
no-reference (NR) banding evaluator for banding detection and quality
assessment by leveraging frequency characteristics of banding artifacts. A dual
convolutional neural network is employed to concurrently learn the feature
representation from the high-frequency and low-frequency maps, thereby
enhancing the ability to discern banding artifacts. The quality score of a
banding image is generated by pooling the banding detection maps masked by the
spatial frequency filters. Experiments demonstrate that our banding evaluator
achieves a remarkably high accuracy in banding detection and also exhibits high
SRCC and PLCC results with the perceptual quality labels. These findings unveil
the strong correlations between the intensity of banding artifacts and the
perceptual visual quality, thus validating the necessity of banding quality
assessment.
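
The abstract describes the evaluator only at a high level: a dual CNN over high- and low-frequency maps, with the resulting detection maps pooled into a quality score. Below is a minimal PyTorch sketch of that dual-branch idea, not the authors' implementation; the Gaussian low-pass decomposition, the module names (split_frequency, BandingBranch, DualBranchBandingNet), the layer sizes, and the plain average pooling (used here in place of the paper's spatial-frequency-filter masking) are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of the dual-branch idea from the
# abstract: split an image into low- and high-frequency maps, encode each with its own
# small CNN, fuse the features into a patch-level banding detection map, and pool that
# map into a single quality score. All names and layer sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


def split_frequency(x: torch.Tensor, kernel_size: int = 11, sigma: float = 3.0):
    """Decompose an image batch into low- and high-frequency maps via Gaussian blur."""
    coords = torch.arange(kernel_size, dtype=x.dtype, device=x.device) - kernel_size // 2
    g = torch.exp(-(coords ** 2) / (2 * sigma ** 2))
    g = g / g.sum()
    kernel = (g[:, None] * g[None, :]).expand(x.shape[1], 1, kernel_size, kernel_size).contiguous()
    low = F.conv2d(x, kernel, padding=kernel_size // 2, groups=x.shape[1])
    high = x - low  # the residual keeps edges and staircase-like contours
    return low, high


class BandingBranch(nn.Module):
    """Small convolutional encoder for one frequency band (illustrative)."""
    def __init__(self, in_ch: int = 3, feat: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)


class DualBranchBandingNet(nn.Module):
    """Dual CNN: one branch per frequency band, fused into a banding detection map."""
    def __init__(self):
        super().__init__()
        self.low_branch = BandingBranch()
        self.high_branch = BandingBranch()
        self.head = nn.Conv2d(64, 1, 1)  # per-location banding probability logits

    def forward(self, x):
        low, high = split_frequency(x)
        feats = torch.cat([self.low_branch(low), self.high_branch(high)], dim=1)
        detection_map = torch.sigmoid(self.head(feats))          # patch-level banding map
        quality_score = 1.0 - detection_map.mean(dim=(1, 2, 3))  # simple average pooling
        return detection_map, quality_score


if __name__ == "__main__":
    model = DualBranchBandingNet()
    image = torch.rand(2, 3, 256, 256)  # dummy batch of RGB images in [0, 1]
    det_map, score = model(image)
    print(det_map.shape, score.shape)   # torch.Size([2, 1, 64, 64]) torch.Size([2])
```

The separate encoding of the two bands reflects the motivation stated in the abstract: staircase contours concentrate in the high-frequency residual of otherwise flat regions, so keeping the bands apart before fusion makes the artifact easier to discern.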
Related papers
- Fine-grained subjective visual quality assessment for high-fidelity compressed images [4.787528476079247]
The JPEG standardization project AIC is developing a subjective image quality assessment methodology for high-fidelity images.
This paper presents the proposed assessment methods, a dataset of high-quality compressed images, and their corresponding crowdsourced visual quality ratings.
It also outlines a data analysis approach that reconstructs quality scale values in just noticeable difference (JND) units.
arXiv Detail & Related papers (2024-10-12T11:37:19Z)
- FS-BAND: A Frequency-Sensitive Banding Detector [55.59101150019851]
Banding artifact, also known as staircase-like contouring, is a common quality annoyance that arises during compression, transmission, etc.
We propose a no-reference banding detection model to capture and evaluate banding artifacts, called the Frequency-Sensitive BANding Detector (FS-BAND).
Experimental results show that the proposed FS-BAND method outperforms state-of-the-art image quality assessment (IQA) approaches with higher accuracy in the banding classification task.
arXiv Detail & Related papers (2023-11-30T03:20:42Z)
- Blind Image Quality Assessment Using Multi-Stream Architecture with Spatial and Channel Attention [4.983104446206061]
BIQA (Blind Image Quality Assessment) is an important field of study that evaluates image quality automatically.
Most algorithms predict quality without emphasizing important regions of interest.
A multi-stream algorithm with spatial and channel attention is proposed to solve this problem.
arXiv Detail & Related papers (2023-07-19T09:36:08Z)
- Expert-Agnostic Ultrasound Image Quality Assessment using Deep Variational Clustering [0.03262230127283451]
Ultrasound images are low in quality and suffer from noisy annotations caused by inter-observer variations.
We propose an UnSupervised UltraSound image Quality assessment Network, US2QNet, that eliminates the burden and uncertainty of manual annotations.
The proposed framework achieved 78% accuracy and superior performance to state-of-the-art clustering methods.
arXiv Detail & Related papers (2023-07-05T17:34:58Z)
- Explainable Image Quality Assessment for Medical Imaging [0.0]
Poor-quality medical images may lead to misdiagnosis.
We propose an explainable image quality assessment system and validate our idea on two different objectives.
We apply a variety of techniques to measure the faithfulness of the saliency detectors.
We show that NormGrad has significant gains over other saliency detectors by reaching a repeated Pointing Game score of 0.853 for Object-CXR and 0.611 for LVOT datasets.
arXiv Detail & Related papers (2023-03-25T14:18:39Z)
- A Comparative Study of Fingerprint Image-Quality Estimation Methods [54.84936551037727]
Poor-quality images result in spurious and missing features, thus degrading the performance of the overall system.
In this work, we review existing approaches for fingerprint image-quality estimation.
We have also tested a selection of fingerprint image-quality estimation algorithms.
arXiv Detail & Related papers (2021-11-14T19:53:12Z)
- Image Quality Assessment using Contrastive Learning [50.265638572116984]
We train a deep Convolutional Neural Network (CNN) using a contrastive pairwise objective to solve the auxiliary problem.
We show through extensive experiments that CONTRIQUE achieves competitive performance when compared to state-of-the-art NR image quality models.
Our results suggest that powerful quality representations with perceptual relevance can be obtained without requiring large labeled subjective image quality datasets.
arXiv Detail & Related papers (2021-10-25T21:01:00Z)
- Improving Medical Image Classification with Label Noise Using Dual-uncertainty Estimation [72.0276067144762]
We discuss and define the two common types of label noise in medical images.
We propose an uncertainty estimation-based framework to handle these two types of label noise in the medical image classification task.
arXiv Detail & Related papers (2021-02-28T14:56:45Z)
- BBAND Index: A No-Reference Banding Artifact Predictor [55.42929350861115]
Banding artifact, or false contouring, is a common video compression impairment.
We propose a new distortion-specific no-reference video quality model for predicting banding artifacts, called the Blind BANding Detector (BBAND index).
arXiv Detail & Related papers (2020-02-27T03:05:26Z)