No-Reference Quality Assessment for 360-degree Images by Analysis of
Multi-frequency Information and Local-global Naturalness
- URL: http://arxiv.org/abs/2102.11393v1
- Date: Mon, 22 Feb 2021 22:52:35 GMT
- Title: No-Reference Quality Assessment for 360-degree Images by Analysis of
Multi-frequency Information and Local-global Naturalness
- Authors: Wei Zhou, Jiahua Xu, Qiuping Jiang, Zhibo Chen
- Abstract summary: 360-degree/omnidirectional images (OIs) have attracted remarkable attention due to the increasing applications of virtual reality (VR).
We propose a novel and effective no-reference omnidirectional image quality assessment (NR OIQA) algorithm by Multi-Frequency Information and Local-Global Naturalness (MFILGN).
Experimental results on two publicly available OIQA databases demonstrate that our proposed MFILGN outperforms state-of-the-art approaches.
- Score: 26.614657212889398
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 360-degree/omnidirectional images (OIs) have attracted remarkable
attention due to the increasing applications of virtual reality (VR). Compared
to conventional 2D images, OIs can provide a more immersive experience to
consumers, benefiting from their higher resolution and plentiful fields of view
(FoVs). Moreover, OIs are usually viewed in a head-mounted display (HMD)
without references. Therefore, an efficient blind quality assessment method, which is
specifically designed for 360-degree images, is urgently desired. In this
paper, motivated by the characteristics of the human visual system (HVS) and
the viewing process of VR visual contents, we propose a novel and effective
no-reference omnidirectional image quality assessment (NR OIQA) algorithm by
Multi-Frequency Information and Local-Global Naturalness (MFILGN).
Specifically, inspired by the frequency-dependent property of the visual
cortex, we first decompose the equirectangular projection (ERP) maps into
wavelet subbands. Then, the entropy intensities of the low- and high-frequency
subbands are exploited to measure the multi-frequency information of OIs. In
addition to the global naturalness of ERP maps, and owing to the browsed FoVs,
we extract natural scene statistics (NSS) features from each viewport image as
a measure of local naturalness. With the proposed
multi-frequency information measurement and local-global naturalness
measurement, we utilize support vector regression as the final image quality
regressor to train the quality evaluation model from visual quality-related
features to human ratings. To our knowledge, the proposed model is the first
no-reference quality assessment method for 360-degree images that combines
multi-frequency information and image naturalness. Experimental results on two
publicly available OIQA databases demonstrate that our proposed MFILGN
outperforms state-of-the-art approaches.
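As a rough illustration of the pipeline described above, the Python sketch below computes wavelet-subband entropies for the multi-frequency part, BRISQUE-style MSCN statistics for global (ERP) and per-viewport (local) naturalness, and trains a support vector regressor that maps the concatenated features to human ratings. This is a minimal sketch under stated assumptions, not the authors' implementation: the wavelet ('db2'), the decomposition level, the MSCN summary statistics, the simplified equator-crop viewport sampling, and the SVR hyperparameters are all illustrative choices.

```python
# Minimal MFILGN-style feature sketch (illustrative assumptions, not the paper's code).
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from scipy.ndimage import gaussian_filter
from sklearn.svm import SVR

def subband_entropy(coeffs, bins=256):
    """Shannon entropy of a wavelet subband's coefficient histogram."""
    hist, _ = np.histogram(coeffs.ravel(), bins=bins, density=True)
    hist = hist[hist > 0]
    hist = hist / hist.sum()
    return float(-np.sum(hist * np.log2(hist)))

def multi_frequency_features(erp_gray, wavelet="db2", levels=3):
    """Entropy intensities of low- and high-frequency wavelet subbands of an ERP map."""
    coeffs = pywt.wavedec2(erp_gray, wavelet, level=levels)
    feats = [subband_entropy(coeffs[0])]             # low-frequency approximation
    for (cH, cV, cD) in coeffs[1:]:                  # high-frequency detail subbands
        feats += [subband_entropy(c) for c in (cH, cV, cD)]
    return feats

def mscn(image, sigma=7.0 / 6.0, eps=1e-6):
    """Mean-subtracted contrast-normalized (MSCN) coefficients, BRISQUE-style."""
    mu = gaussian_filter(image, sigma)
    var = gaussian_filter(image * image, sigma) - mu * mu
    return (image - mu) / (np.sqrt(np.abs(var)) + eps)

def naturalness_features(image):
    """Simple NSS summary statistics of MSCN coefficients (illustrative stand-in)."""
    m = mscn(image.astype(np.float64))
    return [m.mean(), m.std(), skew(m.ravel()), kurtosis(m.ravel())]

def crop_viewports(erp_gray, n_lon=6, size=256):
    """Hypothetical viewport sampler: equally spaced crops along the ERP equator.
    A faithful implementation would project FoVs onto the sphere (gnomonic projection)."""
    h, w = erp_gray.shape
    top = max(h // 2 - size // 2, 0)
    for k in range(n_lon):
        left = int(k * w / n_lon)
        yield erp_gray[top:top + size, left:left + size]

def mfilgn_features(erp_gray):
    """Concatenate multi-frequency and local-global naturalness features (2D grayscale input)."""
    feats = multi_frequency_features(erp_gray)
    feats += naturalness_features(erp_gray)           # global naturalness of the ERP map
    for vp in crop_viewports(erp_gray):               # local naturalness of each viewport
        feats += naturalness_features(vp)
    return np.asarray(feats)

def train_quality_model(train_images, train_mos):
    """Fit the SVR quality regressor; train_images/train_mos are assumed to come from an OIQA database."""
    X = np.stack([mfilgn_features(img) for img in train_images])
    model = SVR(kernel="rbf", C=10.0, gamma="scale")
    model.fit(X, train_mos)
    return model
```

The crop-based sampler above only stands in for the browsed-FoV viewport extraction described in the abstract, and the four MSCN statistics stand in for a fuller NSS feature set; both would need to be replaced to reproduce the reported results.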
Related papers
- Perceptual Depth Quality Assessment of Stereoscopic Omnidirectional Images [10.382801621282228]
We develop an objective quality assessment model named depth quality index (DQI) for efficient no-reference (NR) depth quality assessment of stereoscopic omnidirectional images.
Motivated by the perceptual characteristics of the human visual system (HVS), the proposed DQI is built upon multi-color-channel, adaptive viewport selection, and interocular discrepancy features.
arXiv Detail & Related papers (2024-08-19T16:28:05Z)
- Non-Reference Quality Assessment for Medical Imaging: Application to Synthetic Brain MRIs [0.0]
This study introduces a novel deep learning-based non-reference approach to assess brain MRI quality by training a 3D ResNet.
The network is designed to estimate quality across six distinct artifacts commonly encountered in MRI scans.
Results demonstrate superior performance in accurately estimating distortions and reflecting image quality from multiple perspectives.
arXiv Detail & Related papers (2024-07-20T22:05:30Z)
- Opinion-Unaware Blind Image Quality Assessment using Multi-Scale Deep Feature Statistics [54.08757792080732]
We propose integrating deep features from pre-trained visual models with a statistical analysis model to achieve opinion-unaware BIQA (OU-BIQA).
Our proposed model exhibits superior consistency with human visual perception compared to state-of-the-art BIQA models.
arXiv Detail & Related papers (2024-05-29T06:09:34Z)
- Diffusion Model Based Visual Compensation Guidance and Visual Difference Analysis for No-Reference Image Quality Assessment [82.13830107682232]
We propose a novel class of state-of-the-art (SOTA) generative model, which exhibits the capability to model intricate relationships.
We devise a new diffusion restoration network that leverages the produced enhanced image and noise-containing images.
Two visual evaluation branches are designed to comprehensively analyze the obtained high-level feature information.
arXiv Detail & Related papers (2024-02-22T09:39:46Z)
- Assessor360: Multi-sequence Network for Blind Omnidirectional Image Quality Assessment [50.82681686110528]
Blind Omnidirectional Image Quality Assessment (BOIQA) aims to objectively assess the human perceptual quality of omnidirectional images (ODIs).
The quality assessment of ODIs is severely hampered by the fact that the existing BOIQA pipeline lacks the modeling of the observer's browsing process.
We propose a novel multi-sequence network for BOIQA called Assessor360, which is derived from the realistic multi-assessor ODI quality assessment procedure.
arXiv Detail & Related papers (2023-05-18T13:55:28Z)
- ST360IQ: No-Reference Omnidirectional Image Quality Assessment with Spherical Vision Transformers [17.48330099000856]
We present a method for no-reference 360 image quality assessment.
Our approach predicts the quality of an omnidirectional image correlated with the human-perceived image quality.
arXiv Detail & Related papers (2023-03-13T07:48:46Z)
- High Dynamic Range Image Quality Assessment Based on Frequency Disparity [78.36555631446448]
An image quality assessment (IQA) algorithm based on frequency disparity for high dynamic range (HDR) images is proposed.
The proposed LGFM can provide a higher consistency with the subjective perception compared with the state-of-the-art HDR IQA methods.
arXiv Detail & Related papers (2022-09-06T08:22:13Z)
- Blind Quality Assessment of 3D Dense Point Clouds with Structure Guided Resampling [71.68672977990403]
We propose an objective point cloud quality index with Structure Guided Resampling (SGR) to automatically evaluate the perceptually visual quality of 3D dense point clouds.
The proposed SGR is a general-purpose blind quality assessment method without the assistance of any reference information.
arXiv Detail & Related papers (2022-08-31T02:42:55Z)
- Perceptual Quality Assessment of Omnidirectional Images as Moving Camera Videos [49.217528156417906]
Two types of VR viewing conditions are crucial in determining the viewing behaviors of users and the perceived quality of the panorama.
We first transform an omnidirectional image to several video representations using different user viewing behaviors under different viewing conditions.
We then leverage advanced 2D full-reference video quality models to compute the perceived quality.
arXiv Detail & Related papers (2020-05-21T10:03:40Z)