Perceptual Quality Assessment of Omnidirectional Images
- URL: http://arxiv.org/abs/2207.02674v1
- Date: Wed, 6 Jul 2022 13:40:38 GMT
- Title: Perceptual Quality Assessment of Omnidirectional Images
- Authors: Huiyu Duan, Guangtao Zhai, Xiongkuo Min, Yucheng Zhu, Yi Fang,
Xiaokang Yang
- Abstract summary: We first establish an omnidirectional IQA (OIQA) database, which includes 16 source images and 320 distorted images degraded by 4 commonly encountered distortion types.
Then a subjective quality evaluation study is conducted on the OIQA database in the VR environment.
The original and distorted omnidirectional images, subjective quality ratings, and the head and eye movement data together constitute the OIQA database.
- Score: 81.76416696753947
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Omnidirectional images and videos can provide an immersive experience of
real-world scenes in Virtual Reality (VR) environments. In this paper we present a
perceptual omnidirectional image quality assessment (IQA) study, since providing a
good quality of experience in the VR environment is extremely important. We first
establish an omnidirectional IQA (OIQA) database, which includes 16 source images
and 320 distorted images degraded by 4 commonly encountered distortion types,
namely JPEG compression, JPEG2000 compression, Gaussian blur and Gaussian noise. A
subjective quality evaluation study is then conducted on the OIQA database in the
VR environment. Considering that humans can only see part of the scene at any one
time in the VR environment, visual attention becomes extremely important, so we
also track head and eye movement data during the quality rating experiments. The
original and distorted omnidirectional images, the subjective quality ratings, and
the head and eye movement data together constitute the OIQA database.
State-of-the-art full-reference (FR) IQA measures are tested on the OIQA database,
and several new observations that differ from traditional IQA are made.
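Below is a minimal sketch of how such a full-reference benchmark is typically run: an FR measure is computed for each reference/distorted pair and its predictions are correlated with the subjective ratings. The directory layout, the mos.csv format, and the use of SSIM/PSNR as stand-ins for the state-of-the-art FR measures are assumptions for illustration, not the paper's actual protocol.

```python
# Illustrative sketch (not the authors' code): score each reference/distorted
# pair with a full-reference IQA measure and correlate the predictions with
# the subjective ratings (MOS), the standard way FR metrics are benchmarked
# on a database such as OIQA.
#
# Assumed (hypothetical) layout: equirectangular PNGs under OIQA/reference
# and OIQA/distorted, plus OIQA/mos.csv with columns dist_name, ref_name, mos.

import csv
from pathlib import Path

from scipy.stats import pearsonr, spearmanr
from skimage.io import imread
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

ref_dir = Path("OIQA/reference")    # 16 source images (assumed layout)
dist_dir = Path("OIQA/distorted")   # 320 distorted images (assumed layout)

predictions = {"SSIM": [], "PSNR": []}
mos_scores = []

with open("OIQA/mos.csv", newline="") as f:
    for row in csv.DictReader(f):
        ref = imread(ref_dir / row["ref_name"])
        dist = imread(dist_dir / row["dist_name"])
        predictions["SSIM"].append(structural_similarity(ref, dist, channel_axis=-1))
        predictions["PSNR"].append(peak_signal_noise_ratio(ref, dist))
        mos_scores.append(float(row["mos"]))

# Benchmark statistics: SROCC measures monotonic agreement with the subjective
# ratings, PLCC measures linear agreement.
for name, preds in predictions.items():
    srocc, _ = spearmanr(preds, mos_scores)
    plcc, _ = pearsonr(preds, mos_scores)
    print(f"{name}: SROCC = {srocc:.3f}, PLCC = {plcc:.3f}")
```

Note that plain PSNR/SSIM treat the equirectangular projection as an ordinary 2D image; spherically weighted variants such as WS-PSNR are often preferred for omnidirectional content, which is one reason observations on such databases can differ from traditional IQA.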
Related papers
- ESIQA: Perceptual Quality Assessment of Vision-Pro-based Egocentric Spatial Images [70.68629648595677]
Egocentric spatial images and videos are emerging as a compelling form of stereoscopic XR content.
The Egocentric Spatial Images Quality Assessment Database (ESIQAD) is the first IQA database dedicated to egocentric spatial images.
ESIQAD includes 500 egocentric spatial images: 400 captured with the Apple Vision Pro and 100 generated via an iPhone's "Spatial Camera" app.
arXiv Detail & Related papers (2024-07-31T06:20:21Z)
- AIGCOIQA2024: Perceptual Quality Assessment of AI Generated Omnidirectional Images [70.42666704072964]
We establish a large-scale AI-generated omnidirectional image IQA database named AIGCOIQA2024.
A subjective IQA experiment is conducted to assess human visual preferences from three perspectives.
We conduct a benchmark experiment to evaluate the performance of state-of-the-art IQA models on our database.
arXiv Detail & Related papers (2024-04-01T10:08:23Z)
- Subjective and Objective Quality Assessment for in-the-Wild Computer Graphics Images [57.02760260360728]
We build a large-scale in-the-wild CGIQA database consisting of 6,000 CGIs (CGIQA-6k).
We propose an effective deep learning-based no-reference (NR) IQA model by utilizing both distortion and aesthetic quality representation.
Experimental results show that the proposed method outperforms all other state-of-the-art NR IQA methods on the constructed CGIQA-6k database.
arXiv Detail & Related papers (2023-03-14T16:32:24Z)
- ST360IQ: No-Reference Omnidirectional Image Quality Assessment with Spherical Vision Transformers [17.48330099000856]
We present a method for no-reference 360 image quality assessment.
Our approach predicts the quality of an omnidirectional image in a way that correlates with human-perceived image quality.
arXiv Detail & Related papers (2023-03-13T07:48:46Z)
- Perceptual Quality Assessment of Virtual Reality Videos in the Wild [53.94620993606658]
Existing panoramic video databases only consider synthetic distortions, assume fixed viewing conditions, and are limited in size.
We construct the VR Video Quality in the Wild (VRVQW) database, containing 502 user-generated videos with diverse content and distortion characteristics.
We conduct a formal psychophysical experiment to record the scanpaths and perceived quality scores from 139 participants under two different viewing conditions.
arXiv Detail & Related papers (2022-06-13T02:22:57Z)
- Confusing Image Quality Assessment: Towards Better Augmented Reality Experience [96.29124666702566]
We consider AR technology as the superimposition of virtual scenes and real scenes, and introduce visual confusion as its basic theory.
A ConFusing Image Quality Assessment (CFIQA) database is established, which includes 600 reference images and 300 distorted images generated by mixing reference images in pairs.
An objective metric termed CFIQA is also proposed to better evaluate the confusing image quality.
arXiv Detail & Related papers (2022-04-11T07:03:06Z)
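As a rough sketch of the "mixing reference images in pairs" operation described in the CFIQA entry above, assuming simple linear alpha blending (the database's actual mixing weights and pipeline may differ):

```python
# Hedged illustration of superimposing two reference images to simulate the
# visual confusion of see-through AR, as in the CFIQA entry above. The equal
# 0.5 weight and the file names are assumptions, not CFIQA's actual procedure.

from skimage.io import imread
from skimage.util import img_as_float

scene_a = img_as_float(imread("reference_a.png"))  # e.g. the "virtual" scene
scene_b = img_as_float(imread("reference_b.png"))  # e.g. the "real" scene

alpha = 0.5  # relative visibility of the two superimposed layers (assumed)
confused = alpha * scene_a + (1.0 - alpha) * scene_b  # linear superimposition
```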
- Cuid: A new study of perceived image quality and its subjective assessment [30.698984450985318]
We present a new study of image quality perception where subjective ratings were collected in a controlled lab environment.
We investigate how quality perception is affected by a combination of different categories of images and different types and levels of distortions.
arXiv Detail & Related papers (2020-09-28T13:14:45Z)