No-Reference Quality Assessment for Colored Point Cloud and Mesh Based
on Natural Scene Statistics
- URL: http://arxiv.org/abs/2107.02041v2
- Date: Thu, 8 Jul 2021 05:21:16 GMT
- Title: No-Reference Quality Assessment for Colored Point Cloud and Mesh Based
on Natural Scene Statistics
- Authors: Zicheng Zhang, Wei Sun, Xiongkuo Min, Tao Wang, Wei Lu, and
Guangtao Zhai
- Abstract summary: We propose an NSS-based no-reference quality assessment metric for colored 3D models.
Our method is mainly validated on the colored point cloud quality assessment database (SJTU-PCQA) and the colored mesh quality assessment database (CMDM).
- Score: 36.017914479449864
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To improve the viewer's quality of experience and optimize processing systems
in computer graphics applications, 3D quality assessment (3D-QA) has become
an important task in the multimedia area. Point cloud and mesh are the two most
widely used electronic representation formats of 3D models, the quality of
which is quite sensitive to operations like simplification and compression.
Therefore, many studies concerning point cloud quality assessment (PCQA) and
mesh quality assessment (MQA) have been carried out to measure the visual
quality degradations caused by lossy operations. However, a large proportion of
previous studies uses full-reference (FR) metrics, which may fail to predict
accurate quality levels when the reference 3D model is not available.
Furthermore, only a limited number of 3D-QA metrics take color features into
consideration, which significantly restricts their effectiveness and scope of
application. In many quality
assessment studies, natural scene statistics (NSS) have shown a good ability to
quantify the distortion of natural scenes to statistical parameters. Therefore,
we propose an NSS-based no-reference quality assessment metric for colored 3D
models. In this paper, quality-aware features are extracted from the aspects of
color and geometry directly from the 3D models. Then the statistical
parameters are estimated using different distribution models to describe the
characteristics of the 3D models. Our method is mainly validated on the colored
point cloud quality assessment database (SJTU-PCQA) and the colored mesh
quality assessment database (CMDM). The experimental results show that the
proposed method outperforms all the state-of-the-art NR 3D-QA metrics and
performs within an acceptable gap of the state-of-the-art FR 3D-QA metrics.
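The abstract does not give implementation details, but the core NSS idea it describes, estimating the parameters of a distribution model fitted to zero-mean feature samples and using those parameters as quality-aware features, can be sketched. The moment-matching generalized Gaussian fit below is the standard approach from 2D NSS metrics such as BRISQUE, shown here only as an illustration of the technique; it is not the paper's own code.

```python
import math
import numpy as np

def fit_ggd(x):
    """Moment-matching fit of a zero-mean generalized Gaussian distribution.

    Returns (alpha, sigma): the shape parameter and standard deviation.
    Uses the classical ratio r(a) = Gamma(1/a) * Gamma(3/a) / Gamma(2/a)^2,
    matched against E[x^2] / E[|x|]^2 over a dense grid of candidate shapes.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    sigma_sq = np.mean(x ** 2)
    e_abs = np.mean(np.abs(x))
    rho = sigma_sq / (e_abs ** 2 + 1e-12)

    # Grid search over candidate shape parameters.
    alphas = np.arange(0.2, 10.0, 0.001)
    r = np.array([math.gamma(1 / a) * math.gamma(3 / a) / math.gamma(2 / a) ** 2
                  for a in alphas])
    alpha = alphas[int(np.argmin(np.abs(r - rho)))]
    return alpha, math.sqrt(sigma_sq)
```

For Gaussian-distributed features the fitted shape comes out near 2, for Laplacian-distributed features near 1; distortions shift these fitted parameters, which is what makes them usable as quality-aware features.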
Related papers
- Activating Frequency and ViT for 3D Point Cloud Quality Assessment
without Reference [0.49157446832511503]
We propose a no-reference quality metric for a given 3D point cloud (3D-PC).
To map the input attributes to a quality score, we use a lightweight hybrid deep model combining a Deformable Convolutional Network (DCN) and Vision Transformers (ViT).
The results show that our approach outperforms state-of-the-art NR-PCQA measures and even some FR-PCQA measures on PointXR.
arXiv Detail & Related papers (2023-12-10T19:13:34Z) - Geometry-Aware Video Quality Assessment for Dynamic Digital Human [56.17852258306602]
We propose a novel no-reference (NR) geometry-aware video quality assessment method for DDH-QA challenge.
The proposed method achieves state-of-the-art performance on the DDH-QA database.
arXiv Detail & Related papers (2023-10-24T16:34:03Z) - GMS-3DQA: Projection-based Grid Mini-patch Sampling for 3D Model Quality
Assessment [82.93561866101604]
Previous projection-based 3DQA methods directly extract features from multi-projections to ensure quality prediction accuracy.
We propose a no-reference (NR) projection-based Grid Mini-patch Sampling 3D Model Quality Assessment (GMS-3DQA) method.
The proposed GMS-3DQA requires far fewer computational resources and less inference time than other 3D-QA methods.
arXiv Detail & Related papers (2023-06-09T03:53:12Z) - EEP-3DQA: Efficient and Effective Projection-based 3D Model Quality
Assessment [58.16279881415622]
It is difficult to perform an efficient module to extract quality-aware features of 3D models.
We develop a no-reference (NR) Efficient and Effective Projection-based 3D Model Quality Assessment (EEP-3DQA) method.
The proposed EEP-3DQA and EEP-3DQA-t (tiny version) achieve
arXiv Detail & Related papers (2023-02-17T06:14:37Z) - Blind Quality Assessment of 3D Dense Point Clouds with Structure Guided
Resampling [71.68672977990403]
We propose an objective point cloud quality index with Structure Guided Resampling (SGR) to automatically evaluate the perceptually visual quality of 3D dense point clouds.
The proposed SGR is a general-purpose blind quality assessment method without the assistance of any reference information.
arXiv Detail & Related papers (2022-08-31T02:42:55Z) - Subjective and Objective Visual Quality Assessment of Textured 3D Meshes [3.738515725866836]
We present a new subjective study to evaluate the perceptual quality of textured meshes, based on a paired comparison protocol.
We propose two new metrics for visual quality assessment of textured meshes, as optimized linear combinations of accurate geometry and texture quality measurements.
arXiv Detail & Related papers (2021-02-08T03:26:41Z) - Reduced Reference Perceptual Quality Model and Application to Rate
Control for 3D Point Cloud Compression [61.110938359555895]
In rate-distortion optimization, the encoder settings are determined by maximizing a reconstruction quality measure subject to a constraint on the bit rate.
We propose a linear perceptual quality model whose variables are the V-PCC geometry and color quantization parameters.
Subjective quality tests with 400 compressed 3D point clouds show that the proposed model correlates well with the mean opinion score.
We show that for the same target bit rate, rate-distortion optimization based on the proposed model offers higher perceptual quality than rate-distortion optimization based on exhaustive search with a point-to-point objective quality metric.
arXiv Detail & Related papers (2020-11-25T12:42:02Z)
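The last entry above describes a perceptual quality model that is linear in the V-PCC geometry and color quantization parameters. A least-squares fit of such a model can be sketched as follows; the QP values and MOS numbers below are synthetic and purely illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical calibration data: V-PCC geometry QP, color QP, and the
# mean opinion score (MOS) observed for each setting (synthetic values).
qp_geom = np.array([24.0, 28.0, 32.0, 36.0, 40.0])
qp_col  = np.array([30.0, 38.0, 34.0, 46.0, 42.0])
mos     = np.array([4.4, 3.6, 3.5, 2.2, 2.5])

# Linear model: MOS ~ a * QP_geom + b * QP_col + c
X = np.column_stack([qp_geom, qp_col, np.ones_like(qp_geom)])
coef, residuals, rank, _ = np.linalg.lstsq(X, mos, rcond=None)
pred = X @ coef

# Predict perceptual quality for an unseen (geometry QP, color QP) setting,
# which is what a rate-control loop would query for each candidate setting.
pred_new = np.array([30.0, 36.0, 1.0]) @ coef
```

In rate control, such a closed-form model lets the encoder pick the QP pair maximizing predicted MOS under the bit-rate constraint without exhaustively encoding every candidate.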
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.