No-Reference Point Cloud Quality Assessment via Domain Adaptation
- URL: http://arxiv.org/abs/2112.02851v1
- Date: Mon, 6 Dec 2021 08:20:40 GMT
- Title: No-Reference Point Cloud Quality Assessment via Domain Adaptation
- Authors: Qi Yang, Yipeng Liu, Siheng Chen, Yiling Xu, Jun Sun
- Abstract summary: We present a novel no-reference quality assessment metric, the image-transferred point cloud quality assessment (IT-PCQA), for 3D point clouds.
In particular, we treat natural images as the source domain and point clouds as the target domain, and infer point cloud quality via unsupervised adversarial domain adaptation.
Experimental results show that the proposed method achieves higher performance than traditional no-reference metrics and even results comparable to full-reference metrics.
- Score: 31.280188860021248
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel no-reference quality assessment metric, the
image-transferred point cloud quality assessment (IT-PCQA), for 3D point
clouds. For quality assessment, deep neural networks (DNNs) have shown
compelling performance in no-reference metric design. However, the most
challenging issue for no-reference PCQA is the lack of large-scale subjective
databases to drive robust networks. Our motivation is that the human visual
system (HVS) is the decision-maker regardless of the type of media being
assessed. Leveraging the rich subjective scores available for natural images,
we can learn the evaluation criteria of human perception via a DNN and
transfer that predictive capability to 3D point clouds. In particular, we
treat natural images as the source domain and point clouds as the target
domain, and infer point cloud quality via unsupervised adversarial domain
adaptation. To extract effective latent features and minimize the domain
discrepancy, we propose a hierarchical feature encoder and a
conditional-discriminative network. Considering that the ultimate purpose is
regressing an objective quality score, we introduce a novel conditional
cross-entropy loss in the conditional-discriminative network to penalize
negative samples that hinder the convergence of the quality regression
network. Experimental results show that the proposed method achieves higher
performance than traditional no-reference metrics and even results comparable
to full-reference metrics. The proposed method also suggests the feasibility
of assessing the quality of specific media content without expensive and
cumbersome subjective evaluations.
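To make the pipeline concrete, the sketch below illustrates the general idea of unsupervised adversarial domain adaptation for quality regression: a shared feature encoder, a quality regressor supervised only on the labeled image (source) domain, and a domain discriminator trained through a gradient-reversal layer so that image and point-cloud (target) features become indistinguishable. This is a minimal PyTorch-style sketch; the module names, layer sizes, and the weighting `lam` are assumptions for illustration, not the authors' hierarchical encoder or conditional-discriminative network, and it uses a plain domain cross-entropy rather than the paper's conditional cross-entropy loss.

```python
# Minimal, illustrative PyTorch sketch of adversarial domain adaptation for
# quality regression. Module names, layer sizes, and the loss weighting `lam`
# are assumptions for illustration only, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients backward."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class FeatureEncoder(nn.Module):
    """Toy stand-in for the hierarchical feature encoder, operating on 2D inputs
    (natural images or rendered projections of point clouds)."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)


class QualityRegressor(nn.Module):
    """Predicts a scalar quality score from encoder features."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, f):
        return self.head(f).squeeze(-1)


class DomainDiscriminator(nn.Module):
    """Classifies whether features come from the source (image) or target
    (point cloud) domain; reversed gradients push the encoder to align domains."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, f, lam: float = 1.0):
        return self.head(GradReverse.apply(f, lam))


def training_step(encoder, regressor, discriminator, src_img, src_mos, tgt_img, lam=0.1):
    """One optimization step: supervised score regression on the labeled source
    domain plus an adversarial loss that makes source/target features indistinguishable."""
    f_src, f_tgt = encoder(src_img), encoder(tgt_img)
    reg_loss = F.mse_loss(regressor(f_src), src_mos)

    feats = torch.cat([f_src, f_tgt], dim=0)
    labels = torch.cat([torch.zeros(len(f_src)), torch.ones(len(f_tgt))]).long().to(feats.device)
    adv_loss = F.cross_entropy(discriminator(feats, lam), labels)
    return reg_loss + adv_loss
```

The gradient-reversal trick lets a single backward pass train the discriminator while simultaneously pushing the encoder toward domain-invariant features; the paper's conditional cross-entropy loss refines this adversarial term so that negative samples that hinder the quality regressor are penalized.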
Related papers
- Contrastive Pre-Training with Multi-View Fusion for No-Reference Point Cloud Quality Assessment [49.36799270585947]
No-reference point cloud quality assessment (NR-PCQA) aims to automatically evaluate the perceptual quality of distorted point clouds without available reference.
We propose a novel contrastive pre-training framework tailored for PCQA (CoPA).
Our method outperforms the state-of-the-art PCQA methods on popular benchmarks.
arXiv Detail & Related papers (2024-03-15T07:16:07Z)
- Simple Baselines for Projection-based Full-reference and No-reference Point Cloud Quality Assessment [60.2709006613171]
We propose simple baselines for projection-based point cloud quality assessment (PCQA).
We use multi-projections obtained via a common cube-like projection process from the point clouds for both full-reference (FR) and no-reference (NR) PCQA tasks.
Taking part in the ICIP 2023 PCVQA Challenge, we succeeded in achieving the top spot in four out of the five competition tracks.
arXiv Detail & Related papers (2023-10-26T04:42:57Z)
- Reduced-Reference Quality Assessment of Point Clouds via Content-Oriented Saliency Projection [17.983188216548005]
Dense 3D point clouds are increasingly exploited to represent visual objects in place of traditional images or videos.
We propose a novel and efficient Reduced-Reference quality metric for point clouds.
arXiv Detail & Related papers (2023-01-18T18:00:29Z)
- GPA-Net: No-Reference Point Cloud Quality Assessment with Multi-task Graph Convolutional Network [35.381247959766505]
We propose a novel no-reference PCQA metric named the Graph convolutional PCQA network (GPA-Net).
To extract effective features for PCQA, we propose a new graph convolution kernel, i.e., GPAConv, which attentively captures the perturbation of structure and texture.
Experimental results on two independent databases show that GPA-Net achieves the best performance compared to the state-of-the-art no-reference PCQA metrics.
arXiv Detail & Related papers (2022-10-29T03:06:55Z)
- Blind Quality Assessment of 3D Dense Point Clouds with Structure Guided Resampling [71.68672977990403]
We propose an objective point cloud quality index with Structure Guided Resampling (SGR) to automatically evaluate the perceptually visual quality of 3D dense point clouds.
The proposed SGR is a general-purpose blind quality assessment method without the assistance of any reference information.
arXiv Detail & Related papers (2022-08-31T02:42:55Z)
- StyleAM: Perception-Oriented Unsupervised Domain Adaption for Non-reference Image Quality Assessment [23.289183622856704]
We propose an effective perception-oriented unsupervised domain adaptation method StyleAM for NR-IQA.
StyleAM transfers sufficient knowledge from label-rich source domain data to label-free target domain images via Style Alignment and Mixup.
Experiments on two typical cross-domain settings have demonstrated the effectiveness of our proposed StyleAM on NR-IQA.
arXiv Detail & Related papers (2022-07-29T05:51:18Z)
- Image Quality Assessment using Contrastive Learning [50.265638572116984]
We train a deep Convolutional Neural Network (CNN) using a contrastive pairwise objective to solve an auxiliary problem.
We show through extensive experiments that CONTRIQUE achieves competitive performance when compared to state-of-the-art NR image quality models.
Our results suggest that powerful quality representations with perceptual relevance can be obtained without requiring large labeled subjective image quality datasets.
arXiv Detail & Related papers (2021-10-25T21:01:00Z)
- No-Reference Image Quality Assessment by Hallucinating Pristine Features [24.35220427707458]
We propose a no-reference (NR) image quality assessment (IQA) method via feature level pseudo-reference (PR) hallucination.
The effectiveness of our proposed method is demonstrated on four popular IQA databases.
arXiv Detail & Related papers (2021-08-09T16:48:34Z)
- Unpaired Image Enhancement with Quality-Attention Generative Adversarial Network [92.01145655155374]
We propose a quality attention generative adversarial network (QAGAN) trained on unpaired data.
The key novelty of the proposed QAGAN lies in the quality-attention module (QAM) injected into the generator.
Our proposed method achieves better performance in both objective and subjective evaluations.
arXiv Detail & Related papers (2020-12-30T05:57:20Z)
- Reduced Reference Perceptual Quality Model and Application to Rate Control for 3D Point Cloud Compression [61.110938359555895]
In rate-distortion optimization, the encoder settings are determined by maximizing a reconstruction quality measure subject to a constraint on the bit rate.
We propose a linear perceptual quality model whose variables are the V-PCC geometry and color quantization parameters.
Subjective quality tests with 400 compressed 3D point clouds show that the proposed model correlates well with the mean opinion score.
We show that for the same target bit rate, rate-distortion optimization based on the proposed model offers higher perceptual quality than rate-distortion optimization based on exhaustive search with a point-to-point objective quality metric (a rough sketch of this idea appears after this list).
arXiv Detail & Related papers (2020-11-25T12:42:02Z)
- No-reference Screen Content Image Quality Assessment with Unsupervised Domain Adaptation [37.1611601418026]
We develop the first unsupervised domain adaptation based no-reference quality assessment method for screen content images (SCIs).
Inspired by the transferability of pair-wise relationships, the proposed quality measure is built on the philosophy of improving transferability and discriminability simultaneously.
Our method achieves higher performance across different source-target settings using a lightweight convolutional neural network.
arXiv Detail & Related papers (2020-08-19T17:31:23Z)
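The rate-control entry above (Reduced Reference Perceptual Quality Model and Application to Rate Control for 3D Point Cloud Compression) describes a linear perceptual quality model in the V-PCC geometry and color quantization parameters, used to choose encoder settings under a bit-rate constraint. The sketch below illustrates that idea only in spirit: the coefficients, the rate estimate, the QP range, and the target rate are hypothetical placeholders, not values from the paper.

```python
# Illustrative sketch (not the paper's exact model): a linear quality model in
# the V-PCC geometry/color quantization parameters, plus a brute-force
# rate-constrained selection of encoder settings. All numbers are placeholders.
from itertools import product


def predicted_quality(qp_geom: int, qp_color: int,
                      a: float = 100.0, b: float = -1.2, c: float = -0.8) -> float:
    """Linear quality model: higher QPs (coarser quantization) lower the score."""
    return a + b * qp_geom + c * qp_color


def estimated_rate(qp_geom: int, qp_color: int) -> float:
    """Placeholder bit-rate estimate (Mbps); a real encoder would supply this."""
    return 60.0 / qp_geom + 40.0 / qp_color


def pick_encoder_settings(target_rate: float, qp_range=range(20, 52)):
    """Maximize predicted quality subject to the bit-rate constraint."""
    feasible = [(g, c) for g, c in product(qp_range, qp_range)
                if estimated_rate(g, c) <= target_rate]
    return max(feasible, key=lambda gc: predicted_quality(*gc)) if feasible else None


print(pick_encoder_settings(target_rate=3.0))
```

The point of such a closed-form model is that the constrained search over quantization parameters becomes cheap, compared with exhaustively encoding candidate settings and measuring distortion with a point-to-point metric.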