A Brief Survey on Adaptive Video Streaming Quality Assessment
- URL: http://arxiv.org/abs/2202.12987v1
- Date: Fri, 25 Feb 2022 21:38:14 GMT
- Title: A Brief Survey on Adaptive Video Streaming Quality Assessment
- Authors: Wei Zhou, Xiongkuo Min, Hong Li, Qiuping Jiang
- Abstract summary: Quality of experience (QoE) assessment for adaptive video streaming plays a significant role in advanced network management systems.
We analyze and compare different variations of objective QoE assessment models with or without using machine learning techniques for adaptive video streaming.
We find that existing video streaming QoE assessment models still have limited performance, which makes them difficult to apply in practical communication systems.
- Score: 30.253712568568876
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quality of experience (QoE) assessment for adaptive video streaming plays a significant role in advanced network management systems. It is especially challenging for dynamic adaptive streaming over HTTP (DASH), which has increasingly complex characteristics, including additional playback issues. In this paper, we provide a brief overview of adaptive video streaming quality assessment. Based on our review of related works, we analyze and compare different variations of objective QoE assessment models, with and without machine learning techniques, for adaptive video streaming. Through the performance analysis, we observe that hybrid models perform better than both quality-of-service (QoS) driven QoE approaches and signal fidelity measurements. Moreover, the machine learning-based model slightly outperforms the model without machine learning under the same setting. In addition, we find that existing video streaming QoE assessment models still have limited performance, which makes them difficult to apply in practical communication systems. Therefore, building on the success of deep learned feature representations for traditional video quality prediction, we also apply an off-the-shelf deep convolutional neural network (DCNN) to evaluate the perceptual quality of streaming videos, taking the spatio-temporal properties of streaming videos into consideration. Experiments demonstrate its superiority, which sheds light on the future development of deep learning frameworks specifically designed for adaptive video streaming quality assessment. We believe this survey can serve as a guideline for QoE assessment of adaptive video streaming.
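As a concrete illustration of the DCNN-based approach the abstract describes, the following is a minimal sketch rather than the survey's actual implementation: it assumes a pretrained torchvision ResNet-50 as the off-the-shelf backbone, simple mean/std pooling over frames as a stand-in for spatio-temporal modeling, and a small regression head. All of these choices are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: frame-level features from an off-the-shelf DCNN, pooled over
# time, then regressed to a quality score. Backbone, pooling, and head are
# illustrative assumptions, not the authors' actual model.
import torch
import torch.nn as nn
from torchvision import models


class StreamingVQAStub(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        # Drop the classification layer; keep the 2048-d pooled features.
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        for p in self.features.parameters():
            p.requires_grad = False  # off-the-shelf, frozen backbone
        # Mean + std over frames is a simple stand-in for spatio-temporal pooling.
        self.regressor = nn.Sequential(
            nn.Linear(2 * 2048, 256), nn.ReLU(), nn.Linear(256, 1)
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (T, 3, H, W), one decoded and normalized streaming video
        with torch.no_grad():
            f = self.features(frames).flatten(1)   # (T, 2048) per-frame features
        pooled = torch.cat([f.mean(0), f.std(0)])  # temporal pooling -> (4096,)
        return self.regressor(pooled)              # predicted quality score


if __name__ == "__main__":
    model = StreamingVQAStub().eval()
    dummy_clip = torch.rand(16, 3, 224, 224)       # 16 frames at ImageNet size
    print(model(dummy_clip).item())
```

In practice, the regression head would be trained against subjective QoE or MOS labels from a streaming video quality database; the frozen backbone only supplies generic perceptual features.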
Related papers
- Satellite Streaming Video QoE Prediction: A Real-World Subjective Database and Network-Level Prediction Models [59.061552498630874]
We introduce the LIVE-Viasat Real-World Satellite QoE Database.
This database consists of 179 videos recorded from real-world streaming services affected by various authentic distortion patterns.
We demonstrate the usefulness of this unique new resource by evaluating the efficacy of QoE-prediction models on it.
We also created a new model that maps the network parameters to predicted human perception scores, which can be used by ISPs to optimize the video streaming quality of their networks; a minimal sketch of this kind of network-parameter-to-QoE mapping appears after this list.
arXiv Detail & Related papers (2024-10-17T18:22:50Z) - Subjective and Objective Quality-of-Experience Evaluation Study for Live Video Streaming [51.712182539961375]
We conduct a comprehensive study of subjective and objective QoE evaluations for live video streaming.
For the subjective QoE study, we introduce the first live video streaming QoE dataset, TaoLive QoE.
A human study was conducted to derive subjective QoE scores of videos in the TaoLive QoE dataset.
We propose an end-to-end QoE evaluation model, Tao-QoE, which integrates multi-scale semantic features and optical flow-based motion features.
arXiv Detail & Related papers (2024-09-26T07:22:38Z) - Benchmarking AIGC Video Quality Assessment: A Dataset and Unified Model [54.69882562863726]
We try to systematically investigate the AIGC-VQA problem from both subjective and objective quality assessment perspectives.
We evaluate the perceptual quality of AIGC videos from three dimensions: spatial quality, temporal quality, and text-to-video alignment.
We propose a Unified Generated Video Quality assessment (UGVQ) model to comprehensively and accurately evaluate the quality of AIGC videos.
arXiv Detail & Related papers (2024-07-31T07:54:26Z) - Modular Blind Video Quality Assessment [33.657933680973194]
Blind video quality assessment (BVQA) plays a pivotal role in evaluating and improving the viewing experience of end-users across a wide range of video-based platforms and services.
In this paper, we propose a modular BVQA model and a method of training it to improve its modularity.
arXiv Detail & Related papers (2024-02-29T15:44:00Z) - CONVIQT: Contrastive Video Quality Estimator [63.749184706461826]
Perceptual video quality assessment (VQA) is an integral component of many streaming and video sharing platforms.
Here we consider the problem of learning perceptually relevant video quality representations in a self-supervised manner.
Our results indicate that compelling representations with perceptual bearing can be obtained using self-supervised learning.
arXiv Detail & Related papers (2022-06-29T15:22:01Z) - RAPIQUE: Rapid and Accurate Video Quality Prediction of User Generated Content [44.03188436272383]
We introduce an effective and efficient video quality model for user generated content, which we dub the Rapid and Accurate Video Quality Evaluator (RAPIQUE).
RAPIQUE combines and leverages the advantages of both quality-aware scene statistics features and semantics-aware deep convolutional features.
Our experimental results on recent large-scale video quality databases show that RAPIQUE delivers top performances on all the datasets at a considerably lower computational expense.
arXiv Detail & Related papers (2021-01-26T17:23:46Z) - Study on the Assessment of the Quality of Experience of Streaming Video [117.44028458220427]
In this paper, the influence of various objective factors on the subjective estimation of the QoE of streaming video is studied.
The paper presents standard and handcrafted features and shows their correlations and significance p-values.
We use the SQoE-III database, so far the largest and most realistic of its kind.
arXiv Detail & Related papers (2020-12-08T18:46:09Z) - UGC-VQA: Benchmarking Blind Video Quality Assessment for User Generated Content [59.13821614689478]
Blind quality prediction of in-the-wild videos is quite challenging, since the quality degradations of content are unpredictable, complicated, and often commingled.
Here we contribute to advancing the problem by conducting a comprehensive evaluation of leading VQA models.
By employing a feature selection strategy on top of leading VQA model features, we are able to extract 60 of the 763 statistical features used by the leading models to build a fusion-based model, VIDEVAL.
Our experimental results show that VIDEVAL achieves state-of-the-art performance at considerably lower computational cost than other leading models.
arXiv Detail & Related papers (2020-05-29T00:39:20Z)
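Regarding the network-parameter-to-QoE mapping mentioned in the LIVE-Viasat entry above, the following is a hedged, illustrative sketch using a generic regressor; the feature set, the toy values, and the model choice are all hypothetical placeholders rather than that paper's actual pipeline.

```python
# Illustrative sketch of mapping network-level parameters to a predicted QoE
# (MOS) score with a generic regressor. Features and values are made up.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical per-session network features:
# [mean throughput (Mbps), rebuffering ratio, bitrate switches, startup delay (s)]
X = np.array([
    [12.0, 0.00, 1, 1.2],
    [ 3.5, 0.08, 5, 4.0],
    [ 8.0, 0.02, 2, 2.1],
    [ 1.8, 0.15, 7, 6.5],
])
y = np.array([4.3, 2.6, 3.8, 1.9])  # made-up subjective MOS labels

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)
print(model.predict([[6.0, 0.03, 3, 2.5]]))  # predicted QoE for a new session
```

With real subjective data, such a regressor lets an ISP estimate perceived quality directly from measurable network-side parameters, without access to the decoded video signal.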