BVI-VFI: A Video Quality Database for Video Frame Interpolation
- URL: http://arxiv.org/abs/2210.00823v3
- Date: Sat, 21 Oct 2023 09:31:36 GMT
- Title: BVI-VFI: A Video Quality Database for Video Frame Interpolation
- Authors: Duolikun Danier, Fan Zhang, David Bull
- Abstract summary: Video frame interpolation (VFI) is a fundamental research topic in video processing.
BVI-VFI contains 540 distorted sequences generated by applying five commonly used VFI algorithms.
We benchmarked the performance of 33 classic and state-of-the-art objective image/video quality metrics on the new database.
- Score: 3.884484241124158
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Video frame interpolation (VFI) is a fundamental research topic in video
processing, which is currently attracting increased attention across the
research community. While the development of more advanced VFI algorithms has
been extensively researched, there remains little understanding of how humans
perceive the quality of interpolated content and how well existing objective
quality assessment methods perform when measuring the perceived quality. In
order to narrow this research gap, we have developed a new video quality
database named BVI-VFI, which contains 540 distorted sequences generated by
applying five commonly used VFI algorithms to 36 diverse source videos with
various spatial resolutions and frame rates. We collected more than 10,800
quality ratings for these videos through a large scale subjective study
involving 189 human subjects. Based on the collected subjective scores, we
further analysed the influence of VFI algorithms and frame rates on the
perceptual quality of interpolated videos. Moreover, we benchmarked the
performance of 33 classic and state-of-the-art objective image/video quality
metrics on the new database, and demonstrated the urgent requirement for more
accurate bespoke quality assessment methods for VFI. To facilitate further
research in this area, we have made BVI-VFI publicly available at
https://github.com/danier97/BVI-VFI-database.
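As context for the benchmark described above, the standard way to evaluate an objective quality metric on a database such as BVI-VFI is to correlate its predictions with the subjective scores. The following is a minimal sketch (not the authors' code); the metric and MOS values shown are hypothetical.
```python
# Minimal sketch: correlating an objective quality metric's predictions with
# subjective mean opinion scores (MOS), as is standard when benchmarking
# metrics on a subjective quality database.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def benchmark_metric(metric_scores, mos):
    """Return SROCC and PLCC between metric predictions and subjective MOS."""
    metric_scores = np.asarray(metric_scores, dtype=float)
    mos = np.asarray(mos, dtype=float)
    srocc, _ = spearmanr(metric_scores, mos)  # rank (monotonic) correlation
    plcc, _ = pearsonr(metric_scores, mos)    # linear correlation
    return srocc, plcc

# Hypothetical example: PSNR values for a few distorted sequences vs. their MOS.
psnr_scores = [32.1, 28.4, 35.0, 30.2, 26.7]
mos_scores = [3.8, 2.9, 4.2, 3.1, 2.4]
print(benchmark_metric(psnr_scores, mos_scores))
```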
Related papers
- VQA$^2$: Visual Question Answering for Video Quality Assessment [76.81110038738699]
Video Quality Assessment (VQA) is a classic field in low-level visual perception.
Recent studies in the image domain have demonstrated that Visual Question Answering (VQA) can markedly enhance low-level visual quality evaluation.
We introduce the VQA2 Instruction dataset - the first visual question answering instruction dataset that focuses on video quality assessment.
The VQA2 series models interleave visual and motion tokens to enhance the perception of spatial-temporal quality details in videos.
arXiv Detail & Related papers (2024-11-06T09:39:52Z) - AIM 2024 Challenge on Compressed Video Quality Assessment: Methods and Results [120.95863275142727]
This paper presents the results of the Compressed Video Quality Assessment challenge, held in conjunction with the Advances in Image Manipulation (AIM) workshop at ECCV 2024.
The challenge aimed to evaluate the performance of VQA methods on a diverse dataset of 459 videos encoded with 14 codecs of various compression standards.
arXiv Detail & Related papers (2024-08-21T20:32:45Z) - BVI-UGC: A Video Quality Database for User-Generated Content Transcoding [25.371693436870906]
We present a new video quality database, BVI-UGC, for user-generated content (UGC) transcoding.
BVI-UGC contains 60 (non-pristine) reference videos and 1,080 test sequences.
We benchmarked the performance of 10 full-reference and 11 no-reference quality metrics.
arXiv Detail & Related papers (2024-08-13T19:30:12Z) - Benchmarking AIGC Video Quality Assessment: A Dataset and Unified Model [54.69882562863726]
We try to systematically investigate the AIGC-VQA problem from both subjective and objective quality assessment perspectives.
We evaluate the perceptual quality of AIGC videos from three dimensions: spatial quality, temporal quality, and text-to-video alignment.
We propose a Unified Generated Video Quality assessment (UGVQ) model to comprehensively and accurately evaluate the quality of AIGC videos.
arXiv Detail & Related papers (2024-07-31T07:54:26Z) - RMT-BVQA: Recurrent Memory Transformer-based Blind Video Quality Assessment for Enhanced Video Content [7.283653823423298]
We propose a novel blind deep video quality assessment (VQA) method specifically for enhanced video content.
It employs a new Recurrent Memory Transformer (RMT) based network architecture to obtain video quality representations.
The extracted quality representations are then combined through linear regression to generate video-level quality indices.
arXiv Detail & Related papers (2024-05-14T14:01:15Z) - KVQ: Kwai Video Quality Assessment for Short-form Videos [24.5291786508361]
We establish the first large-scale Kaleidoscope short Video database for Quality assessment, KVQ, which comprises 600 user-uploaded short videos and 3600 processed videos.
We propose the first short-form video quality evaluator, i.e., KSVQE, which enables the quality evaluator to identify the quality-determined semantics with the content understanding of large vision language models.
arXiv Detail & Related papers (2024-02-11T14:37:54Z) - Perceptual Video Quality Assessment: A Survey [63.61214597655413]
Perceptual video quality assessment plays a vital role in the field of video processing.
Various subjective and objective video quality assessment studies have been conducted over the past two decades.
This survey provides an up-to-date and comprehensive review of these video quality assessment studies.
arXiv Detail & Related papers (2024-02-05T16:13:52Z) - Towards Explainable In-the-Wild Video Quality Assessment: A Database and a Language-Prompted Approach [52.07084862209754]
We collect over two million opinions on 4,543 in-the-wild videos on 13 dimensions of quality-related factors.
Specifically, we ask the subjects to label among a positive, a negative, and a neutral choice for each dimension.
These explanation-level opinions allow us to measure the relationships between specific quality factors and abstract subjective quality ratings.
arXiv Detail & Related papers (2023-05-22T05:20:23Z) - FloLPIPS: A Bespoke Video Quality Metric for Frame Interpolation [4.151439675744056]
We present a bespoke full reference video quality metric for VFI, FloLPIPS, that builds on the popular perceptual image quality metric, LPIPS.
FloLPIPS shows superior correlation performance with subjective ground truth over 12 popular quality assessors.
arXiv Detail & Related papers (2022-07-17T09:07:33Z) - A Subjective Quality Study for Video Frame Interpolation [4.151439675744056]
We describe a subjective quality study for video frame interpolation (VFI) based on a newly developed video database, BVI-VFI.
BVI-VFI contains 36 reference sequences at three different frame rates and 180 distorted videos generated using five conventional and learning-based VFI algorithms.
arXiv Detail & Related papers (2022-02-15T21:13:23Z) - Subjective and Objective Quality Assessment of High Frame Rate Videos [60.970191379802095]
High frame rate (HFR) videos are becoming increasingly common with the tremendous popularity of live, high-action streaming content such as sports.
The LIVE-YT-HFR dataset comprises 480 videos spanning 6 different frame rates, obtained from 16 diverse contents.
To obtain subjective labels on the videos, we conducted a human study yielding 19,000 human quality ratings obtained from a pool of 85 human subjects.
arXiv Detail & Related papers (2020-07-22T19:11:42Z)