A Unified Pyramid Recurrent Network for Video Frame Interpolation
- URL: http://arxiv.org/abs/2211.03456v2
- Date: Thu, 23 Mar 2023 04:14:45 GMT
- Title: A Unified Pyramid Recurrent Network for Video Frame Interpolation
- Authors: Xin Jin, Longhai Wu, Jie Chen, Youxin Chen, Jayoon Koo, Cheul-hee Hahm
- Abstract summary: We present UPR-Net, a novel Unified Pyramid Recurrent Network for frame synthesis.
We show that our iterative synthesis strategy can significantly improve the robustness of frame interpolation on large motion cases.
- Score: 10.859715542859773
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Flow-guided synthesis provides a common framework for frame interpolation,
where optical flow is estimated to guide the synthesis of intermediate frames
between consecutive inputs. In this paper, we present UPR-Net, a novel Unified
Pyramid Recurrent Network for frame interpolation. Cast in a flexible pyramid
framework, UPR-Net exploits lightweight recurrent modules for both
bi-directional flow estimation and intermediate frame synthesis. At each
pyramid level, it leverages estimated bi-directional flow to generate
forward-warped representations for frame synthesis; across pyramid levels, it
enables iterative refinement for both optical flow and intermediate frame. In
particular, we show that our iterative synthesis strategy can significantly
improve the robustness of frame interpolation on large motion cases. Despite
being extremely lightweight (1.7M parameters), our base version of UPR-Net
achieves excellent performance on a large range of benchmarks. Code and trained
models of our UPR-Net series are available at:
https://github.com/srcn-ivl/UPR-Net.
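The abstract says UPR-Net uses estimated bi-directional flow to generate forward-warped representations at each pyramid level. As a hedged illustration of what forward warping (splatting) means, here is a minimal NumPy sketch; the function name, shapes, and averaging of colliding pixels are illustrative assumptions, not the authors' implementation (which operates on learned feature maps, typically with softmax splatting).

```python
import numpy as np

def forward_warp(img, flow):
    """Splat each source pixel to its flow-displaced target location.

    img:  (H, W) grayscale frame (a real model warps feature maps).
    flow: (H, W, 2) per-pixel displacement (dx, dy) toward the target time.
    Returns the warped frame; colliding pixels are averaged.
    """
    H, W = img.shape
    warped = np.zeros((H, W), dtype=np.float64)
    weight = np.zeros((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            tx = int(round(x + flow[y, x, 0]))
            ty = int(round(y + flow[y, x, 1]))
            if 0 <= tx < W and 0 <= ty < H:
                warped[ty, tx] += img[y, x]
                weight[ty, tx] += 1.0
    mask = weight > 0
    warped[mask] /= weight[mask]  # average where multiple pixels collide
    return warped

# Shift a small frame one pixel to the right.
frame = np.arange(16, dtype=np.float64).reshape(4, 4)
flow = np.zeros((4, 4, 2))
flow[..., 0] = 1.0  # dx = +1 everywhere
out = forward_warp(frame, flow)
```

Unlike backward warping, splatting can leave holes (pixels no source maps to) and collisions, which is why pyramid refinement and learned synthesis modules are used on top of it.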
Related papers
- Joint Reference Frame Synthesis and Post Filter Enhancement for Versatile Video Coding [53.703894799335735]
This paper presents the joint reference frame synthesis (RFS) and post-processing filter enhancement (PFE) for Versatile Video Coding (VVC)
Both RFS and PFE utilize the Space-Time Enhancement Network (STENet), which receives two input frames with artifacts and produces two enhanced frames with suppressed artifacts, along with an intermediate synthesized frame.
To reduce inference complexity, we propose joint inference of RFS and PFE (JISE), achieved through a single execution of STENet.
arXiv Detail & Related papers (2024-04-28T03:11:44Z) - Dynamic Frame Interpolation in Wavelet Domain [57.25341639095404]
Video frame interpolation is an important low-level computer vision task, which can increase frame rate for a more fluent visual experience.
Existing methods have achieved great success by employing advanced motion models and synthesis networks.
WaveletVFI can reduce computation by up to 40% while maintaining similar accuracy, making it more efficient than other state-of-the-art methods.
arXiv Detail & Related papers (2023-09-07T06:41:15Z) - AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation [80.33846577924363]
We present All-Pairs Multi-Field Transforms (AMT), a new network architecture for video frame interpolation.
It is based on two essential designs. First, we build bidirectional correlation volumes for all pairs of pixels, and use the predicted bilateral flows to retrieve correlations.
Second, we derive multiple groups of fine-grained flow fields from one pair of updated coarse flows for performing backward warping on the input frames separately.
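The two AMT designs above can be sketched roughly: correlate every feature vector in one frame with every one in the other, then retrieve the correlation at a flow-predicted location. This is a hedged, illustrative sketch of the general all-pairs correlation idea (as in RAFT-style matching), not the paper's code; shapes, names, and nearest-neighbour lookup are assumptions.

```python
import numpy as np

def all_pairs_correlation(f0, f1):
    """f0, f1: (H, W, C) feature maps -> (H, W, H, W) correlation volume."""
    H, W, C = f0.shape
    v0 = f0.reshape(H * W, C)
    v1 = f1.reshape(H * W, C)
    corr = v0 @ v1.T / np.sqrt(C)  # scaled dot-product similarity of every pair
    return corr.reshape(H, W, H, W)

def lookup(corr, y, x, flow):
    """Retrieve the correlation at the flow-displaced position (nearest pixel)."""
    H, W = corr.shape[2:]
    ty = min(max(int(round(y + flow[1])), 0), H - 1)
    tx = min(max(int(round(x + flow[0])), 0), W - 1)
    return corr[y, x, ty, tx]

rng = np.random.default_rng(0)
f0 = rng.standard_normal((4, 4, 8))
f1 = np.roll(f0, shift=1, axis=1)  # frame 1 is frame 0 shifted one pixel right
corr = all_pairs_correlation(f0, f1)
# Looking up with the correct flow (one pixel right) hits the matching feature.
score = lookup(corr, 2, 1, flow=(1.0, 0.0))
```

In a real interpolation network the retrieved correlations condition iterative flow updates; here the lookup merely shows that the correct displacement recovers the self-similarity of the matching feature vector.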
arXiv Detail & Related papers (2023-04-19T16:18:47Z) - Progressive Motion Context Refine Network for Efficient Video Frame
Interpolation [10.369068266836154]
Flow-based frame interpolation methods have achieved great success by first modeling optical flow between target and input frames, and then building a synthesis network for target frame generation.
We propose a novel Progressive Motion Context Refine Network (PMCRNet) to predict motion fields and image context jointly for higher efficiency.
Experiments on multiple benchmarks show that the proposed approach not only achieves favorable qualitative and quantitative results but also significantly reduces model size and running time.
arXiv Detail & Related papers (2022-11-11T06:29:03Z) - Meta-Interpolation: Time-Arbitrary Frame Interpolation via Dual
Meta-Learning [65.85319901760478]
We consider processing different time-steps with adaptively generated convolutional kernels in a unified way with the help of meta-learning.
We develop a dual meta-learned frame interpolation framework to synthesize intermediate frames with the guidance of context information and optical flow.
arXiv Detail & Related papers (2022-07-27T17:36:23Z) - Enhanced Bi-directional Motion Estimation for Video Frame Interpolation [0.05541644538483946]
We present a novel yet effective algorithm for motion-based video frame interpolation.
Our method achieves excellent performance on a broad range of video frame interpolation benchmarks.
arXiv Detail & Related papers (2022-06-17T06:08:43Z) - IFRNet: Intermediate Feature Refine Network for Efficient Frame
Interpolation [44.04110765492441]
We devise an efficient encoder-decoder based network, termed IFRNet, for fast intermediate frame synthesis.
Experiments on various benchmarks demonstrate the excellent performance and fast inference speed of the proposed approach.
arXiv Detail & Related papers (2022-05-29T10:18:18Z) - Asymmetric Bilateral Motion Estimation for Video Frame Interpolation [50.44508853885882]
We propose a novel video frame interpolation algorithm based on asymmetric bilateral motion estimation (ABME).
First, we predict symmetric bilateral motion fields to interpolate an anchor frame.
Second, we estimate asymmetric bilateral motion fields from the anchor frame to the input frames.
Third, we use the asymmetric fields to warp the input frames backward and reconstruct the intermediate frame.
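The backward warping in the third step can be sketched simply: for each target pixel, sample the input frame at the flow-displaced coordinate. This is an illustrative NumPy sketch with nearest-neighbour sampling for brevity; real implementations use bilinear interpolation (e.g. `grid_sample` in PyTorch), and the function name and shapes are assumptions.

```python
import numpy as np

def backward_warp(img, flow):
    """For each target pixel (y, x), fetch the source pixel at (y+dy, x+dx).

    img:  (H, W) frame to sample from.
    flow: (H, W, 2) per-pixel displacement (dx, dy) into the source frame.
    Out-of-bounds samples are left at zero.
    """
    H, W = img.shape
    out = np.zeros_like(img)
    for y in range(H):
        for x in range(W):
            sx = int(round(x + flow[y, x, 0]))
            sy = int(round(y + flow[y, x, 1]))
            if 0 <= sx < W and 0 <= sy < H:
                out[y, x] = img[sy, sx]
    return out

frame = np.arange(16, dtype=np.float64).reshape(4, 4)
flow = np.zeros((4, 4, 2))
flow[..., 1] = 1.0  # every target pixel samples one row below
shifted = backward_warp(frame, flow)
```

Because every target pixel pulls exactly one value, backward warping leaves no holes, which is why it pairs naturally with flows estimated *from* the intermediate (anchor) frame as ABME does.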
arXiv Detail & Related papers (2021-08-15T21:11:35Z) - EA-Net: Edge-Aware Network for Flow-based Video Frame Interpolation [101.75999290175412]
We propose to reduce the image blur and get the clear shape of objects by preserving the edges in the interpolated frames.
The proposed Edge-Aware Network (EANet) integrates the edge information into the frame interpolation task.
Three edge-aware mechanisms are developed to emphasize the frame edges in estimating flow maps.
arXiv Detail & Related papers (2021-05-17T08:44:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.