Occlusion Aware Unsupervised Learning of Optical Flow From Video
- URL: http://arxiv.org/abs/2003.01960v1
- Date: Wed, 4 Mar 2020 09:08:03 GMT
- Title: Occlusion Aware Unsupervised Learning of Optical Flow From Video
- Authors: Jianfeng Li, Junqiao Zhao, Tiantian Feng, Chen Ye, Lu Xiong
- Abstract summary: Occlusion is caused by the movement of an object or the movement of the camera.
We propose an unsupervised learning method for estimating the optical flow between video frames.
- Score: 11.505240606712501
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose an unsupervised learning method for estimating the
optical flow between video frames, with a particular focus on the occlusion problem.
Occlusion, caused by the motion of an object or of the camera, occurs when certain
pixels are visible in one video frame but not in adjacent frames. Because pixels in
the occluded area have no correspondence in the adjacent frame, the photometric loss
computed there is invalid and can mislead the optical flow training process. In video
sequences, we found that occlusions in the forward ($t\rightarrow t+1$) and backward
($t\rightarrow t-1$) frame pairs are usually complementary: pixels occluded in the
subsequent frame are often visible in the previous frame, and vice versa. Exploiting
this complementarity, we propose a new weighted loss to address the occlusion problem.
In addition, we calculate gradients in multiple directions to provide richer
supervision signals. Our method achieves optical flow accuracy competitive with the
baseline and some supervised methods on the KITTI 2012 and 2015 benchmarks. The source
code has been released at https://github.com/jianfenglihg/UnOpticalFlow.git.
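
The weighted loss described in the abstract can be sketched in a few lines. The PyTorch snippet below is a minimal illustration under stated assumptions, not the authors' released implementation: it assumes occlusion is detected by a forward-backward flow consistency check (thresholds `alpha` and `beta` follow common practice in unsupervised flow, not this paper specifically), and all function names (`warp`, `occlusion_mask`, `complementary_photometric_loss`, `multi_direction_smoothness`) are hypothetical. The diagonal-gradient smoothness term is likewise just one plausible reading of "gradients in multiple directions".

```python
# Minimal sketch of a complementary-occlusion weighted photometric loss.
# Assumptions (not taken from the paper's released code): occlusion is
# estimated via forward-backward flow consistency, and the two directional
# photometric errors are re-weighted by each other's visibility masks.
import torch
import torch.nn.functional as F

def warp(img, flow):
    """Backward-warp img (B, C, H, W) with a dense flow field (B, 2, H, W), (x, y) channels."""
    b, _, h, w = img.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    base = torch.stack((xs, ys), dim=0).float().to(img.device)  # (2, H, W)
    coords = base.unsqueeze(0) + flow                           # absolute sample coords
    gx = 2.0 * coords[:, 0] / (w - 1) - 1.0                     # normalize to [-1, 1]
    gy = 2.0 * coords[:, 1] / (h - 1) - 1.0
    grid = torch.stack((gx, gy), dim=-1)                        # (B, H, W, 2)
    return F.grid_sample(img, grid, align_corners=True)

def occlusion_mask(flow_ab, flow_ba, alpha=0.01, beta=0.5):
    """1 where flow a->b agrees with the warped reverse flow, i.e. non-occluded."""
    flow_ba_warped = warp(flow_ba, flow_ab)
    sq_diff = (flow_ab + flow_ba_warped).pow(2).sum(1, keepdim=True)
    sq_mag = flow_ab.pow(2).sum(1, keepdim=True) + flow_ba_warped.pow(2).sum(1, keepdim=True)
    return (sq_diff < alpha * sq_mag + beta).float()

def complementary_photometric_loss(im_t, im_prev, im_next,
                                   flow_fw, flow_fw_rev, flow_bw, flow_bw_rev):
    """Weight the t->t+1 and t->t-1 photometric errors by each other's visibility.

    flow_fw:  flow t -> t+1    flow_fw_rev: flow t+1 -> t
    flow_bw:  flow t -> t-1    flow_bw_rev: flow t-1 -> t
    """
    err_fw = (im_t - warp(im_next, flow_fw)).abs().mean(1, keepdim=True)
    err_bw = (im_t - warp(im_prev, flow_bw)).abs().mean(1, keepdim=True)
    vis_fw = occlusion_mask(flow_fw, flow_fw_rev)  # pixel still visible at t+1
    vis_bw = occlusion_mask(flow_bw, flow_bw_rev)  # pixel still visible at t-1
    # Complementarity: a pixel occluded in one direction is usually visible in
    # the other, so shift its weight there instead of discarding the pixel.
    total = vis_fw + vis_bw + 1e-6
    return ((vis_fw / total) * err_fw + (vis_bw / total) * err_bw).mean()

def multi_direction_smoothness(flow):
    """First-order smoothness over x, y and both diagonal directions."""
    dx = flow[:, :, :, 1:] - flow[:, :, :, :-1]
    dy = flow[:, :, 1:, :] - flow[:, :, :-1, :]
    d1 = flow[:, :, 1:, 1:] - flow[:, :, :-1, :-1]  # main diagonal
    d2 = flow[:, :, 1:, :-1] - flow[:, :, :-1, 1:]  # anti-diagonal
    return sum(d.abs().mean() for d in (dx, dy, d1, d2))
```

In a training loop one would sum `complementary_photometric_loss` with a weighted `multi_direction_smoothness` penalty over the predicted flows; the exact masking and weighting scheme used by the paper is best taken from the released repository.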
Related papers
- OCAI: Improving Optical Flow Estimation by Occlusion and Consistency Aware Interpolation [55.676358801492114]
We propose OCAI, a method that supports robust frame interpolation by generating intermediate video frames alongside optical flows in between.
Our evaluations demonstrate superior quality and enhanced optical flow accuracy on established benchmarks such as Sintel and KITTI.
arXiv Detail & Related papers (2024-03-26T20:23:48Z) - Dense Optical Tracking: Connecting the Dots [82.79642869586587]
DOT is a novel, simple and efficient method for solving the problem of point tracking in a video.
We show that DOT is significantly more accurate than current optical flow techniques, outperforms sophisticated "universal trackers" like OmniMotion, and is on par with, or better than, the best point tracking algorithms like CoTracker.
arXiv Detail & Related papers (2023-12-01T18:59:59Z) - Particle Videos Revisited: Tracking Through Occlusions Using Point Trajectories [29.258861811749103]
We revisit Sand and Teller's "particle video" approach, and study pixel tracking as a long-range motion estimation problem.
We re-build this classic approach using components that drive the current state-of-the-art in flow and object tracking.
We train our models using long-range amodal point trajectories mined from existing optical flow datasets.
arXiv Detail & Related papers (2022-04-08T16:05:48Z) - Video Frame Interpolation without Temporal Priors [91.04877640089053]
Video frame interpolation aims to synthesize non-existent intermediate frames in a video sequence.
The temporal priors of videos, i.e. frames per second (FPS) and frame exposure time, may vary across different camera sensors.
We devise a novel optical flow refinement strategy for better synthesis results.
arXiv Detail & Related papers (2021-12-02T12:13:56Z) - Dense Optical Flow from Event Cameras [55.79329250951028]
We propose to incorporate feature correlation and sequential processing into dense optical flow estimation from event cameras.
Our proposed approach computes dense optical flow and reduces the end-point error by 23% on MVSEC.
arXiv Detail & Related papers (2021-08-24T07:39:08Z) - NccFlow: Unsupervised Learning of Optical Flow With Non-occlusion from Geometry [11.394559627312743]
This paper reveals novel geometric laws of optical flow based on the insight and detailed definition of non-occlusion.
Two loss functions are proposed for the unsupervised learning of optical flow based on the geometric laws of non-occlusion.
arXiv Detail & Related papers (2021-07-08T05:19:54Z) - Learning to Estimate Hidden Motions with Global Motion Aggregation [71.12650817490318]
Occlusions pose a significant challenge to optical flow algorithms that rely on local evidence.
We introduce a global motion aggregation module to find long-range dependencies between pixels in the first image.
We demonstrate that the optical flow estimates in the occluded regions can be significantly improved without damaging the performance in non-occluded regions.
arXiv Detail & Related papers (2021-04-06T10:32:03Z) - The Invertible U-Net for Optical-Flow-free Video Interframe Generation [31.100044730381047]
In this paper, we tackle the video interframe generation problem without using problematic optical flow.
We propose a learning method with a new consistency loss in the latent space to maintain semantic temporal consistency between frames.
The resolution of the generated image is guaranteed to be identical to that of the original images by using an invertible network.
arXiv Detail & Related papers (2021-03-17T11:37:10Z) - Flow-edge Guided Video Completion [66.49077223104533]
Previous flow completion methods are often unable to retain the sharpness of motion boundaries.
Our method first extracts and completes motion edges, and then uses them to guide piecewise-smooth flow completion with sharp edges.
arXiv Detail & Related papers (2020-09-03T17:59:42Z) - OccInpFlow: Occlusion-Inpainting Optical Flow Estimation by Unsupervised Learning [29.802404790103665]
Occlusion is an inevitable and critical problem in unsupervised optical flow learning.
We present OccInpFlow, an occlusion-inpainting framework to make full use of occlusion regions.
We conduct experiments on leading flow benchmark datasets such as Flying Chairs, KITTI and MPI-Sintel.
arXiv Detail & Related papers (2020-06-30T10:01:32Z)