EDCFlow: Exploring Temporally Dense Difference Maps for Event-based Optical Flow Estimation
- URL: http://arxiv.org/abs/2506.03512v1
- Date: Wed, 04 Jun 2025 02:55:04 GMT
- Title: EDCFlow: Exploring Temporally Dense Difference Maps for Event-based Optical Flow Estimation
- Authors: Daikun Liu, Lei Cheng, Teng Wang, Changyin Sun
- Abstract summary: We present a lightweight event-based optical flow network (EDCFlow) to achieve high-quality flow estimation at a higher resolution. EDCFlow achieves better performance with lower complexity compared to existing methods, offering superior generalization.
- Score: 17.388776062997813
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent learning-based methods for event-based optical flow estimation utilize cost volumes for pixel matching but suffer from redundant computations and limited scalability to higher resolutions for flow refinement. In this work, we take advantage of the complementarity between temporally dense feature differences of adjacent event frames and cost volume and present a lightweight event-based optical flow network (EDCFlow) to achieve high-quality flow estimation at a higher resolution. Specifically, an attention-based multi-scale temporal feature difference layer is developed to capture diverse motion patterns at high resolution in a computation-efficient manner. An adaptive fusion of high-resolution difference motion features and low-resolution correlation motion features is performed to enhance motion representation and model generalization. Notably, EDCFlow can serve as a plug-and-play refinement module for RAFT-like event-based methods to enhance flow details. Extensive experiments demonstrate that EDCFlow achieves better performance with lower complexity compared to existing methods, offering superior generalization.
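The adaptive fusion described in the abstract — combining high-resolution difference motion features with low-resolution correlation motion features — can be illustrated with a minimal sketch. This is not the paper's implementation: `adaptive_fuse` and its per-pixel softmax gating are hypothetical stand-ins for the learned attention-based fusion, assuming both feature maps have already been brought to a common resolution and channel count.

```python
import numpy as np

def softmax(x, axis=0):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_fuse(diff_feat, corr_feat):
    """Fuse two motion cues with per-pixel convex weights.

    diff_feat: temporal-difference motion features, shape (C, H, W)
    corr_feat: correlation motion features upsampled to (C, H, W)
    (A crude stand-in for the learned fusion in the paper.)
    """
    # One gating score per cue per pixel, derived from channel means.
    scores = np.stack([diff_feat.mean(axis=0), corr_feat.mean(axis=0)])  # (2, H, W)
    w = softmax(scores, axis=0)  # convex weights summing to 1 per pixel
    return w[0] * diff_feat + w[1] * corr_feat
```

In the actual network the gating would be produced by learned attention layers rather than raw channel means; the sketch only shows the shape of the operation: two motion representations reduced to one by data-dependent convex weighting.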
Related papers
- HiFlow: Training-free High-Resolution Image Generation with Flow-Aligned Guidance [70.69373563281324]
HiFlow is a training-free and model-agnostic framework to unlock the resolution potential of pre-trained flow models. HiFlow substantially elevates the quality of high-resolution image synthesis of T2I models.
arXiv Detail & Related papers (2025-04-08T17:30:40Z) - FlowIE: Efficient Image Enhancement via Rectified Flow [71.6345505427213]
FlowIE is a flow-based framework that estimates straight-line paths from an elementary distribution to high-quality images.
Our contributions are rigorously validated through comprehensive experiments on synthetic and real-world datasets.
arXiv Detail & Related papers (2024-06-01T17:29:29Z) - StreamFlow: Streamlined Multi-Frame Optical Flow Estimation for Video Sequences [31.210626775505407]
Occlusions between consecutive frames have long posed a significant challenge in optical flow estimation.
We present a Streamlined In-batch Multi-frame (SIM) pipeline tailored to video input, attaining a similar level of time efficiency to two-frame networks.
StreamFlow excels on the challenging KITTI and Sintel datasets, with particular improvement in occluded areas.
arXiv Detail & Related papers (2023-11-28T07:53:51Z) - AccFlow: Backward Accumulation for Long-Range Optical Flow [70.4251045372285]
This paper proposes a novel recurrent framework called AccFlow for long-range optical flow estimation.
We demonstrate the superiority of backward accumulation over conventional forward accumulation.
Experiments validate the effectiveness of AccFlow in handling long-range optical flow estimation.
arXiv Detail & Related papers (2023-08-25T01:51:26Z) - AnyFlow: Arbitrary Scale Optical Flow with Implicit Neural Representation [17.501820140334328]
We introduce AnyFlow, a robust network that estimates accurate flow from images of various resolutions.
We establish a new state-of-the-art performance of cross-dataset generalization on the KITTI dataset.
arXiv Detail & Related papers (2023-03-29T07:03:51Z) - GMFlow: Learning Optical Flow via Global Matching [124.57850500778277]
We propose a GMFlow framework for learning optical flow estimation.
It consists of three main components: a customized Transformer for feature enhancement, a correlation and softmax layer for global feature matching, and a self-attention layer for flow propagation.
Our new framework outperforms 32-iteration RAFT's performance on the challenging Sintel benchmark.
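The correlation-and-softmax global matching that GMFlow's summary describes can be sketched in a few lines: correlate every source pixel against all target pixels, softmax the similarities into a matching distribution, and read off flow as the expected match position minus the source position. This is a simplified single-scale sketch, not GMFlow's Transformer-based implementation; the function name and shapes are assumptions.

```python
import numpy as np

def global_match_flow(feat1, feat2):
    """Dense flow via global matching between two feature maps (C, H, W)."""
    C, H, W = feat1.shape
    f1 = feat1.reshape(C, -1).T                 # (HW, C) source descriptors
    f2 = feat2.reshape(C, -1).T                 # (HW, C) target descriptors
    corr = f1 @ f2.T / np.sqrt(C)               # (HW, HW) scaled similarities
    # Softmax over all target positions -> matching distribution per pixel.
    e = np.exp(corr - corr.max(axis=1, keepdims=True))
    prob = e / e.sum(axis=1, keepdims=True)
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)  # (HW, 2)
    matched = prob @ coords                     # expected match coordinates
    return (matched - coords).reshape(H, W, 2)  # flow = match - source
```

The softmax turns hard argmax matching into a differentiable expectation, which is what makes this formulation trainable end to end; GMFlow additionally enhances the features with a Transformer before matching and propagates flow with self-attention.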
arXiv Detail & Related papers (2021-11-26T18:59:56Z) - Dense Optical Flow from Event Cameras [55.79329250951028]
We propose to incorporate feature correlation and sequential processing into dense optical flow estimation from event cameras.
Our proposed approach computes dense optical flow and reduces the end-point error by 23% on MVSEC.
arXiv Detail & Related papers (2021-08-24T07:39:08Z) - Unsupervised Motion Representation Enhanced Network for Action Recognition [4.42249337449125]
Motion representation between consecutive frames has proven to greatly benefit video understanding.
The TV-L1 method, an effective optical flow solver, is time-consuming, and caching the extracted optical flow is expensive in storage.
We propose UF-TSN, a novel end-to-end action recognition approach enhanced with an embedded lightweight unsupervised optical flow estimator.
arXiv Detail & Related papers (2021-03-05T04:14:32Z) - STaRFlow: A SpatioTemporal Recurrent Cell for Lightweight Multi-Frame Optical Flow Estimation [64.99259320624148]
We present a new lightweight CNN-based algorithm for multi-frame optical flow estimation.
The resulting STaRFlow algorithm achieves state-of-the-art performance on MPI Sintel and KITTI 2015.
arXiv Detail & Related papers (2020-07-10T17:01:34Z) - Normalizing Flows with Multi-Scale Autoregressive Priors [131.895570212956]
We introduce channel-wise dependencies in the latent space of normalizing flows through multi-scale autoregressive priors (mAR).
Our mAR prior for models with split coupling flow layers (mAR-SCF) can better capture dependencies in complex multimodal data.
We show that mAR-SCF allows for improved image generation quality, with gains in FID and Inception scores compared to state-of-the-art flow-based models.
arXiv Detail & Related papers (2020-04-08T09:07:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.