Hybrid Cost Volume for Memory-Efficient Optical Flow
- URL: http://arxiv.org/abs/2409.04243v1
- Date: Fri, 6 Sep 2024 12:49:34 GMT
- Title: Hybrid Cost Volume for Memory-Efficient Optical Flow
- Authors: Yang Zhao, Gangwei Xu, Gang Wu,
- Abstract summary: Current state-of-the-art flow methods are mostly based on dense all-pairs cost volumes.
We propose a novel Hybrid Cost Volume for memory-efficient optical flow, named HCV.
Based on HCV, we design a memory-efficient optical flow network, named HCVFlow.
- Score: 10.760762249786344
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current state-of-the-art flow methods are mostly based on dense all-pairs cost volumes. However, as image resolution increases, the computational and spatial complexity of constructing these cost volumes grows at a quartic rate, making these methods impractical for high-resolution images. In this paper, we propose a novel Hybrid Cost Volume for memory-efficient optical flow, named HCV. To construct HCV, we first propose a Top-k strategy to separate the 4D cost volume into two global 3D cost volumes. These volumes significantly reduce memory usage while retaining a substantial amount of matching information. We further introduce a local 4D cost volume with a local search space to supplement the local information for HCV. Based on HCV, we design a memory-efficient optical flow network, named HCVFlow. Compared to the recurrent flow methods based on all-pairs cost volumes, our HCVFlow significantly reduces memory consumption while ensuring high accuracy. We validate the effectiveness and efficiency of our method on the Sintel and KITTI datasets and real-world 4K (2160*3840) resolution images. Extensive experiments show that our HCVFlow has very low memory usage and outperforms other memory-efficient methods in terms of accuracy. The code is publicly available at https://github.com/gangweiX/HCVFlow.
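The Top-k idea from the abstract can be illustrated with a small NumPy sketch: keep only the k strongest responses along each target axis of the all-pairs volume, yielding two compact volumes in place of the full 4D one. Note this is an illustration, not HCV's actual construction (the paper avoids materializing the 4D volume at all, which this sketch does for clarity); function and variable names are hypothetical.

```python
import numpy as np

def topk_3d_volumes(f1, f2, k=4):
    """Sketch: reduce a 4D all-pairs cost volume to two compact volumes
    by keeping only the top-k responses along each target axis."""
    H, W, C = f1.shape
    # Full 4D all-pairs cost volume (H, W, H, W): its size grows
    # quartically with resolution, which is exactly what HCV avoids.
    corr = np.einsum('ijc,klc->ijkl', f1, f2)
    # Keep the k strongest responses over target rows / target columns.
    row_vol = np.sort(corr, axis=2)[:, :, -k:, :]  # (H, W, k, W)
    col_vol = np.sort(corr, axis=3)[:, :, :, -k:]  # (H, W, H, k)
    return row_vol, col_vol
```

For small k the two volumes together are much smaller than the 4D volume, while the strongest matching responses, which carry most of the useful signal, survive the reduction.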
Related papers
- Memory-Efficient Optical Flow via Radius-Distribution Orthogonal Cost Volume [6.122542233250026]
We present MeFlow, a novel memory-efficient method for high-resolution optical flow estimation.
Our method achieves competitive performance on both Sintel and KITTI benchmarks, while maintaining the highest memory efficiency on high-resolution inputs.
arXiv Detail & Related papers (2023-12-06T12:43:11Z)
- DIFT: Dynamic Iterative Field Transforms for Memory Efficient Optical Flow [44.57023882737517]
We introduce a lightweight, low-latency, and memory-efficient model for optical flow estimation.
DIFT is feasible for edge applications such as mobile, XR, micro UAVs, robotics and cameras.
We demonstrate the first real-time cost-volume-based optical flow DL architecture on the Snapdragon 8 Gen 1 HTP, an efficient mobile AI accelerator.
arXiv Detail & Related papers (2023-06-09T06:10:59Z)
- FlowFormer: A Transformer Architecture and Its Masked Cost Volume Autoencoding for Optical Flow [49.40637769535569]
This paper introduces a novel transformer-based network architecture, FlowFormer, along with Masked Cost Volume Autoencoding (MCVA) for pretraining it to tackle the problem of optical flow estimation.
FlowFormer tokenizes the 4D cost-volume built from the source-target image pair and iteratively refines flow estimation with a cost-volume encoder-decoder architecture.
On the Sintel benchmark, the FlowFormer architecture achieves 1.16 and 2.09 average end-point error (AEPE) on the clean and final passes, a 16.5% and 15.5% error reduction from the best published result.
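The cost-volume tokenization described above can be sketched in NumPy: each source pixel owns a 2D cost map over the target image, and that map is split into patch tokens for a transformer to attend over. The patch size and layout here are illustrative assumptions, not FlowFormer's actual configuration.

```python
import numpy as np

def tokenize_cost_volume(corr, patch=4):
    # corr: (H, W, H2, W2) all-pairs cost volume; each source pixel
    # owns an (H2, W2) cost map over the target image.
    H, W, H2, W2 = corr.shape
    ph, pw = H2 // patch, W2 // patch
    maps = corr.reshape(H * W, ph, patch, pw, patch)
    # One token per patch of the cost map, flattened to a vector that a
    # transformer encoder (not shown) would project and attend over.
    tokens = maps.transpose(0, 1, 3, 2, 4).reshape(H * W, ph * pw, patch * patch)
    return tokens  # (num_source_pixels, num_tokens, token_dim)
```

Working on tokens rather than the raw 4D volume is what lets an encoder-decoder iteratively refine the flow estimate over a compact latent representation.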
arXiv Detail & Related papers (2023-06-08T12:24:04Z)
- Dense Optical Flow from Event Cameras [55.79329250951028]
We propose to incorporate feature correlation and sequential processing into dense optical flow estimation from event cameras.
Our proposed approach computes dense optical flow and reduces the end-point error by 23% on MVSEC.
arXiv Detail & Related papers (2021-08-24T07:39:08Z)
- Correlate-and-Excite: Real-Time Stereo Matching via Guided Cost Volume Excitation [65.83008812026635]
We construct Guided Cost Volume Excitation (GCE) and show that simple channel excitation of the cost volume, guided by the image, can improve performance considerably.
We present an end-to-end network that we call Correlate-and-Excite (CoEx).
arXiv Detail & Related papers (2021-08-12T14:32:26Z)
- Learning Optical Flow from a Few Matches [67.83633948984954]
We show that the dense correlation volume representation is redundant and accurate flow estimation can be achieved with only a fraction of elements in it.
Experiments show that our method can reduce computational cost and memory use significantly, while maintaining high accuracy.
arXiv Detail & Related papers (2021-04-05T21:44:00Z)
- DCVNet: Dilated Cost Volume Networks for Fast Optical Flow [5.526631378837701]
The cost volume, capturing the similarity of possible correspondences across two input images, is a key ingredient in state-of-the-art optical flow approaches.
We propose an alternative by constructing cost volumes with different dilation factors to capture small and large displacements simultaneously.
A U-Net with skip connections is employed to convert the dilated cost volumes into weights between all possible captured displacements to get the optical flow.
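The dilated construction can be sketched as follows: the same small search window is reused at several dilation factors, so large displacements are covered without enlarging the window. This is a minimal NumPy illustration of the general technique, not DCVNet's implementation; the `np.roll` wrap-around stands in for proper border handling.

```python
import numpy as np

def dilated_cost_volume(f1, f2, radius=2, dilation=1):
    # f1, f2: (H, W, C) feature maps. Correlate each pixel of f1 with
    # f2 at offsets {-r..r} x {-r..r}, scaled by the dilation factor.
    vols = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = np.roll(f2, (dy * dilation, dx * dilation), axis=(0, 1))
            vols.append((f1 * shifted).sum(axis=-1))  # dot-product cost
    return np.stack(vols, axis=-1)  # (H, W, (2*radius+1)**2)

def multi_dilation_volume(f1, f2, radius=2, dilations=(1, 2, 4)):
    # Stacking volumes at several dilations covers small and large
    # displacements at the same per-volume cost.
    return np.concatenate(
        [dilated_cost_volume(f1, f2, radius, d) for d in dilations], axis=-1)
```

Each per-dilation volume has a fixed size regardless of how far its offsets reach, which is what keeps the approach fast at larger displacements.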
arXiv Detail & Related papers (2021-03-31T17:59:31Z)
- Displacement-Invariant Matching Cost Learning for Accurate Optical Flow Estimation [109.64756528516631]
Learning matching costs has been shown to be critical to the success of state-of-the-art deep stereo matching methods.
This paper proposes a novel solution that is able to bypass the requirement of building a 5D feature volume.
Our approach achieves state-of-the-art accuracy on various datasets, and outperforms all published optical flow methods on the Sintel benchmark.
arXiv Detail & Related papers (2020-10-28T09:57:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.