Inertia-Informed Orientation Priors for Event-Based Optical Flow Estimation
- URL: http://arxiv.org/abs/2511.12961v1
- Date: Mon, 17 Nov 2025 04:39:18 GMT
- Title: Inertia-Informed Orientation Priors for Event-Based Optical Flow Estimation
- Authors: Pritam P. Karmokar, William J. Beksi
- Abstract summary: Event cameras directly encode motion within a scene. Many learning-based and model-based methods exist that estimate event-based optical flow. We introduce a novel biologically-inspired hybrid CM method that couples visual and inertial motion cues.
- Score: 7.36599004748324
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event cameras, by virtue of their working principle, directly encode motion within a scene. Many learning-based and model-based methods exist that estimate event-based optical flow; however, the temporally dense yet spatially sparse nature of events poses significant challenges. To address these issues, contrast maximization (CM) is a prominent model-based optimization methodology that estimates the motion trajectories of events within an event volume by optimally warping them. Since its introduction, the CM framework has undergone a series of refinements by the computer vision community. Nonetheless, it remains a highly non-convex optimization problem. In this paper, we introduce a novel biologically-inspired hybrid CM method for event-based optical flow estimation that couples visual and inertial motion cues. Concretely, we propose the use of orientation maps, derived from camera 3D velocities, as priors to guide the CM process. The orientation maps provide directional guidance and constrain the space of estimated motion trajectories. We show that this orientation-guided formulation leads to improved robustness and convergence in event-based optical flow estimation. The evaluation of our approach on the MVSEC, DSEC, and ECD datasets yields superior accuracy scores over the state of the art.
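To make the CM idea concrete, below is a minimal sketch of orientation-guided contrast maximization for a single patch under a constant-flow model. This is not the paper's implementation: the function names, the variance-based contrast objective, the cosine direction penalty, and the weight `lam` are illustrative assumptions, and `theta_prior` stands in for one value of an orientation map derived from the camera's 3D velocity.

```python
# Hypothetical sketch of orientation-guided contrast maximization (CM).
# Assumes events is an (N, 3) array of (x, y, t) rows for one patch and
# that a prior flow direction theta_prior has been derived from inertia.
import numpy as np

def iwe_variance(events, flow, shape):
    """Warp events along a candidate constant flow and score the sharpness
    (variance) of the resulting image of warped events (IWE)."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    t_ref = t.min()
    xw = np.clip(np.round(x - flow[0] * (t - t_ref)), 0, shape[1] - 1).astype(int)
    yw = np.clip(np.round(y - flow[1] * (t - t_ref)), 0, shape[0] - 1).astype(int)
    iwe = np.zeros(shape)
    np.add.at(iwe, (yw, xw), 1.0)  # accumulate warped event counts per pixel
    return iwe.var()

def orientation_penalty(flow, theta_prior):
    """Penalty that is 0 when the flow direction matches the prior and
    grows to 2 when it points the opposite way."""
    return 1.0 - np.cos(np.arctan2(flow[1], flow[0]) - theta_prior)

def objective(flow, events, shape, theta_prior, lam=0.5):
    # Minimize negative contrast plus the (assumed) directional prior term.
    return -iwe_variance(events, flow, shape) + lam * orientation_penalty(flow, theta_prior)
```

A generic optimizer such as `scipy.optimize.minimize(objective, x0, args=(events, shape, theta_prior))` would then search a direction-constrained flow space; the prior term is what shrinks the set of plausible motion trajectories.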
Related papers
- Event-based Visual Deformation Measurement [76.25283405575108]
Visual Deformation Measurement aims to recover dense deformation fields by tracking surface motion from camera observations. Traditional image-based methods rely on minimal inter-frame motion to constrain the correspondence search space. We propose an event-frame fusion framework that exploits events for temporally dense motion cues and frames for spatially dense, precise estimation.
arXiv Detail & Related papers (2026-02-16T01:04:48Z) - E-MoFlow: Learning Egomotion and Optical Flow from Event Data via Implicit Regularization [38.46024197872764]
The estimation of optical flow and 6-DoF ego-motion has typically been addressed independently. For neuromorphic vision, the lack of robust data association makes solving the two problems separately an ill-posed challenge. We propose an unsupervised framework that jointly optimizes egomotion and optical flow via implicit spatio-temporal and geometric regularization.
arXiv Detail & Related papers (2025-10-14T17:33:44Z) - Motion Segmentation and Egomotion Estimation from Event-Based Normal Flow [8.869407907066005]
This paper introduces a robust framework for motion segmentation and egomotion estimation using event-based normal flow. Our approach exploits the sparse, high-temporal-resolution event data and incorporates geometric constraints between normal flow, scene structure, and inertial measurements.
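For context, the geometric backbone that normal-flow methods typically build on is the classic instantaneous motion-field model; the exact constraints in this paper may differ, so the following is only a reference sketch (calibrated camera with focal length f, depth Z, linear velocity v, angular velocity w):

```latex
% Motion field at pixel x = (x, y); normal flow observes only the
% component of u along the local gradient direction n(x).
\begin{align}
  \mathbf{u}(\mathbf{x}) &= \frac{1}{Z(\mathbf{x})}\,A(\mathbf{x})\,\mathbf{v}
                           + B(\mathbf{x})\,\boldsymbol{\omega}, \\
  A(\mathbf{x}) &= \begin{pmatrix} -f & 0 & x \\ 0 & -f & y \end{pmatrix}, \qquad
  B(\mathbf{x}) = \begin{pmatrix} \frac{xy}{f} & -\left(f + \frac{x^2}{f}\right) & y \\
                                   f + \frac{y^2}{f} & -\frac{xy}{f} & -x \end{pmatrix}, \\
  u_n(\mathbf{x}) &= \hat{\mathbf{n}}(\mathbf{x})^{\top}\,\mathbf{u}(\mathbf{x}).
\end{align}
```

One standard simplification, which the summary's mention of inertial measurements suggests, is to derotate with a gyroscope estimate of the angular velocity, leaving a constraint that involves only translation and depth.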
arXiv Detail & Related papers (2025-07-19T06:11:09Z) - EMoTive: Event-guided Trajectory Modeling for 3D Motion Estimation [59.33052312107478]
Event cameras offer possibilities for 3D motion estimation through continuous adaptive pixel-level responses to scene changes. This paper presents EMoTive, a novel event-based framework that models non-uniform trajectories via event-guided parametric curves. For motion representation, we introduce a density-aware adaptation mechanism to fuse spatial and temporal features under event guidance. The final 3D motion estimation is achieved through multi-temporal sampling of parametric trajectories, flows, and depth motion fields.
arXiv Detail & Related papers (2025-03-14T13:15:54Z) - Secrets of Edge-Informed Contrast Maximization for Event-Based Vision [6.735928398631445]
Event cameras capture the motion of intensity gradients (edges) in the image plane in the form of rapid asynchronous events.
Contrast maximization (CM) is an optimization framework that can reverse this effect and produce sharp spatial structures.
We propose a novel hybrid approach that extends CM from uni-modal (events only) to bi-modal (events and edges).
arXiv Detail & Related papers (2024-09-22T22:22:26Z) - Motion-prior Contrast Maximization for Dense Continuous-Time Motion Estimation [34.529280562470746]
We introduce a novel self-supervised loss combining the Contrast Maximization framework with a non-linear motion prior in the form of pixel-level trajectories.
Its effectiveness is demonstrated in two scenarios: in dense continuous-time motion estimation, our method improves the zero-shot performance of a synthetically trained model by 29%.
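A common way to realize such a pixel-level trajectory prior is a low-order parametric curve per pixel; whether this paper uses Bézier curves, splines, or polynomials is not stated in the summary above, so the Bézier basis below is purely illustrative:

```latex
% Illustrative non-linear motion prior: each pixel follows a degree-n
% Bezier curve with control points P_i over the time window [t_0, t_1];
% events are warped along x(t) before maximizing contrast.
\begin{equation}
  \mathbf{x}(t) = \sum_{i=0}^{n} \binom{n}{i}\, s^{i} (1-s)^{\,n-i}\,\mathbf{P}_i,
  \qquad s = \frac{t - t_0}{t_1 - t_0}.
\end{equation}
```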
arXiv Detail & Related papers (2024-07-15T15:18:28Z) - Globally Optimal Event-Based Divergence Estimation for Ventral Landing [55.29096494880328]
Event sensing is a major component in bio-inspired flight guidance and control systems.
We explore the usage of event cameras for predicting time-to-contact with the surface during ventral landing.
This is achieved by estimating divergence (inverse TTC), which is the rate of radial optic flow, from the event stream generated during landing.
Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value.
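The geometry behind this formulation is compact enough to state directly. Assuming a constant divergence D over the event window and a known focus of expansion x_0 (both assumptions made here for illustration), the radial flow and the induced event warp are:

```latex
% Divergence D (inverse time-to-contact) induces a radial flow about
% the focus of expansion x_0; integrating that flow gives the warp
% that maps an event at time t back to the reference time t_ref.
\begin{align}
  \mathbf{u}(\mathbf{x}) &= D\,(\mathbf{x} - \mathbf{x}_0), \qquad D = \frac{1}{\mathrm{TTC}}, \\
  \mathbf{x}_{\mathrm{ref}} &= \mathbf{x}_0 + (\mathbf{x} - \mathbf{x}_0)\, e^{-D\,(t - t_{\mathrm{ref}})}.
\end{align}
```

Because the warp depends on a single scalar D, the search space is one-dimensional, which helps explain why exact branch-and-bound maximization is tractable here.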
arXiv Detail & Related papers (2022-09-27T06:00:52Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - PCA Event-Based Optical Flow for Visual Odometry [0.0]
We present a Principal Component Analysis approach to the problem of event-based optical flow estimation.
We show that the best variant of our proposed method, dedicated to the real-time context of visual odometry, is about two times faster compared to state-of-the-art implementations.
arXiv Detail & Related papers (2021-05-08T18:30:44Z) - Optical Flow Estimation from a Single Motion-blurred Image [66.2061278123057]
Motion blur in an image is of practical interest for fundamental computer vision problems.
We propose a novel framework to estimate optical flow from a single motion-blurred image in an end-to-end manner.
arXiv Detail & Related papers (2021-03-04T12:45:18Z) - Joint Unsupervised Learning of Optical Flow and Egomotion with Bi-Level Optimization [59.9673626329892]
We exploit the global relationship between optical flow and camera motion using epipolar geometry.
We use implicit differentiation to enable back-propagation through the lower-level geometric optimization layer independent of its implementation.
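The global relationship in question is the standard epipolar constraint; as a hedged sketch in normalized camera coordinates (notation chosen here for illustration, not taken from the paper):

```latex
% Epipolar constraint tying flow u(x) to camera motion (R, t) through
% the essential matrix E; the lower-level layer fits (R, t) to the flow,
% and implicit differentiation of its optimality conditions supplies
% gradients to the upper-level flow network.
\begin{equation}
  \tilde{\mathbf{x}}'^{\top} E\,\tilde{\mathbf{x}} = 0, \qquad
  \tilde{\mathbf{x}}' = \begin{pmatrix} \mathbf{x} + \mathbf{u}(\mathbf{x}) \\ 1 \end{pmatrix},\;
  \tilde{\mathbf{x}} = \begin{pmatrix} \mathbf{x} \\ 1 \end{pmatrix},\;
  E = [\mathbf{t}]_{\times} R.
\end{equation}
```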
arXiv Detail & Related papers (2020-02-26T22:28:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.