Continuous Event-Line Constraint for Closed-Form Velocity Initialization
- URL: http://arxiv.org/abs/2109.04313v2
- Date: Fri, 10 Sep 2021 13:06:07 GMT
- Title: Continuous Event-Line Constraint for Closed-Form Velocity Initialization
- Authors: Xin Peng, Wanting Xu, Jiaqi Yang, Laurent Kneip
- Abstract summary: Event cameras trigger events asynchronously and independently upon a sufficient change of the logarithmic brightness level.
We propose the continuous event-line constraint, which relies on a constant-velocity motion assumption and trifocal tensor geometry to express a relationship between line observations given by event clusters and first-order camera dynamics.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras trigger events asynchronously and independently upon a sufficient change of the logarithmic brightness level. The neuromorphic sensor has several advantages over standard cameras, including low latency, absence of motion blur, and high dynamic range, and is particularly well suited to sensing motion dynamics in agile scenarios. We propose the continuous event-line constraint, which relies on a constant-velocity motion assumption as well as trifocal tensor geometry in order to express a relationship between line observations given by event clusters and first-order camera dynamics. Our core result is a closed-form solver for up-to-scale linear camera velocity with known angular velocity. Nonlinear optimization is adopted to improve the performance of the algorithm. The feasibility of the approach is demonstrated through a careful analysis on both simulated and real data.
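A closed-form, up-to-scale solve of this kind typically reduces to a homogeneous linear system: with the angular velocity known, each line observation contributes a constraint that is linear in the unknown linear velocity, and the velocity direction is recovered as the null vector of the stacked coefficient matrix. Below is a minimal NumPy sketch of that recovery step only; the constraint rows are synthetic stand-ins fabricated from a ground-truth velocity, since the paper's actual trifocal-tensor-derived coefficients are not given in the abstract.

```python
import numpy as np

def closed_form_velocity(A):
    """Solve the homogeneous system A v = 0 for the up-to-scale
    linear velocity: v is the right singular vector associated
    with the smallest singular value of A."""
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]
    return v / np.linalg.norm(v)  # scale is unobservable; normalize

# Synthetic stand-in for the event-line constraint matrix: each row
# models one line observation and is constructed to vanish at the
# true velocity (rows orthogonal to v_true), with mild noise added.
rng = np.random.default_rng(0)
v_true = np.array([0.6, -0.3, 0.74])
v_true /= np.linalg.norm(v_true)
A = np.vstack([np.cross(rng.standard_normal(3), v_true)
               for _ in range(20)])
A += 1e-3 * rng.standard_normal(A.shape)

v_est = closed_form_velocity(A)
# "Up to scale" includes sign ambiguity, so compare both directions.
err = min(np.linalg.norm(v_est - v_true), np.linalg.norm(v_est + v_true))
print(f"direction error: {err:.4f}")
```

The nonlinear refinement the abstract mentions would then minimize the same residuals over the unit sphere, starting from this closed-form initialization, e.g. with a Gauss-Newton or Levenberg-Marquardt solver.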
Related papers
- EVIT: Event-based Visual-Inertial Tracking in Semi-Dense Maps Using Windowed Nonlinear Optimization [19.915476815328294]
Event cameras are interesting visual exteroceptive sensors that react to brightness changes rather than integrating absolute image intensities.
This paper proposes the addition of inertial signals in order to robustify the estimation.
Our evaluation focuses on a diverse set of real world sequences and comprises a comparison of our proposed method against a purely event-based alternative running at different rates.
arXiv Detail & Related papers (2024-08-02T16:24:55Z)
- Tight Fusion of Events and Inertial Measurements for Direct Velocity Estimation [20.002238735553792]
We propose a novel solution to tight visual-inertial fusion directly at the level of first-order kinematics by employing a dynamic vision sensor instead of a normal camera.
We demonstrate how velocity estimates in highly dynamic situations can be obtained over short time intervals.
Experiments on both simulated and real data demonstrate that the proposed tight event-inertial fusion leads to continuous and reliable velocity estimation.
arXiv Detail & Related papers (2024-01-17T15:56:57Z)
- A 5-Point Minimal Solver for Event Camera Relative Motion Estimation [47.45081895021988]
We introduce a novel minimal 5-point solver that estimates line parameters and linear camera velocity projections, which can be fused into a single, averaged linear velocity when considering multiple lines.
Our method consistently achieves a 100% success rate in estimating linear velocity where existing closed-form solvers only achieve between 23% and 70%.
arXiv Detail & Related papers (2023-09-29T08:30:18Z)
- Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z)
- Minimum Latency Deep Online Video Stabilization [77.68990069996939]
We present a novel camera path optimization framework for the task of online video stabilization.
In this work, we adopt recent off-the-shelf high-quality deep motion models for motion estimation to recover the camera trajectory.
Our approach significantly outperforms state-of-the-art online methods both qualitatively and quantitatively.
arXiv Detail & Related papers (2022-12-05T07:37:32Z)
- ParticleSfM: Exploiting Dense Point Trajectories for Localizing Moving Cameras in the Wild [57.37891682117178]
We present a robust dense indirect structure-from-motion method for videos that is based on dense correspondence from pairwise optical flow.
A novel neural network architecture is proposed for processing irregular point trajectory data.
Experiments on the MPI Sintel dataset show that our system produces significantly more accurate camera trajectories.
arXiv Detail & Related papers (2022-07-19T09:19:45Z)
- Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.