Globally-Optimal Contrast Maximisation for Event Cameras
- URL: http://arxiv.org/abs/2206.05127v1
- Date: Fri, 10 Jun 2022 14:06:46 GMT
- Title: Globally-Optimal Contrast Maximisation for Event Cameras
- Authors: Xin Peng, Ling Gao, Yifu Wang, Laurent Kneip
- Abstract summary: Event cameras are bio-inspired sensors that perform well in challenging illumination conditions and have high temporal resolution.
The pixels of an event camera operate independently and asynchronously.
The flow of events is modelled by a general homographic warping in a space-time volume.
- Score: 30.79931004393174
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are bio-inspired sensors that perform well in challenging
illumination conditions and have high temporal resolution. However, their
concept is fundamentally different from traditional frame-based cameras. The
pixels of an event camera operate independently and asynchronously. They
measure changes of the logarithmic brightness and return them in the highly
discretised form of time-stamped events indicating a relative change of a
certain quantity since the last event. New models and algorithms are needed to
process this kind of measurement. The present work looks at several motion
estimation problems with event cameras. The flow of the events is modelled by a
general homographic warping in a space-time volume, and the objective is
formulated as a maximisation of contrast within the image of warped events. Our
core contribution consists of deriving globally optimal solutions to these
generally non-convex problems, which removes the dependency on a good initial
guess plaguing existing methods. Our methods rely on branch-and-bound
optimisation and employ novel and efficient, recursive upper and lower bounds
derived for six different contrast estimation functions. The practical validity
of our approach is demonstrated by a successful application to three different
event camera motion estimation problems.
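The contrast-maximisation objective described in the abstract can be sketched in a few lines. This is a minimal illustration, assuming the variance of the image of warped events (IWE) as the contrast function and a simple translational warp in place of the paper's general homographic warp; the synthetic data, grid size, and function names here are illustrative, not the paper's formulation.

```python
import numpy as np

def warp_events(xy, t, theta):
    """Warp event coordinates back to the reference time t = 0 under a
    constant 2D flow theta = (vx, vy); the paper uses a more general
    homographic warp in a space-time volume."""
    return xy - t[:, None] * np.asarray(theta)

def contrast(theta, xy, t, shape=(64, 64)):
    """Accumulate warped events into an image and return its variance,
    one of several possible contrast functions."""
    w = warp_events(xy, t, theta)
    iwe = np.zeros(shape)
    ix = np.clip(np.round(w[:, 0]).astype(int), 0, shape[0] - 1)
    iy = np.clip(np.round(w[:, 1]).astype(int), 0, shape[1] - 1)
    np.add.at(iwe, (ix, iy), 1.0)  # each event votes into one pixel
    return iwe.var()

# Synthetic events: 20 edge points, each firing 25 events while moving
# at a constant flow of (3, 1) pixels per unit time.
rng = np.random.default_rng(0)
pts = np.stack([np.linspace(15.0, 45.0, 20), np.full(20, 30.0)], axis=1)
t = rng.uniform(0.0, 1.0, (20, 25))
xy = (pts[:, None, :] + t[..., None] * np.array([3.0, 1.0])).reshape(-1, 2)
t = t.reshape(-1)

# The correct motion stacks each point's events into one pixel, giving a
# sharper IWE and hence higher contrast than the zero-motion hypothesis.
print(contrast((3.0, 1.0), xy, t) > contrast((0.0, 0.0), xy, t))  # True
```

The objective is non-convex in general, which is why the paper resorts to branch-and-bound with recursive upper and lower bounds rather than local search from an initial guess.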
Related papers
- VICAN: Very Efficient Calibration Algorithm for Large Camera Networks [49.17165360280794]
We introduce a novel methodology that extends Pose Graph Optimization techniques.
We consider the bipartite graph encompassing cameras, object poses evolving dynamically, and camera-object relative transformations at each time step.
Our framework retains compatibility with traditional PGO solvers, but its efficacy benefits from a custom-tailored optimization scheme.
arXiv Detail & Related papers (2024-03-25T17:47:03Z) - Density Invariant Contrast Maximization for Neuromorphic Earth Observations [55.970609838687864]
Contrast maximization (CMax) techniques are widely used in event-based vision systems to estimate the motion parameters of the camera and generate high-contrast images.
These techniques are noise-intolerant and suffer from the multiple-extrema problem, which arises when the scene contains more noisy events than structure.
Our proposed solution overcomes the multiple extrema and noise-intolerance problems by correcting the warped event before calculating the contrast.
arXiv Detail & Related papers (2023-04-27T12:17:40Z) - Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - Visual Odometry with an Event Camera Using Continuous Ray Warping and Volumetric Contrast Maximization [31.627936023222052]
We present a new solution to tracking and mapping with an event camera.
The motion of the camera contains both rotation and translation, and the displacements happen in an arbitrarily structured environment.
We introduce a new solution to this problem by performing contrast maximization in 3D.
The practical validity of our approach is supported by an application to AGV motion estimation and 3D reconstruction with a single vehicle-mounted event camera.
arXiv Detail & Related papers (2021-07-07T04:32:57Z) - Estimating Egocentric 3D Human Pose in Global Space [70.7272154474722]
We present a new method for egocentric global 3D body pose estimation using a single head-mounted fisheye camera.
Our approach outperforms state-of-the-art methods both quantitatively and qualitatively.
arXiv Detail & Related papers (2021-04-27T20:01:57Z) - Event Camera Calibration of Per-pixel Biased Contrast Threshold [11.252139579961883]
Event cameras output asynchronous events to represent intensity changes with a high temporal resolution.
Currently, most of the existing works use a single contrast threshold to estimate the intensity change of all pixels.
We propose a new event camera model and two calibration approaches which cover event-only cameras and hybrid image-event cameras.
arXiv Detail & Related papers (2020-12-17T03:16:13Z) - Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We have developed a method to identify independently moving objects acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z) - Unsupervised Feature Learning for Event Data: Direct vs Inverse Problem Formulation [53.850686395708905]
Event-based cameras record an asynchronous stream of per-pixel brightness changes.
In this paper, we focus on single-layer architectures for representation learning from event data.
We show improvements of up to 9% in recognition accuracy compared to the state-of-the-art methods.
arXiv Detail & Related papers (2020-09-23T10:40:03Z)
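As context for the per-pixel contrast-threshold calibration entry above: in the standard event generation model, a pixel fires an event whenever the change in log intensity since that pixel's last event exceeds a contrast threshold, which that work allows to vary per pixel. A minimal frame-based simulation of this model (the function names, shapes, and data are illustrative, not the paper's method):

```python
import numpy as np

def generate_events(frames, C):
    """frames: (T, H, W) intensity sequence; C: (H, W) per-pixel thresholds.
    Returns a list of (t, x, y, polarity) events."""
    log_ref = np.log(frames[0] + 1e-6)  # reference log intensity per pixel
    events = []
    for ti in range(1, len(frames)):
        log_i = np.log(frames[ti] + 1e-6)
        diff = log_i - log_ref
        fired = np.abs(diff) >= C
        for x, y in zip(*np.nonzero(fired)):
            events.append((ti, x, y, int(np.sign(diff[x, y]))))
        log_ref[fired] = log_i[fired]  # reset reference where events fired
    return events

# A 2x2 patch doubles in brightness between frames 0 and 1; with a uniform
# threshold of 0.5 < log(2), each of the four patch pixels fires one
# positive event, and nothing fires afterwards.
frames = np.ones((3, 4, 4))
frames[1:, 1:3, 1:3] = 2.0
C = np.full((4, 4), 0.5)
evts = generate_events(frames, C)
print(len(evts))  # 4
```

Replacing the single uniform threshold `C` with a calibrated per-pixel array is the modelling change the calibration paper advocates.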
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.