Enhancing Visual Place Recognition via Fast and Slow Adaptive Biasing in Event Cameras
- URL: http://arxiv.org/abs/2403.16425v2
- Date: Tue, 13 Aug 2024 04:16:48 GMT
- Title: Enhancing Visual Place Recognition via Fast and Slow Adaptive Biasing in Event Cameras
- Authors: Gokul B. Nair, Michael Milford, Tobias Fischer
- Abstract summary: Event cameras are increasingly popular in robotics due to beneficial features such as low latency, energy efficiency, and high dynamic range; however, their downstream task performance depends heavily on the choice of bias parameters.
These parameters regulate the change in light intensity required to trigger an event, which in turn depends on factors such as the environment lighting and camera motion.
This paper introduces feedback control algorithms that automatically tune the bias parameters through two interacting methods.
- Score: 18.348497200655746
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are increasingly popular in robotics due to beneficial features such as low latency, energy efficiency, and high dynamic range. Nevertheless, their downstream task performance is greatly influenced by the optimization of bias parameters. These parameters, for instance, regulate the necessary change in light intensity to trigger an event, which in turn depends on factors such as the environment lighting and camera motion. This paper introduces feedback control algorithms that automatically tune the bias parameters through two interacting methods: 1) an immediate, on-the-fly "fast" adaptation of the refractory period, which sets the minimum interval between consecutive events; and 2) if the event rate exceeds the specified bounds even after the refractory period has been changed repeatedly, the controller adapts the pixel bandwidth and event thresholds, which stabilize after a short burst of noise events across all pixels ("slow" adaptation). Our evaluation focuses on the visual place recognition task, where incoming query images are compared to a given reference database. We conducted comprehensive evaluations of our algorithms' adaptive feedback control in real-time. To do so, we collected the QCR-Fast-and-Slow dataset, which contains DAVIS346 event camera streams from 366 repeated traversals of a Scout Mini robot navigating through a 100-meter-long indoor lab setting (totaling over 35 km of distance traveled) in varying brightness conditions, with ground-truth location information. Our proposed feedback controllers achieve superior performance compared to the standard bias settings and prior feedback control methods. Our findings also detail the impact of bias adjustments on task performance and include ablation studies on the fast and slow adaptation mechanisms.
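The abstract's two-stage control idea can be sketched in a few lines: first try quick refractory-period adjustments, and only fall back to the disruptive threshold change after repeated fast attempts fail. The class below is an illustrative assumption of how such a controller could be structured; the parameter names, update rules (doubling/halving), and constants are hypothetical and not taken from the paper.

```python
class FastSlowBiasController:
    """Keep the event rate within [rate_low, rate_high] events/s by first
    tuning the refractory period (fast) and, only after repeated fast
    attempts fail, the event threshold (slow)."""

    def __init__(self, rate_low, rate_high, max_fast_attempts=3):
        self.rate_low = rate_low
        self.rate_high = rate_high
        self.max_fast_attempts = max_fast_attempts
        self.refractory_us = 100.0  # minimum interval between events per pixel (us)
        self.threshold = 0.2        # relative intensity change needed to fire an event
        self.fast_attempts = 0

    def update(self, event_rate):
        """One control step; returns (mode, current bias values)."""
        if self.rate_low <= event_rate <= self.rate_high:
            self.fast_attempts = 0          # rate is in bounds: nothing to do
            return "ok", self._biases()
        if self.fast_attempts < self.max_fast_attempts:
            # Fast path: lengthen/shorten the refractory period.
            self.fast_attempts += 1
            if event_rate > self.rate_high:
                self.refractory_us *= 2.0   # throttle events per pixel
            else:
                self.refractory_us = max(1.0, self.refractory_us / 2.0)
            return "fast", self._biases()
        # Slow path: adapt the event threshold (in hardware this briefly
        # produces noise events across all pixels before stabilizing).
        self.fast_attempts = 0
        if event_rate > self.rate_high:
            self.threshold *= 1.5           # require a larger intensity change
        else:
            self.threshold = max(0.05, self.threshold / 1.5)
        return "slow", self._biases()

    def _biases(self):
        return {"refractory_us": self.refractory_us, "threshold": self.threshold}
```

In use, the measured event rate of each incoming slice of the stream would be fed to `update()`, and the returned bias values written back to the sensor; a persistently excessive rate triggers `max_fast_attempts` refractory adjustments before the controller escalates to a threshold change.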
Related papers
- FlexEvent: Event Camera Object Detection at Arbitrary Frequencies [45.82637829492951]
Event cameras offer unparalleled advantages for real-time perception in dynamic environments.
Existing event-based object detection methods are limited by fixed-frequency paradigms.
We propose FlexEvent, a novel event camera object detection framework that enables detection at arbitrary frequencies.
arXiv Detail & Related papers (2024-12-09T17:57:14Z) - Event-Based Tracking Any Point with Motion-Augmented Temporal Consistency [58.719310295870024]
This paper presents an event-based framework for tracking any point.
It tackles the challenges posed by spatial sparsity and motion sensitivity in events.
It achieves 150% faster processing with competitive model parameters.
arXiv Detail & Related papers (2024-12-02T09:13:29Z) - Autobiasing Event Cameras [0.932065750652415]
This paper utilizes the neuromorphic YOLO-based face tracking module of a driver monitoring system as the event-based application to study.
The proposed method uses numerical metrics to continuously monitor the performance of the event-based application in real-time.
The advantage of bias optimization lies in its ability to handle conditions such as flickering or darkness without requiring additional hardware or software.
arXiv Detail & Related papers (2024-11-01T16:41:05Z) - Optimal OnTheFly Feedback Control of Event Sensors [0.14999444543328289]
Event-based vision sensors produce an asynchronous stream of events which are triggered when pixel intensity variation exceeds a threshold.
We propose an approach for dynamic feedback control of activation thresholds, in which a controller network analyzes the past emitted events.
We demonstrate that our approach outperforms both fixed and randomly-varying threshold schemes by 6-12% in terms of the LPIPS perceptual image dissimilarity metric.
arXiv Detail & Related papers (2024-08-23T10:49:16Z) - Event-assisted Low-Light Video Object Segmentation [47.28027938310957]
Event cameras offer promise in enhancing object visibility and aiding VOS methods under such low-light conditions.
This paper introduces a pioneering framework tailored for low-light VOS, leveraging event camera data to elevate segmentation accuracy.
arXiv Detail & Related papers (2024-04-02T13:41:22Z) - Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z) - Advancing Unsupervised Low-light Image Enhancement: Noise Estimation, Illumination Interpolation, and Self-Regulation [55.07472635587852]
Low-Light Image Enhancement (LLIE) techniques have made notable advancements in preserving image details and enhancing contrast.
These approaches encounter persistent challenges in efficiently mitigating dynamic noise and accommodating diverse low-light scenarios.
We first propose a method for estimating the noise level in low light images in a quick and accurate way.
We then devise a Learnable Illumination Interpolator (LII) to satisfy general constraints between illumination and input.
arXiv Detail & Related papers (2023-05-17T13:56:48Z) - Illumination-Invariant Active Camera Relocalization for Fine-Grained Change Detection in the Wild [12.104718944788141]
This paper studies an illumination-invariant active camera relocalization method that improves both relative pose estimation and scale estimation.
We construct a linear system to obtain the absolute scale in each ACR by minimizing the image warping error.
Our work greatly expands the feasibility of real-world fine-grained change monitoring tasks for cultural heritages.
arXiv Detail & Related papers (2022-04-13T18:00:55Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - Feedback control of event cameras [0.0]
Dynamic vision sensor event cameras produce a variable data rate stream of brightness change events.
Event production at the pixel level is controlled by threshold, bandwidth, and refractory period bias current parameter settings.
This paper proposes fixed-step feedback controllers that use measurements of event rate and noise.
arXiv Detail & Related papers (2021-05-02T07:41:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.