Autobiasing Event Cameras
- URL: http://arxiv.org/abs/2411.00729v1
- Date: Fri, 01 Nov 2024 16:41:05 GMT
- Title: Autobiasing Event Cameras
- Authors: Mehdi Sefidgar Dilmaghani, Waseem Shariff, Cian Ryan, Joseph Lemley, Peter Corcoran
- Abstract summary: This paper utilizes the neuromorphic YOLO-based face tracking module of a driver monitoring system as the event-based application to study.
The proposed method uses numerical metrics to continuously monitor the performance of the event-based application in real-time.
The advantage of bias optimization lies in its ability to handle conditions such as flickering or darkness without requiring additional hardware or software.
- Abstract: This paper presents an autonomous method to address challenges arising from severe lighting conditions in machine vision applications that use event cameras. To manage these conditions, the research explores the built-in ability of these cameras to adjust pixel functionality through so-called bias settings. As cars are driven at various times and locations, shifts in lighting conditions are unavoidable. Consequently, this paper uses the neuromorphic YOLO-based face-tracking module of a driver monitoring system as the event-based application under study. The proposed method uses numerical metrics to continuously monitor the performance of the event-based application in real time. When the application malfunctions, the system detects this through a drop in the metrics and automatically adjusts the event camera's bias values. The Nelder-Mead simplex algorithm is employed to optimize this adjustment, with fine-tuning continuing until performance returns to a satisfactory level. The advantage of bias optimization lies in its ability to handle conditions such as flickering or darkness without requiring additional hardware or software. To demonstrate the capabilities of the proposed system, it was tested under conditions where detecting human faces with the default bias values was impossible. These severe conditions were simulated using dim ambient light and various flickering frequencies. Following the automatic and dynamic process of bias modification, the face-detection metrics improved significantly under all conditions. Autobiasing increased the YOLO confidence indicators by more than 33 percent for object detection and 37 percent for face detection, highlighting the effectiveness of the proposed method.
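The closed loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the detection-confidence function here is a toy stand-in for the live YOLO confidence metric, and the bias vector, its starting point, and the recovery threshold are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the paper's real-time metric: in the actual system this
# would be the YOLO face-detection confidence measured on the event stream.
# The "good" bias pair (120, 80) is an arbitrary assumption for illustration.
def detection_confidence(biases):
    target = np.array([120.0, 80.0])
    return np.exp(-np.sum((np.asarray(biases) - target) ** 2) / 5000.0)

def autobias(initial_biases):
    """Re-optimize bias values with the Nelder-Mead simplex algorithm,
    maximizing the detection-confidence metric (by minimizing its negative)."""
    result = minimize(
        lambda b: -detection_confidence(b),
        x0=np.asarray(initial_biases, dtype=float),
        method="Nelder-Mead",
    )
    return result.x, detection_confidence(result.x)

# Simulate a malfunction: the current biases (60, 40) yield low confidence,
# so the system triggers re-optimization.
biases, score = autobias([60.0, 40.0])
```

On this smooth toy metric, Nelder-Mead recovers bias values close to the optimum; in the real system the loop would repeat until the monitored metric returns above a satisfactory level.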
Related papers
- Tuning Event Camera Biases Heuristic for Object Detection Applications in Staring Scenarios [0.0]
We present a heuristic for tuning the biases of event cameras for tasks that require small-object detection in staring scenarios.
The main purpose of the tuning is to squeeze the camera's potential, optimize its performance, and expand its detection capabilities as much as possible.
A main conclusion that will be demonstrated is that for certain desired signals, the optimal bias values of the camera are very far from the defaults recommended by the manufacturer.
arXiv Detail & Related papers (2025-01-30T22:27:56Z) - Enhancing Visual Place Recognition via Fast and Slow Adaptive Biasing in Event Cameras [18.348497200655746]
Event cameras are increasingly popular in robotics due to beneficial features such as low latency, energy efficiency, and high dynamic range.
The camera's bias parameters regulate the change in light intensity needed to trigger an event, which in turn depends on factors such as the environment lighting and camera motion.
This paper introduces feedback control algorithms that automatically tune the bias parameters through two interacting methods.
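A feedback loop of this kind can be sketched as a fast proportional term combined with a slow integral term acting on a contrast-threshold bias. This is a hypothetical illustration, not the paper's algorithm: the event-rate setpoint, the gains, and the simulated rates are all assumed values.

```python
def tune_bias(event_rate, bias, setpoint=1e5, fast_gain=1e-5,
              slow_gain=1e-6, slow_state=0.0):
    """One control step on a contrast-threshold bias.

    A higher event rate than the setpoint means the camera is firing too
    easily, so the threshold bias is raised; a lower rate lowers it.
    """
    error = event_rate - setpoint            # signed deviation from setpoint
    slow_state += slow_gain * error          # slow integrator tracks drift
    bias += fast_gain * error + slow_state   # fast + slow correction
    return bias, slow_state

# Simulate a few control steps as an initially excessive event rate decays
# toward the setpoint (rates are made-up values for illustration).
bias, integ = 50.0, 0.0
for rate in [2e5, 1.8e5, 1.2e5, 1.05e5]:
    bias, integ = tune_bias(rate, bias, slow_state=integ)
```

Because every simulated rate exceeds the setpoint, the threshold bias is driven upward on each step; with two time scales, the fast term reacts to transients while the slow term absorbs persistent drift.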
arXiv Detail & Related papers (2024-03-25T05:10:34Z) - DARTH: Holistic Test-time Adaptation for Multiple Object Tracking [87.72019733473562]
Multiple object tracking (MOT) is a fundamental component of perception systems for autonomous driving.
Despite the urgency of safety in driving systems, no solution to the problem of MOT adaptation to domain shift under test-time conditions has been proposed.
We introduce DARTH, a holistic test-time adaptation framework for MOT.
arXiv Detail & Related papers (2023-10-03T10:10:42Z) - E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras [18.54225086007182]
We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
arXiv Detail & Related papers (2023-06-15T12:16:38Z) - Using simulation to quantify the performance of automotive perception systems [2.2320512724449233]
We describe the image system simulation software tools that we use to evaluate the performance of image systems for object (automobile) detection.
We quantified system performance by measuring average precision and we report a trend relating system resolution and object detection performance.
arXiv Detail & Related papers (2023-03-02T05:28:35Z) - Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and/or highly-accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z) - Event-aided Direct Sparse Odometry [54.602311491827805]
We introduce EDS, a direct monocular visual odometry using events and frames.
Our algorithm leverages the event generation model to track the camera motion in the blind time between frames.
EDS is the first method to perform 6-DOF VO using events and frames with a direct approach.
arXiv Detail & Related papers (2022-04-15T20:40:29Z) - A Quality Index Metric and Method for Online Self-Assessment of Autonomous Vehicles Sensory Perception [164.93739293097605]
We propose a novel evaluation metric, named as the detection quality index (DQI), which assesses the performance of camera-based object detection algorithms.
We have developed a superpixel-based attention network (SPA-NET) that utilizes raw image pixels and superpixels as input to predict the proposed DQI evaluation metric.
arXiv Detail & Related papers (2022-03-04T22:16:50Z) - Monitoring and Adapting the Physical State of a Camera for Autonomous Vehicles [10.490646039938252]
We propose a generic and task-oriented self-health-maintenance framework for cameras based on data- and physically-grounded models.
We implement the framework on a real-world ground vehicle and demonstrate how a camera can adjust its parameters to counter a poor condition.
Our framework not only provides a practical ready-to-use solution to monitor and maintain the health of cameras, but can also serve as a basis for extensions to tackle more sophisticated problems.
arXiv Detail & Related papers (2021-12-10T11:14:44Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow, and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.