A Linear Comb Filter for Event Flicker Removal
- URL: http://arxiv.org/abs/2205.08090v1
- Date: Tue, 17 May 2022 04:47:26 GMT
- Title: A Linear Comb Filter for Event Flicker Removal
- Authors: Ziwei Wang, Dingran Yuan, Yonhon Ng and Robert Mahony
- Abstract summary: Event cameras are bio-inspired sensors that capture per-pixel asynchronous intensity change.
Due to their high temporal resolution, event cameras are particularly sensitive to flicker such as from fluorescent or LED lights.
We propose a novel linear filter to preprocess event data to remove unwanted flicker events from an event stream.
- Score: 11.731970880432563
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are bio-inspired sensors that capture per-pixel asynchronous
intensity change rather than the synchronous absolute intensity frames captured
by a classical camera sensor. Such cameras are ideal for robotics applications
since they have high temporal resolution, high dynamic range and low latency.
However, due to their high temporal resolution, event cameras are particularly
sensitive to flicker such as from fluorescent or LED lights. During every cycle
from bright to dark, pixels that image a flickering light source generate many
events that provide little or no useful information for a robot, swamping the
useful data in the scene. In this paper, we propose a novel linear filter to
preprocess event data to remove unwanted flicker events from an event stream.
The proposed algorithm achieves a more than 4.6-fold relative improvement in
signal-to-noise ratio over the raw event stream by effectively removing
flicker from fluorescent lighting. Thus, it is ideally
suited to robotics applications that operate in indoor settings or scenes
illuminated by flickering light sources.
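The paper's filter operates directly on asynchronous events; as a rough illustration of the underlying comb-filter principle only (not the authors' algorithm), the sketch below averages a sampled signal over exactly one flicker period, which nulls the periodic flicker component while leaving slow scene content nearly untouched. All parameters (10 kHz sample rate, 50 Hz mains and hence 100 Hz flicker, signal amplitudes) are assumptions for this demo, not values from the paper.

```python
import numpy as np

# Illustrative sketch only: a one-period moving average is a simple comb
# filter with nulls at the flicker frequency and its harmonics. The paper's
# actual filter processes asynchronous events; none of the parameters below
# come from the paper.

fs = 10_000            # sample rate in Hz (assumed for this demo)
flicker_hz = 100       # fluorescent flicker: 2x the 50 Hz mains frequency
D = fs // flicker_hz   # number of samples in one flicker period

t = np.arange(0, 1.0, 1.0 / fs)
scene = np.sin(2 * np.pi * 2 * t)                   # slow 2 Hz scene signal
flicker = 0.8 * np.sin(2 * np.pi * flicker_hz * t)  # unwanted flicker
x = scene + flicker

# Averaging over exactly one flicker period cancels the periodic component:
# the filter's frequency response has zeros at 100 Hz, 200 Hz, ...
y = np.convolve(x, np.ones(D) / D, mode="valid")
```

The moving-average kernel has frequency response sin(pi f D / fs) / (D sin(pi f / fs)), with zeros at every multiple of fs/D = 100 Hz, so the 100 Hz flicker (and its harmonics) is removed while the 2 Hz scene signal passes with gain close to 1.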
Related papers
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z)
- Gradient events: improved acquisition of visual information in event cameras [0.0]
We propose a new type of event, the gradient event, which benefits from the same properties as a conventional brightness event.
We show that gradient-event-based video reconstruction outperforms existing state-of-the-art brightness-event-based methods by a significant margin.
arXiv Detail & Related papers (2024-09-03T10:18:35Z)
- Event Cameras Meet SPADs for High-Speed, Low-Bandwidth Imaging [25.13346470561497]
Event cameras and single-photon avalanche diode (SPAD) sensors have emerged as promising alternatives to conventional cameras.
We show that these properties are complementary, and can help achieve low-light, high-speed image reconstruction with low bandwidth requirements.
arXiv Detail & Related papers (2024-04-17T16:06:29Z)
- Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- Combining Events and Frames using Recurrent Asynchronous Multimodal Networks for Monocular Depth Prediction [51.072733683919246]
We introduce Recurrent Asynchronous Multimodal (RAM) networks to handle asynchronous and irregular data from multiple sensors.
Inspired by traditional RNNs, RAM networks maintain a hidden state that is updated asynchronously and can be queried at any time to generate a prediction.
We show an improvement over state-of-the-art methods by up to 30% in terms of mean depth absolute error.
arXiv Detail & Related papers (2021-02-18T13:24:35Z)
- An Asynchronous Kalman Filter for Hybrid Event Cameras [13.600773150848543]
Event cameras are ideally suited to capture HDR visual information without blur.
Conventional image sensors measure the absolute intensity of slowly changing scenes effectively but do poorly on high dynamic range or quickly changing scenes.
We present an event-based video reconstruction pipeline for High Dynamic Range scenarios.
arXiv Detail & Related papers (2020-12-10T11:24:07Z)
- Reducing the Sim-to-Real Gap for Event Cameras [64.89183456212069]
Event cameras are paradigm-shifting novel sensors that report asynchronous, per-pixel brightness changes called 'events' with unparalleled low latency.
Recent work has demonstrated impressive results using Convolutional Neural Networks (CNNs) for video reconstruction and optic flow with events.
We present strategies for improving training data for event-based CNNs that yield a 20-40% boost in the performance of existing video reconstruction networks.
arXiv Detail & Related papers (2020-03-20T02:44:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.