Are High-Resolution Event Cameras Really Needed?
- URL: http://arxiv.org/abs/2203.14672v1
- Date: Mon, 28 Mar 2022 12:06:20 GMT
- Title: Are High-Resolution Event Cameras Really Needed?
- Authors: Daniel Gehrig and Davide Scaramuzza
- Abstract summary: In low-illumination conditions and at high speeds, low-resolution cameras can outperform high-resolution ones, while requiring a significantly lower bandwidth.
We provide both empirical and theoretical evidence for this claim, showing that high-resolution event cameras exhibit higher per-pixel event rates.
In most cases, high-resolution event cameras show lower task performance than lower-resolution sensors under these conditions.
- Score: 62.70541164894224
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to their outstanding properties in challenging conditions, event cameras
have become indispensable in a wide range of applications, including
automotive, computational photography, and SLAM. However, as further
improvements are made to the sensor design, modern event cameras are trending
toward higher and higher sensor resolutions, which result in higher bandwidth
and computational requirements on downstream tasks. Despite this trend, the
benefits of using high-resolution event cameras to solve standard computer
vision tasks are still not clear. In this work, we report the surprising
discovery that, in low-illumination conditions and at high speeds,
low-resolution cameras can outperform high-resolution ones, while requiring a
significantly lower bandwidth. We provide both empirical and theoretical
evidence for this claim, which indicates that high-resolution event cameras
exhibit higher per-pixel event rates, leading to higher temporal noise in
low-illumination conditions and at high speeds. As a result, in most cases,
high-resolution event cameras show lower task performance than
lower-resolution sensors in these conditions. We empirically validate our findings
across several tasks, namely image reconstruction, optical flow estimation, and
camera pose tracking, on both synthetic and real data. We believe that these
findings will provide important guidelines for future trends in event camera
development.
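The scaling argument in the abstract can be illustrated with a deliberately simple back-of-the-envelope model. The sketch below is my own toy, not the authors' analysis: it assumes an ideal, noise-free event-camera model with a fixed contrast threshold, and a single vertical step edge sweeping across an n x n sensor at a fixed angular speed. The resolutions, threshold, and speed are illustrative assumptions only.

```python
# Toy model (not the paper's code): how event rates scale with resolution
# when the same angular scene motion is observed by sensors of different size.
# Assumptions (mine): ideal linear event-camera model, no noise, no pixel
# bandwidth limit, a single step edge sweeping the field of view.

def event_rates(n_pixels, angular_speed=2.0, edge_contrast=1.0, C=0.2):
    """Event rates for an n_pixels x n_pixels sensor watching a sweeping edge.

    angular_speed: edge speed in field-of-view widths per second
    edge_contrast: log-intensity step height of the edge
    C:             contrast threshold that triggers an event
    """
    events_per_crossing = round(edge_contrast / C)    # events one pixel fires as the edge passes it
    edge_speed_px = angular_speed * n_pixels          # same angular motion -> more pixels/s at high res
    burst_rate = events_per_crossing * edge_speed_px  # per-pixel rate while the edge is on that pixel
    total_rate = n_pixels * events_per_crossing * edge_speed_px  # whole-sensor rate (bandwidth)
    return burst_rate, total_rate

for n in (240, 1280):  # "low" vs "high" resolution, chosen only for illustration
    burst, total = event_rates(n)
    print(f"{n:4d}x{n:<4d}: {burst:>9,.0f} ev/s per active pixel, {total:>13,.0f} ev/s total")
```

In this toy, the instantaneous per-pixel rate grows linearly with linear resolution and the sensor-wide rate grows roughly quadratically; the abstract's further point, that in low light the sensor cannot follow these higher rates cleanly and produces more temporal noise, is not modeled here.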
Related papers
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z) - Generalized Event Cameras [15.730999915036705]
Event cameras capture the world at high time resolution and with minimal bandwidth requirements.
We design generalized event cameras that inherently preserve scene intensity in a bandwidth-efficient manner.
Our single-photon event cameras are capable of high-speed, high-fidelity imaging at low readout rates.
arXiv Detail & Related papers (2024-07-02T21:48:32Z) - Event Cameras Meet SPADs for High-Speed, Low-Bandwidth Imaging [25.13346470561497]
Event cameras and single-photon avalanche diode (SPAD) sensors have emerged as promising alternatives to conventional cameras.
We show that these properties are complementary, and can help achieve low-light, high-speed image reconstruction with low bandwidth requirements.
arXiv Detail & Related papers (2024-04-17T16:06:29Z) - Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z) - Real-Time Optical Flow for Vehicular Perception with Low- and High-Resolution Event Cameras [3.845877724862319]
Event cameras capture changes of illumination in the observed scene rather than accumulating light to create images.
We propose an optimized framework for computing optical flow in real-time with both low- and high-resolution event cameras.
We evaluate our approach on both low- and high-resolution driving sequences, and show that it often achieves better results than the current state of the art.
arXiv Detail & Related papers (2021-12-20T15:09:20Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - DSEC: A Stereo Event Camera Dataset for Driving Scenarios [55.79329250951028]
This work presents the first high-resolution, large-scale stereo dataset with event cameras.
The dataset contains 53 sequences collected by driving in a variety of illumination conditions.
It provides ground truth disparity for the development and evaluation of event-based stereo algorithms.
arXiv Detail & Related papers (2021-03-10T12:10:33Z) - Learning to Detect Objects with a 1 Megapixel Event Camera [14.949946376335305]
Event cameras encode visual information with high temporal precision, low data-rate, and high-dynamic range.
Due to the novelty of the field, the performance of event-based systems on many vision tasks is still lower than that of conventional frame-based solutions.
arXiv Detail & Related papers (2020-09-28T16:03:59Z) - Reducing the Sim-to-Real Gap for Event Cameras [64.89183456212069]
Event cameras are paradigm-shifting novel sensors that report asynchronous, per-pixel brightness changes called 'events' with unparalleled low latency.
Recent work has demonstrated impressive results using Convolutional Neural Networks (CNNs) for video reconstruction and optic flow with events.
We present strategies for improving training data for event-based CNNs that result in a 20-40% boost in the performance of existing video reconstruction networks.
arXiv Detail & Related papers (2020-03-20T02:44:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.