Real-Time Optical Flow for Vehicular Perception with Low- and
High-Resolution Event Cameras
- URL: http://arxiv.org/abs/2112.10591v1
- Date: Mon, 20 Dec 2021 15:09:20 GMT
- Title: Real-Time Optical Flow for Vehicular Perception with Low- and
High-Resolution Event Cameras
- Authors: Vincent Brebion and Julien Moreau and Franck Davoine
- Abstract summary: Event cameras capture changes of illumination in the observed scene rather than accumulating light to create images.
We propose an optimized framework for computing optical flow in real-time with both low- and high-resolution event cameras.
We evaluate our approach on both low- and high-resolution driving sequences, and show that it often achieves better results than the current state of the art.
- Score: 3.845877724862319
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event cameras capture changes of illumination in the observed scene rather
than accumulating light to create images. Thus, they allow for applications
under high-speed motion and complex lighting conditions, where traditional
frame-based sensors show their limits with blur and over- or underexposed
pixels. Thanks to these unique properties, they nowadays represent a highly
attractive sensor for ITS-related applications. Event-based optical flow (EBOF)
has been studied following the rise in popularity of these neuromorphic
cameras. The recent arrival of high-definition neuromorphic sensors, however,
challenges the existing approaches, because of the increased resolution of the
event pixel array and a much higher throughput. As an answer to these points,
we propose an optimized framework for computing optical flow in real-time with
both low- and high-resolution event cameras. We formulate a novel dense
representation for the sparse events flow, in the form of the "inverse
exponential distance surface". It serves as an interim frame, designed for the
use of proven, state-of-the-art frame-based optical flow computation methods.
We evaluate our approach on both low- and high-resolution driving sequences,
and show that it often achieves better results than the current state of the
art, while also reaching higher frame rates, 250Hz at 346 x 260 pixels and 77Hz
at 1280 x 720 pixels.
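The "inverse exponential distance surface" described in the abstract can be sketched as a dense frame in which pixels at an event location take the maximum value and intensity decays exponentially with distance to the nearest event. This is a minimal illustration, not the authors' implementation: the use of Euclidean distance and the decay constant `alpha` are assumptions, and the paper's exact formulation may differ.

```python
import numpy as np

def inverse_exponential_distance_surface(events_xy, height, width, alpha=0.1):
    """Build a dense frame from sparse event coordinates.

    events_xy: (N, 2) array of (x, y) event positions.
    Returns an (height, width) array where each pixel holds
    exp(-alpha * d), with d the distance to the nearest event.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    ex = events_xy[:, 0].astype(float)
    ey = events_xy[:, 1].astype(float)
    # Euclidean distance from every pixel to its nearest event
    # (brute-force broadcasting; fine for an illustration).
    d = np.sqrt((ys[..., None] - ey) ** 2 + (xs[..., None] - ex) ** 2).min(axis=-1)
    return np.exp(-alpha * d)

events = np.array([[2, 2], [10, 5]])  # (x, y) pairs
surface = inverse_exponential_distance_surface(events, 8, 16)
```

Such a dense surface can then be fed to an off-the-shelf frame-based optical flow method, which is the role the paper assigns to this interim representation.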
Related papers
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z)
- Event-based Background-Oriented Schlieren [18.2247510082534]
Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding.
Event cameras offer potential advantages (high dynamic range, high temporal resolution, and data efficiency) to overcome such limitations due to their bio-inspired sensing principle.
This paper presents a novel technique for perceiving air convection using events and frames by providing the first theoretical analysis that connects event data and schlieren.
arXiv Detail & Related papers (2023-11-01T10:57:20Z)
- Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z)
- Panoramas from Photons [22.437940699523082]
We present a method capable of estimating extreme scene motion under challenging conditions, such as low light or high dynamic range.
Our method relies on grouping and aggregating frames after-the-fact, in a stratified manner.
We demonstrate the creation of high-quality panoramas under fast motion and extremely low light, and super-resolution results using a custom single-photon camera prototype.
arXiv Detail & Related papers (2023-09-07T16:07:31Z)
- An Asynchronous Linear Filter Architecture for Hybrid Event-Frame Cameras [9.69495347826584]
We present an asynchronous linear filter architecture, fusing event and frame camera data, for HDR video reconstruction and spatial convolution.
The proposed AKF pipeline outperforms other state-of-the-art methods in both absolute intensity error (69.4% reduction) and image similarity indexes (average 35.5% improvement).
arXiv Detail & Related papers (2023-09-03T12:37:59Z)
- A direct time-of-flight image sensor with in-pixel surface detection and dynamic vision [0.0]
3D flash LIDAR is an alternative to the traditional scanning LIDAR systems, promising precise depth imaging in a compact form factor.
We present a 64x32 pixel (256x128 SPAD) dToF imager that overcomes these limitations by using pixels with embedded histogramming.
This reduces the size of output data frames considerably, enabling maximum frame rates in the 10 kFPS range or 100 kFPS for direct depth readings.
arXiv Detail & Related papers (2022-09-23T14:38:00Z)
- E-NeRF: Neural Radiance Fields from a Moving Event Camera [83.91656576631031]
Estimating neural radiance fields (NeRFs) from ideal images has been extensively studied in the computer vision community.
We present E-NeRF, the first method which estimates a volumetric scene representation in the form of a NeRF from a fast-moving event camera.
arXiv Detail & Related papers (2022-08-24T04:53:32Z)
- Are High-Resolution Event Cameras Really Needed? [62.70541164894224]
In low-illumination conditions and at high speeds, low-resolution cameras can outperform high-resolution ones, while requiring a significantly lower bandwidth.
We provide both empirical and theoretical evidence for this claim, showing that high-resolution event cameras exhibit higher per-pixel event rates.
In most cases, high-resolution event cameras show lower task performance than lower-resolution sensors under these conditions.
arXiv Detail & Related papers (2022-03-28T12:06:20Z)
- Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.