Real-Time Optical Flow for Vehicular Perception with Low- and
High-Resolution Event Cameras
- URL: http://arxiv.org/abs/2112.10591v1
- Date: Mon, 20 Dec 2021 15:09:20 GMT
- Title: Real-Time Optical Flow for Vehicular Perception with Low- and
High-Resolution Event Cameras
- Authors: Vincent Brebion and Julien Moreau and Franck Davoine
- Abstract summary: Event cameras capture changes of illumination in the observed scene rather than accumulating light to create images.
We propose an optimized framework for computing optical flow in real-time with both low- and high-resolution event cameras.
We evaluate our approach on both low- and high-resolution driving sequences, and show that it often achieves better results than the current state of the art.
- Score: 3.845877724862319
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event cameras capture changes of illumination in the observed scene rather
than accumulating light to create images. Thus, they allow for applications
under high-speed motion and complex lighting conditions, where traditional
frame-based sensors show their limits with blur and over- or under-exposed
pixels. Thanks to these unique properties, they nowadays represent a highly
attractive sensor for ITS-related applications. Event-based optical flow (EBOF)
has been studied following the rise in popularity of these neuromorphic
cameras. The recent arrival of high-definition neuromorphic sensors, however,
challenges the existing approaches, because of the increased resolution of the
event pixel array and a much higher throughput. To address these challenges,
we propose an optimized framework for computing optical flow in real time with
both low- and high-resolution event cameras. We formulate a novel dense
representation of the sparse event flow, in the form of the "inverse
exponential distance surface". It serves as an interim frame, designed for the
use of proven, state-of-the-art frame-based optical flow computation methods.
We evaluate our approach on both low- and high-resolution driving sequences,
and show that it often achieves better results than the current state of the
art, while also reaching higher frame rates: 250 Hz at 346 x 260 pixels and 77 Hz
at 1280 x 720 pixels.
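The "inverse exponential distance surface" can be read as a distance-transform-based densification of the sparse event stream: every pixel is assigned a value that decays exponentially with its distance to the nearest event, producing a frame-like image that conventional optical flow methods can consume. The sketch below (Python, using NumPy, SciPy, and OpenCV) is only an illustration of that idea under assumptions: the decay constant `alpha`, the 8-bit scaling, and the use of a plain Euclidean distance transform are guesses rather than the paper's exact formulation, and the Farneback call merely stands in for whichever state-of-the-art frame-based flow method is actually plugged in.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
import cv2

def inverse_exponential_distance_surface(events_xy, height, width, alpha=0.1):
    """Turn a sparse set of events into a dense, frame-like surface.

    events_xy : (N, 2) array of integer (x, y) pixel coordinates of the
                events accumulated in the current time window.
    alpha     : illustrative decay constant (hypothetical, not from the paper).

    Pixels at event locations are bright (255); brightness decays
    exponentially with the Euclidean distance to the nearest event.
    """
    event_mask = np.zeros((height, width), dtype=bool)
    xs = np.clip(events_xy[:, 0], 0, width - 1)
    ys = np.clip(events_xy[:, 1], 0, height - 1)
    event_mask[ys, xs] = True

    # Distance from every pixel to its nearest event: events are the zeros
    # of ~event_mask, so the EDT measures the distance to them.
    dist = distance_transform_edt(~event_mask)

    # Negative exponential of the distance: dense, smooth, anchored on events.
    return (255.0 * np.exp(-alpha * dist)).astype(np.uint8)

# Usage sketch: build two consecutive interim frames and feed them to an
# off-the-shelf frame-based optical flow method (Farneback here, purely as a
# stand-in; the authors use their own choice of frame-based method).
rng = np.random.default_rng(0)
h, w = 260, 346  # DAVIS346-like low-resolution sensor
ev_t0 = rng.integers(0, [w, h], size=(5000, 2))
ev_t1 = rng.integers(0, [w, h], size=(5000, 2))
frame_t0 = inverse_exponential_distance_surface(ev_t0, h, w)
frame_t1 = inverse_exponential_distance_surface(ev_t1, h, w)
flow = cv2.calcOpticalFlowFarneback(frame_t0, frame_t1, None,
                                     0.5, 3, 15, 3, 5, 1.2, 0)  # (H, W, 2) flow field
```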
Related papers
- EventSplat: 3D Gaussian Splatting from Moving Event Cameras for Real-time Rendering [7.392798832833857]
Event cameras offer exceptional temporal resolution and a high dynamic range.
We introduce a method for using event camera data in novel view synthesis via Gaussian Splatting.
arXiv Detail & Related papers (2024-12-10T08:23:58Z)
- Event fields: Capturing light fields at high speed, resolution, and dynamic range [9.2152453085337]
"Event Fields" is a new approach that utilizes innovative optical designs for event cameras to capture light fields at high speed.
We develop the underlying mathematical framework for Event Fields and introduce two foundational frameworks to capture them practically.
This novel light-sensing paradigm opens doors to new applications in photography, robotics, and AR/VR, and presents fresh challenges in rendering and machine learning.
arXiv Detail & Related papers (2024-12-09T04:02:49Z)
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z) - Event-based Background-Oriented Schlieren [18.2247510082534]
Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding.
Event cameras offer potential advantages (high dynamic range, high temporal resolution, and data efficiency) to overcome such limitations due to their bio-inspired sensing principle.
This paper presents a novel technique for perceiving air convection using events and frames by providing the first theoretical analysis that connects event data and schlieren.
arXiv Detail & Related papers (2023-11-01T10:57:20Z) - Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z) - Panoramas from Photons [22.437940699523082]
We present a method capable of estimating extreme scene motion under challenging conditions, such as low light or high dynamic range.
Our method relies on grouping and aggregating frames after-the-fact, in a stratified manner.
We demonstrate the creation of high-quality panoramas under fast motion and extremely low light, and super-resolution results using a custom single-photon camera prototype.
arXiv Detail & Related papers (2023-09-07T16:07:31Z) - E-NeRF: Neural Radiance Fields from a Moving Event Camera [83.91656576631031]
Estimating neural radiance fields (NeRFs) from ideal images has been extensively studied in the computer vision community.
We present E-NeRF, the first method which estimates a volumetric scene representation in the form of a NeRF from a fast-moving event camera.
arXiv Detail & Related papers (2022-08-24T04:53:32Z) - Are High-Resolution Event Cameras Really Needed? [62.70541164894224]
In low-illumination conditions and at high speeds, low-resolution cameras can outperform high-resolution ones, while requiring a significantly lower bandwidth.
We provide both empirical and theoretical evidence for this claim, which indicates that high-resolution event cameras exhibit higher per-pixel event rates.
In most cases, high-resolution event cameras show a lower task performance, compared to lower resolution sensors in these conditions.
arXiv Detail & Related papers (2022-03-28T12:06:20Z) - Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z)