Event-based Stereo Depth Estimation from Ego-motion using Ray Density Fusion
- URL: http://arxiv.org/abs/2210.08927v1
- Date: Mon, 17 Oct 2022 10:33:47 GMT
- Title: Event-based Stereo Depth Estimation from Ego-motion using Ray Density Fusion
- Authors: Suman Ghosh and Guillermo Gallego
- Abstract summary: Event cameras are bio-inspired sensors that mimic the human retina by responding to brightness changes in the scene.
This work investigates how to estimate depth from stereo event cameras without explicit data association by fusing back-projected ray densities.
- Score: 14.15744053080529
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event cameras are bio-inspired sensors that mimic the human retina by responding to brightness changes in the scene. They generate asynchronous, spike-based outputs at microsecond resolution, providing advantages over traditional cameras such as high dynamic range, low motion blur, and power efficiency. Most event-based stereo methods attempt to exploit the high temporal resolution of the cameras and the simultaneity of events across cameras to establish matches and estimate depth. By contrast, this work investigates how to estimate depth from stereo event cameras without explicit data association, by fusing back-projected ray densities, and demonstrates its effectiveness on head-mounted camera data recorded in an egocentric fashion. Code and video are available at https://github.com/tub-rip/dvs_mcemvs
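The core technique is a plane-sweep-style back-projection of events into a ray-density volume (a Disparity Space Image, DSI) per camera, followed by fusion of the per-camera densities. The sketch below is a minimal, illustrative Python/NumPy version, not the released implementation at the repository above: the interfaces (`pose_at` giving the ego-motion pose at an event's timestamp, a pinhole intrinsic matrix `K`) and the harmonic-mean fusion (one of the combinations studied in the authors' related MC-EMVS work listed below) are assumptions for illustration.

```python
import numpy as np

def back_project_events(events, pose_at, T_w_ref, K, depths, dsi):
    """Vote each event's viewing ray into a reference-view DSI (ray-density volume)."""
    H, W = dsi.shape[1:]
    K_inv = np.linalg.inv(K)
    T_ref_w = np.linalg.inv(T_w_ref)               # world -> reference view
    for x, y, t, _pol in events:                   # events: (N, 4) rows (x, y, t, polarity)
        T_ref_c = T_ref_w @ pose_at(t)             # event camera pose in the reference frame
        o = T_ref_c[:3, 3]                         # ray origin
        r = T_ref_c[:3, :3] @ (K_inv @ np.array([x, y, 1.0]))  # ray direction
        for k, d in enumerate(depths):             # intersect the ray with each depth plane
            lam = (d - o[2]) / r[2]
            if lam <= 0:                           # plane lies behind the camera
                continue
            p = o + lam * r                        # 3D point on plane Z = d
            ui = int(round(K[0, 0] * p[0] / p[2] + K[0, 2]))
            vi = int(round(K[1, 1] * p[1] / p[2] + K[1, 2]))
            if 0 <= ui < W and 0 <= vi < H:
                dsi[k, vi, ui] += 1.0              # one more ray crosses this voxel
    return dsi

def fuse_and_extract(dsi_left, dsi_right, depths, eps=1e-6):
    """Fuse two ray densities (harmonic mean) and pick the densest depth per pixel."""
    fused = 2.0 / (1.0 / (dsi_left + eps) + 1.0 / (dsi_right + eps))
    k_best = fused.argmax(axis=0)                  # index of the densest depth plane
    return depths[k_best], fused.max(axis=0)       # depth map and a confidence map
```

Because each camera votes into its density volume independently and the densities are only fused afterwards, no event-to-event correspondence across cameras is ever computed, which is the sense in which the method avoids explicit data association.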
Related papers
- SGDE: Stereo Guided Depth Estimation for 360$^\circ$ Camera Sets [65.64958606221069]
Multi-camera systems are often used in autonomous driving to achieve 360$^\circ$ perception.
These 360$^\circ$ camera sets often have limited or low-quality overlap regions, making multi-view stereo methods infeasible for the entire image.
We propose the Stereo Guided Depth Estimation (SGDE) method, which enhances depth estimation of the full image by explicitly utilizing multi-view stereo results on the overlap.
arXiv Detail & Related papers (2024-02-19T02:41:37Z)
- Video Frame Interpolation with Stereo Event and Intensity Camera [40.07341828127157]
We propose a novel Stereo Event-based VFI network (SE-VFI-Net) to generate high-quality intermediate frames.
We exploit the fused features to accomplish accurate optical flow and disparity estimation.
Our proposed SE-VFI-Net outperforms state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2023-07-17T04:02:00Z)
- Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion [14.15744053080529]
Event cameras are bio-inspired sensors that offer advantages over traditional cameras.
We tackle the problem of event-based stereo 3D reconstruction for SLAM.
We develop fusion theory and apply it to design multi-camera 3D reconstruction algorithms.
arXiv Detail & Related papers (2022-07-21T14:19:39Z)
- MEFNet: Multi-scale Event Fusion Network for Motion Deblurring [62.60878284671317]
Traditional frame-based cameras inevitably suffer from motion blur due to long exposure times.
As a bio-inspired camera, the event camera records intensity changes asynchronously with high temporal resolution.
In this paper, we rethink the event-based image deblurring problem and unfold it into an end-to-end two-stage image restoration network.
arXiv Detail & Related papers (2021-11-30T23:18:35Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Event Guided Depth Sensing [50.997474285910734]
We present an efficient bio-inspired event-camera-driven depth estimation algorithm.
In our approach, we illuminate areas of interest densely, depending on the scene activity detected by the event camera.
We show the feasibility of our approach on simulated autonomous driving sequences and in real indoor environments; a toy sketch of this activity-driven idea follows this entry.
arXiv Detail & Related papers (2021-10-20T11:41:11Z)
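As a toy illustration of the activity-driven sampling idea above (the function name, time window, and threshold are invented for this sketch, not taken from the paper):

```python
import numpy as np

def activity_mask(events, shape, t_now, window=0.01, thresh=5):
    """Count recent events per pixel and mark busy regions for dense depth sampling."""
    counts = np.zeros(shape, dtype=np.int32)
    recent = events[events[:, 2] >= t_now - window]   # events from the last `window` seconds
    for x, y, _t, _p in recent:
        counts[int(y), int(x)] += 1
    return counts >= thresh                           # True where the scene is active
```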
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per-pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Learning-based approaches have recently been applied to event-based data for tasks such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- Event-based Stereo Visual Odometry [42.77238738150496]
We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the temporal consistency of stereo event-based data while using a simple and efficient representation.
arXiv Detail & Related papers (2020-07-30T15:53:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.