Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused
Events Fusion
- URL: http://arxiv.org/abs/2207.10494v1
- Date: Thu, 21 Jul 2022 14:19:39 GMT
- Title: Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused
Events Fusion
- Authors: Suman Ghosh and Guillermo Gallego
- Abstract summary: Event cameras are bio-inspired sensors that offer advantages over traditional cameras.
We tackle the problem of event-based stereo 3D reconstruction for SLAM.
We develop fusion theory and apply it to design multi-camera 3D reconstruction algorithms.
- Score: 14.15744053080529
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are bio-inspired sensors that offer advantages over traditional
cameras. They work asynchronously, sampling the scene with microsecond
resolution and producing a stream of brightness changes. This unconventional
output has sparked novel computer vision methods to unlock the camera's
potential. We tackle the problem of event-based stereo 3D reconstruction for
SLAM. Most event-based stereo methods try to exploit the camera's high temporal
resolution and event simultaneity across cameras to establish matches and
estimate depth. By contrast, we investigate how to estimate depth without
explicit data association by fusing Disparity Space Images (DSIs) originated in
efficient monocular methods. We develop fusion theory and apply it to design
multi-camera 3D reconstruction algorithms that produce state-of-the-art
results, as we confirm by comparing against four baseline methods and testing
on a variety of available datasets.
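The abstract describes fusing per-camera Disparity Space Images (DSIs), i.e. ray-density volumes built from back-projected events, to estimate depth without explicit event matching. The following is a minimal illustrative sketch of that idea, not the paper's exact algorithm: the harmonic-mean fusion rule, the fixed confidence threshold, and the assumption that all DSIs are already warped into a common reference view are simplifying choices made here for clarity.

```python
import numpy as np

def fuse_dsis_harmonic(dsis):
    """Fuse per-camera DSIs (each an H x W x D ray-count volume in a common
    reference view) with a harmonic mean, which favors voxels supported by
    every camera and suppresses single-camera noise."""
    stack = np.stack(dsis, axis=0).astype(np.float64)
    eps = 1e-6  # avoid division by zero in empty voxels
    return len(dsis) / np.sum(1.0 / (stack + eps), axis=0)

def depth_from_dsi(fused, depths, conf_thresh=2.0):
    """Per pixel, pick the depth plane with maximal fused ray density;
    reject pixels whose peak density is below a threshold (set to NaN)."""
    best = fused.argmax(axis=2)        # (H, W) indices of best depth plane
    conf = fused.max(axis=2)           # (H, W) peak densities
    depth = depths[best]               # fancy indexing yields a fresh array
    depth[conf < conf_thresh] = np.nan  # outlier rejection
    return depth

# Toy example: two cameras, 4x4 pixels, 8 depth planes from 0.5 m to 4.0 m.
depths = np.linspace(0.5, 4.0, 8)
dsi_a = np.ones((4, 4, 8))
dsi_b = np.ones((4, 4, 8))
dsi_a[:, :, 3] += 5.0                  # both cameras agree on plane 3 (2.0 m)
dsi_b[:, :, 3] += 5.0
dsi_b[0, 0, :] = 0.0                   # pixel (0,0) supported by one camera only
fused = fuse_dsis_harmonic([dsi_a, dsi_b])
depth = depth_from_dsi(fused, depths)  # plane 3 everywhere, NaN at (0,0)
```

The harmonic mean is one of several plausible fusion functions; a simple sum or product over cameras would also fit the "fusion of refocused events" framing, with different robustness to per-camera outliers.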
Related papers
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z)
- SDGE: Stereo Guided Depth Estimation for 360$^\circ$ Camera Sets [65.64958606221069]
Multi-camera systems are often used in autonomous driving to achieve a 360$^\circ$ perception.
These 360$^\circ$ camera sets often have limited or low-quality overlap regions, making multi-view stereo methods infeasible for the entire image.
We propose the Stereo Guided Depth Estimation (SGDE) method, which enhances depth estimation of the full image by explicitly utilizing multi-view stereo results on the overlap.
arXiv Detail & Related papers (2024-02-19T02:41:37Z)
- Dense Voxel 3D Reconstruction Using a Monocular Event Camera [5.599072208069752]
Event cameras offer many advantages over conventional frame-based cameras.
Their application in 3D reconstruction for VR applications is underexplored.
We propose a novel approach for solving dense 3D reconstruction using only a single event camera.
arXiv Detail & Related papers (2023-09-01T10:46:57Z)
- Shakes on a Plane: Unsupervised Depth Estimation from Unstabilized Photography [54.36608424943729]
We show that in a "long-burst", forty-two 12-megapixel RAW frames captured in a two-second sequence, there is enough parallax information from natural hand tremor alone to recover high-quality scene depth.
We devise a test-time optimization approach that fits a neural RGB-D representation to long-burst data and simultaneously estimates scene depth and camera motion.
arXiv Detail & Related papers (2022-12-22T18:54:34Z)
- Event-based Stereo Depth Estimation from Ego-motion using Ray Density Fusion [14.15744053080529]
Event cameras are bio-inspired sensors that mimic the human retina by responding to brightness changes in the scene.
This work investigates how to estimate depth from stereo event cameras without explicit data association by fusing back-projected ray densities.
arXiv Detail & Related papers (2022-10-17T10:33:47Z)
- MEFNet: Multi-scale Event Fusion Network for Motion Deblurring [62.60878284671317]
Traditional frame-based cameras inevitably suffer from motion blur due to long exposure times.
As a kind of bio-inspired camera, the event camera records the intensity changes in an asynchronous way with high temporal resolution.
In this paper, we rethink the event-based image deblurring problem and unfold it into an end-to-end two-stage image restoration network.
arXiv Detail & Related papers (2021-11-30T23:18:35Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras produce brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- Event-based Stereo Visual Odometry [42.77238738150496]
We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the temporal consistency of stereo event-based data while using a simple and efficient representation.
arXiv Detail & Related papers (2020-07-30T15:53:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.