Stereo Hybrid Event-Frame (SHEF) Cameras for 3D Perception
- URL: http://arxiv.org/abs/2110.04988v1
- Date: Mon, 11 Oct 2021 04:03:36 GMT
- Title: Stereo Hybrid Event-Frame (SHEF) Cameras for 3D Perception
- Authors: Ziwei Wang, Liyuan Pan, Yonhon Ng, Zheyu Zhuang, Robert Mahony
- Abstract summary: Event cameras address these limitations, as they report brightness changes of each pixel independently with fine temporal resolution.
Integrated hybrid event-frame sensors (e.g., DAVIS) are available, but their data quality is compromised by pixel-level coupling in the circuit fabrication of such cameras.
This paper proposes a stereo hybrid event-frame (SHEF) camera system that offers a sensor modality with separate high-quality pure event and pure frame cameras.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stereo camera systems play an important role in robotics applications to
perceive the 3D world. However, conventional cameras have drawbacks such as low
dynamic range, motion blur and latency due to the underlying frame-based
mechanism. Event cameras address these limitations as they report the
brightness changes of each pixel independently with a fine temporal resolution,
but they are unable to acquire absolute intensity information directly.
Although integrated hybrid event-frame sensors (e.g., DAVIS) are available, the
quality of data is compromised by coupling at the pixel level in the circuit
fabrication of such cameras. This paper proposes a stereo hybrid event-frame
(SHEF) camera system that offers a sensor modality with separate high-quality
pure event and pure frame cameras, overcoming the limitations of each separate
sensor and allowing for stereo depth estimation. We provide a SHEF dataset
targeted at evaluating disparity estimation algorithms and introduce a stereo
disparity estimation algorithm that correlates edge information extracted from
the event stream with edges detected in the frame data. Our disparity
estimation outperforms the state-of-the-art stereo matching algorithm on the
SHEF dataset.
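The matching principle lends itself to a compact illustration: accumulate events into an edge map, detect edges in the frame, and search along the (rectified) horizontal epipolar line for the disparity that best correlates the two. The sketch below is a minimal rendition of that idea, not the authors' implementation; the event tuple layout, the Canny detector, and the window-correlation score are all assumptions.

```python
# Minimal sketch of edge-correlation disparity matching (not the authors' code).
# Assumes a rectified pair: `frame` is a uint8 grayscale image, `events` is an
# iterable of (x, y, timestamp, polarity) tuples from the event camera.
import numpy as np
import cv2

def event_edge_map(events, shape):
    """Accumulate event counts per pixel; edges appear where brightness changed."""
    edge = np.zeros(shape, dtype=np.float32)
    for x, y, _t, _p in events:
        edge[y, x] += 1.0
    return edge / max(edge.max(), 1e-6)

def disparity_from_edges(frame, events, max_disp=64, win=5):
    """Correlate frame edges with event edges along horizontal epipolar lines."""
    fe = cv2.Canny(frame, 50, 150).astype(np.float32) / 255.0
    ee = event_edge_map(events, frame.shape)
    h, w = frame.shape
    disp = np.zeros((h, w), dtype=np.float32)
    r = win // 2
    for y in range(r, h - r):
        for x in range(r, w - r):
            if fe[y, x] == 0:                    # match only on frame edges
                continue
            patch = fe[y - r:y + r + 1, x - r:x + r + 1]
            best_score, best_d = -1.0, 0
            for d in range(min(max_disp, x - r) + 1):
                cand = ee[y - r:y + r + 1, x - d - r:x - d + r + 1]
                score = float((patch * cand).sum())   # simple correlation score
                if score > best_score:
                    best_score, best_d = score, d
            disp[y, x] = best_d
    return disp
```

In practice the event edge map would be motion-compensated and the raw correlation replaced by a more robust score, but the structure of the epipolar search is the same.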
Related papers
- Event-based Asynchronous HDR Imaging by Temporal Incident Light Modulation [54.64335350932855]
We propose a Pixel-Asynchronous HDR imaging system based on key insights into the challenges of HDR imaging.
The proposed asynchronous system integrates a Dynamic Vision Sensor (DVS) with a set of LCD panels.
The LCD panels modulate the irradiance incident upon the DVS by altering their transparency, thereby triggering pixel-independent event streams.
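The triggering mechanism can be illustrated with the standard contrast-threshold event model: a pixel fires whenever its log irradiance, here scaled by the LCD transmittance, moves by more than a fixed threshold. A toy single-pixel model under that assumed standard model (not the paper's hardware or code):

```python
# Toy single-pixel model of event triggering under LCD modulation (assumes the
# standard log-intensity contrast-threshold event model, not the paper's code).
import numpy as np

def events_from_modulation(scene_irradiance, transmittance, threshold=0.2):
    """Emit (step, polarity) events when log(L * T) moves past the contrast threshold."""
    log_ref = np.log(scene_irradiance * transmittance[0] + 1e-9)
    events = []
    for k in range(1, len(transmittance)):
        log_now = np.log(scene_irradiance * transmittance[k] + 1e-9)
        while abs(log_now - log_ref) >= threshold:
            polarity = 1 if log_now > log_ref else -1
            events.append((k, polarity))
            log_ref += polarity * threshold   # reset the reference after each event
    return events

# A static pixel still fires events as the LCD ramps from opaque to clear.
print(events_from_modulation(1.0, np.linspace(0.1, 1.0, 50)))
```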
arXiv Detail & Related papers (2024-03-14T13:45:09Z)
- SDGE: Stereo Guided Depth Estimation for 360$^\circ$ Camera Sets [65.64958606221069]
Multi-camera systems are often used in autonomous driving to achieve 360$^\circ$ perception.
These 360$^\circ$ camera sets often have limited or low-quality overlap regions, making multi-view stereo methods infeasible for the entire image.
We propose the Stereo Guided Depth Estimation (SGDE) method, which enhances depth estimation of the full image by explicitly utilizing multi-view stereo results on the overlap.
arXiv Detail & Related papers (2024-02-19T02:41:37Z)
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z)
- Stereo Matching in Time: 100+ FPS Video Stereo Matching for Extended Reality [65.70936336240554]
Real-time Stereo Matching is a cornerstone algorithm for many Extended Reality (XR) applications, such as indoor 3D understanding, video pass-through, and mixed-reality games.
One of the major difficulties is the lack of high-quality indoor video stereo training datasets captured by head-mounted VR/AR glasses.
We introduce a novel video stereo synthetic dataset that comprises renderings of various indoor scenes and realistic camera motion captured by a 6-DoF moving VR/AR head-mounted display (HMD).
This facilitates the evaluation of existing approaches and promotes further research on indoor augmented reality scenarios.
arXiv Detail & Related papers (2023-09-08T07:53:58Z)
- An Asynchronous Linear Filter Architecture for Hybrid Event-Frame Cameras [9.69495347826584]
We present an asynchronous linear filter architecture, fusing event and frame camera data, for HDR video reconstruction and spatial convolution.
The proposed AKF pipeline outperforms other state-of-the-art methods in both absolute intensity error (69.4% reduction) and image similarity indexes (35.5% average improvement).
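Conceptually, such a filter integrates events between frames and pulls the drifting estimate back toward each absolute frame measurement as it arrives. A schematic per-pixel sketch, with an assumed constant blend gain standing in for the published asynchronous Kalman gain:

```python
# Schematic per-pixel event/frame fusion (illustrative; not the published AKF).
import math

class AsyncPixelFilter:
    def __init__(self, contrast=0.2, gain=0.5):
        self.x = 0.0        # current log-intensity estimate
        self.c = contrast   # event contrast threshold (assumed known)
        self.g = gain       # assumed constant correction gain at frame times

    def on_event(self, polarity):
        # An event says log intensity moved by about one contrast threshold.
        self.x += polarity * self.c

    def on_frame(self, intensity):
        # Frames are absolute but slow; blend the estimate toward the measurement.
        z = math.log(max(intensity, 1e-6))
        self.x += self.g * (z - self.x)

f = AsyncPixelFilter()
f.on_frame(0.5)              # absolute anchor from the frame camera
for p in (+1, +1, -1):       # high-rate relative updates from the event camera
    f.on_event(p)
print(math.exp(f.x))         # reconstructed intensity estimate
```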
arXiv Detail & Related papers (2023-09-03T12:37:59Z)
- Video Frame Interpolation with Stereo Event and Intensity Camera [40.07341828127157]
We propose a novel Stereo Event-based VFI network (SE-VFI-Net) to generate high-quality intermediate frames.
The fused features are exploited to accomplish accurate optical flow and disparity estimation.
The proposed SE-VFI-Net outperforms state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2023-07-17T04:02:00Z)
- Self-Supervised Intensity-Event Stereo Matching [24.851819610561517]
Event cameras are novel bio-inspired vision sensors that report pixel-level intensity changes with microsecond accuracy.
Event cameras cannot be directly applied to computational imaging tasks due to the inability to obtain high-quality intensity and events simultaneously.
This paper aims to connect a standalone event camera and a modern intensity camera so that applications can take advantage of both sensors.
arXiv Detail & Related papers (2022-11-01T14:52:25Z)
- Neural Disparity Refinement for Arbitrary Resolution Stereo [67.55946402652778]
We introduce a novel architecture for neural disparity refinement aimed at facilitating deployment of 3D computer vision on cheap and widespread consumer devices.
Our approach relies on a continuous formulation that enables estimating a refined disparity map at any arbitrary output resolution.
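The continuous formulation amounts to treating disparity as a function d(u, v) that can be queried at arbitrary coordinates, decoupling output resolution from the network's working resolution. In the sketch below, plain bilinear interpolation stands in for the paper's learned continuous decoder:

```python
# Sketch of the continuous-query idea (illustrative; bilinear interpolation stands
# in for the paper's learned decoder): disparity as a function d(u, v).
import numpy as np

def sample_disparity(grid, u, v):
    """Query a coarse disparity grid at continuous, normalized coords u, v in [0, 1]."""
    h, w = grid.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    ax, ay = x - x0, y - y0
    top = (1 - ax) * grid[y0, x0] + ax * grid[y0, x1]
    bot = (1 - ax) * grid[y1, x0] + ax * grid[y1, x1]
    return (1 - ay) * top + ay * bot

coarse = np.random.rand(48, 64).astype(np.float32)   # stand-in for a network's coarse output
H, W = 480, 640                                      # any target resolution
refined = np.array([[sample_disparity(coarse, x / (W - 1), y / (H - 1))
                     for x in range(W)] for y in range(H)])
refined *= W / 64                                    # disparity values scale with image width
print(refined.shape)                                 # (480, 640)
```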
arXiv Detail & Related papers (2021-10-28T18:00:00Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors that measure per-pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- High-Resolution Depth Maps Based on TOF-Stereo Fusion [27.10059147107254]
We propose a novel TOF-stereo fusion method based on an efficient seed-growing algorithm.
We show that the proposed algorithm outperforms 2D image-based stereo algorithms.
The algorithm can potentially run in real time on a single CPU.
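In a seed-growing scheme, sparse TOF measurements projected into the rectified stereo pair act as seeds, and disparity propagates to neighboring pixels through a small search around each parent's value, cheapest-cost-first. The following is an illustrative simplification (SAD costs, 4-connected growth), not the paper's algorithm:

```python
# Illustrative seed-growing propagation (an assumed simplification, not the paper's
# algorithm): TOF seeds are grown to neighbors via a small local disparity search.
import heapq
import numpy as np

def grow_disparity(left, right, seeds, radius=2, win=3):
    """left/right: rectified grayscale arrays; seeds: list of (y, x, disparity)."""
    h, w = left.shape
    disp = np.full((h, w), -1.0)
    r = win // 2

    def cost(y, x, d):
        xd = int(round(x - d))                 # corresponding column in the right image
        if not (r <= x < w - r and r <= xd < w - r and r <= y < h - r):
            return np.inf
        a = left[y - r:y + r + 1, x - r:x + r + 1]
        b = right[y - r:y + r + 1, xd - r:xd + r + 1]
        return float(np.abs(a - b).sum())      # SAD matching cost

    heap = []
    for y, x, d in seeds:
        disp[y, x] = d
        heapq.heappush(heap, (0.0, y, x, float(d)))
    while heap:                                # grow cheapest-cost pixels first
        _, y, x, d = heapq.heappop(heap)
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and disp[ny, nx] < 0:
                c, best = min((cost(ny, nx, d + dd), d + dd)
                              for dd in range(-radius, radius + 1))
                if np.isfinite(c):
                    disp[ny, nx] = best
                    heapq.heappush(heap, (c, ny, nx, best))
    return disp
```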
arXiv Detail & Related papers (2021-07-30T15:11:42Z)
- Event-based Stereo Visual Odometry [42.77238738150496]
We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the temporal consistency of stereo event-based data while using a simple and efficient representation.
arXiv Detail & Related papers (2020-07-30T15:53:28Z)