Research on Event Accumulator Settings for Event-Based SLAM
- URL: http://arxiv.org/abs/2112.00427v1
- Date: Wed, 1 Dec 2021 11:35:17 GMT
- Title: Research on Event Accumulator Settings for Event-Based SLAM
- Authors: Kun Xiao, Guohui Wang, Yi Chen, Yongfeng Xie, Hong Li
- Abstract summary: Event cameras have the advantages of high dynamic range and no motion blur.
We conduct research on how to accumulate event frames to achieve better event-based SLAM performance.
Experimental results show that our method achieves better performance on most sequences than the state-of-the-art event-frame-based SLAM algorithm.
- Score: 6.830610030874817
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Event cameras are a new type of sensor that differs from traditional cameras. Each pixel is triggered asynchronously by events: the trigger is a change in the brightness incident on the pixel, and an event is output whenever the increase or decrease in brightness exceeds a certain threshold. Compared with traditional cameras, event cameras have the advantages of high dynamic range and no motion blur. Accumulating events into frames and applying a traditional SLAM algorithm is a direct and efficient way to do event-based SLAM. Different event accumulator settings, such as the slice method for the event stream, the processing method for no motion, whether to use polarity, the decay function, and the event contribution, can produce quite different accumulation results. We conducted research on how to accumulate event frames to achieve better event-based SLAM performance. For experimental verification, the accumulated event frames are fed to a traditional SLAM system to construct an event-based SLAM system. Our event accumulator settings were evaluated on a public dataset. The experimental results show that our method achieves better performance on most sequences than the state-of-the-art event-frame-based SLAM algorithm. In addition, the proposed approach was tested on a quadrotor UAV to show its potential in real-world scenarios. Code and results are open sourced to benefit the event camera research community.
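To make the accumulator settings named in the abstract concrete (slice method, polarity on/off, decay function, event contribution), here is a minimal Python sketch of event-to-frame accumulation. It is an illustrative sketch, not the paper's released code: the function names, the exponential decay form, the (t, x, y, p) event layout with polarity p in {-1, +1}, and all default values are assumptions made here.

```python
import numpy as np

def slice_by_count(stream, n_events):
    """Slice method: cut a time-ordered event stream into fixed-size packets.

    Fixed event count per frame is one common slicing choice; slicing by a
    fixed time window is the usual alternative.
    """
    for i in range(0, len(stream) - n_events + 1, n_events):
        yield stream[i:i + n_events]

def accumulate_events(events, height, width, use_polarity=True, decay_tau=0.03):
    """Accumulate one non-empty slice of (t, x, y, p) events into an 8-bit frame."""
    t_ref = events[-1][0]  # decay is measured back from the newest event
    frame = np.zeros((height, width), dtype=np.float32)
    for t, x, y, p in events:
        weight = np.exp(-(t_ref - t) / decay_tau)             # decay function
        frame[y, x] += (p if use_polarity else 1.0) * weight  # event contribution
    peak = np.abs(frame).max()
    if peak > 0:
        frame /= peak              # [-1, 1] with polarity, [0, 1] without
    if use_polarity:
        frame = frame * 0.5 + 0.5  # shift signed values into [0, 1]
    return (frame * 255).astype(np.uint8)  # image a frame-based front end can consume

# Hypothetical usage: 5000-event slices on a 260x346 sensor (DAVIS346-sized).
# frames = [accumulate_events(s, 260, 346) for s in slice_by_count(events, 5000)]
```

A frame built this way can be handed to an ordinary frame-based SLAM pipeline, which is the accumulate-then-track strategy the abstract describes; the abstract's "processing method for no motion" presumably covers slices where the camera is static and few events arrive, which this sketch does not handle.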
Related papers
- EvenNICER-SLAM: Event-based Neural Implicit Encoding SLAM [69.83383687049994]
We propose EvenNICER-SLAM, a novel approach to dense visual simultaneous localization and mapping.
EvenNICER-SLAM incorporates event cameras that respond to intensity changes instead of absolute brightness.
Our results suggest the potential for event cameras to improve the robustness of dense SLAM systems against fast camera motion in real-world scenarios.
arXiv Detail & Related papers (2024-10-04T13:52:01Z)
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- Deformable Neural Radiance Fields using RGB and Event Cameras [65.40527279809474]
We develop a novel method to model the deformable neural radiance fields using RGB and event cameras.
The proposed method uses the asynchronous stream of events and sparse RGB frames.
Experiments conducted on both realistically rendered graphics and real-world datasets demonstrate a significant benefit of the proposed method.
arXiv Detail & Related papers (2023-09-15T14:19:36Z)
- Temporal Up-Sampling for Asynchronous Events [0.0]
In low-brightness or slow-moving scenes, events are often sparse and accompanied by noise.
We propose an event temporal up-sampling algorithm to generate more effective and reliable events.
Experimental results show that up-sampling events can provide more effective information and improve the performance of downstream tasks.
arXiv Detail & Related papers (2022-08-18T09:12:08Z)
- A Preliminary Research on Space Situational Awareness Based on Event Cameras [8.27218838055049]
The event camera is a new type of sensor that differs from traditional cameras.
An event is triggered by a change in the brightness incident on the pixel.
Compared with traditional cameras, event cameras have the advantages of high temporal resolution, low latency, high dynamic range, low bandwidth and low power consumption.
arXiv Detail & Related papers (2022-03-24T14:36:18Z)
- E$^2$(GO)MOTION: Motion Augmented Event Stream for Egocentric Action Recognition [21.199869051111367]
Event cameras capture pixel-level intensity changes in the form of "events".
N-EPIC-Kitchens is the first event-based camera extension of the large-scale EPIC-Kitchens dataset.
We show that event data provides a comparable performance to RGB and optical flow, yet without any additional flow computation at deploy time.
arXiv Detail & Related papers (2021-12-07T09:43:08Z)
- EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting on brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras output brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)