Event-based visual place recognition with ensembles of temporal windows
- URL: http://arxiv.org/abs/2006.02826v2
- Date: Thu, 17 Sep 2020 23:23:38 GMT
- Title: Event-based visual place recognition with ensembles of temporal windows
- Authors: Tobias Fischer and Michael Milford
- Abstract summary: Event cameras are bio-inspired sensors capable of providing a continuous stream of events with low latency and high dynamic range.
We develop a novel ensemble-based scheme for combining temporal windows of varying lengths that are processed in parallel.
We show that our proposed ensemble scheme significantly outperforms all single-window baselines and conventional model-based ensembles.
- Score: 29.6328152991222
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are bio-inspired sensors capable of providing a continuous
stream of events with low latency and high dynamic range. As a single event
only carries limited information about the brightness change at a particular
pixel, events are commonly accumulated into spatio-temporal windows for further
processing. However, the optimal window length varies depending on the scene,
camera motion, the task being performed, and other factors. In this research,
we develop a novel ensemble-based scheme for combining temporal windows of
varying lengths that are processed in parallel. For applications where the
increased computational requirements of this approach are not practical, we
also introduce a new "approximate" ensemble scheme that achieves significant
computational efficiencies without unduly compromising the original performance
gains provided by the ensemble approach. We demonstrate our ensemble scheme on
the visual place recognition (VPR) task, introducing a new Brisbane-Event-VPR
dataset with annotated recordings captured using a DAVIS346 color event camera.
We show that our proposed ensemble scheme significantly outperforms all the
single-window baselines and conventional model-based ensembles, irrespective of
the image reconstruction and feature extraction methods used in the VPR
pipeline, and evaluate which ensemble combination technique performs best.
These results demonstrate the significant benefits of ensemble schemes for
event camera processing in the VPR domain and may have relevance to other
related processes, including feature tracking, visual-inertial odometry, and
steering prediction in driving.
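The core idea of the abstract, accumulating the event stream into temporal windows of several different lengths, running the same reconstruction-and-feature pipeline on each, and fusing the resulting place-matching scores, can be sketched in a few lines. The sketch below is a minimal illustration under stated assumptions: the helper names (`accumulate_windows`, `featurize`) are hypothetical, and the mean of normalized distance matrices is just one plausible combination rule, not necessarily the one the paper finds to perform best among the techniques it compares.

```python
import numpy as np

def accumulate_windows(events, centers, window_length):
    """For each center time, gather the events that fall inside a window of
    the given length; `events` has columns (t, x, y, polarity)."""
    half = window_length / 2.0
    return [events[(events[:, 0] >= c - half) & (events[:, 0] < c + half)]
            for c in centers]

def distance_matrix(query_feats, reference_feats):
    """Pairwise Euclidean distances between query and reference descriptors."""
    diff = query_feats[:, None, :] - reference_feats[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def ensemble_distance(query_events, reference_events,
                      query_centers, reference_centers,
                      window_lengths, featurize):
    """Fuse distance matrices obtained with several window lengths.
    `featurize` is a user-supplied callable mapping a list of event windows to
    an (N, D) descriptor array, e.g. image reconstruction followed by feature
    extraction (hypothetical stand-in for the VPR pipeline)."""
    combined = None
    for w in window_lengths:  # the paper processes these windows in parallel
        q = featurize(accumulate_windows(query_events, query_centers, w))
        r = featurize(accumulate_windows(reference_events, reference_centers, w))
        d = distance_matrix(q, r)
        d = (d - d.min()) / (d.max() - d.min() + 1e-12)  # normalize per window length
        combined = d if combined is None else combined + d
    combined /= len(window_lengths)
    return combined  # best match for query i: combined[i].argmin()
```

Keeping the window centers fixed while varying only the window length is an assumption made here so that every distance matrix has the same shape and can be averaged directly.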
Related papers
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z) - Rethinking Efficient and Effective Point-based Networks for Event Camera Classification and Regression: EventMamba [11.400397931501338]
Event cameras efficiently detect changes in ambient light with low latency and high dynamic range while consuming minimal power.
Most current approaches to processing event data involve converting it into frame-based representations.
Point Cloud is a popular representation for 3D processing and is better suited to match the sparse and asynchronous nature of the event camera (see the sketch after this list).
We propose EventMamba, an efficient and effective Point Cloud framework that achieves competitive results even compared to the state-of-the-art (SOTA) frame-based method.
arXiv Detail & Related papers (2024-05-09T21:47:46Z) - EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z) - Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z) - ViR: Towards Efficient Vision Retention Backbones [97.93707844681893]
We propose a new class of computer vision models, dubbed Vision Retention Networks (ViR).
ViR has dual parallel and recurrent formulations, which strike an optimal balance between fast inference and parallel training with competitive performance.
We have validated the effectiveness of ViR through extensive experiments with different dataset sizes and various image resolutions.
arXiv Detail & Related papers (2023-10-30T16:55:50Z) - Revisiting Event-based Video Frame Interpolation [49.27404719898305]
Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation.
Estimating optical flow from events is arguably more difficult than from RGB information.
We propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages.
arXiv Detail & Related papers (2023-07-24T06:51:07Z) - Event-Based Frame Interpolation with Ad-hoc Deblurring [68.97825675372354]
We propose a general method for event-based frame interpolation that performs deblurring ad-hoc on input videos.
Our network consistently outperforms state-of-the-art methods on frame interpolation, single-image deblurring, and the joint task of interpolation and deblurring.
Our code and dataset will be made publicly available.
arXiv Detail & Related papers (2023-01-12T18:19:00Z) - An Event-based Algorithm for Simultaneous 6-DOF Camera Pose Tracking and Mapping [0.0]
Event cameras asynchronously output compact visual data in response to intensity changes at individual pixel locations.
We propose an inertial version of the event-only pipeline to assess its capabilities.
We show it can produce comparable or more accurate results provided the map estimate is reliable.
arXiv Detail & Related papers (2023-01-02T12:16:18Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - A Differentiable Recurrent Surface for Asynchronous Event-Based Data [19.605628378366667]
We propose Matrix-LSTM, a grid of Long Short-Term Memory (LSTM) cells that efficiently process events and learn end-to-end task-dependent event-surfaces.
Compared to existing reconstruction approaches, our learned event-surface shows good flexibility and improves performance on optical flow estimation.
It improves the state-of-the-art of event-based object classification on the N-Cars dataset.
arXiv Detail & Related papers (2020-01-10T14:09:40Z)
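As referenced in the EventMamba entry above, a point cloud treats each event as an individual point rather than rasterizing events into frames. The following minimal sketch assumes events arrive as (t, x, y, polarity) rows; the column layout and the normalization to [0, 1] are illustrative assumptions, not the paper's actual preprocessing.

```python
import numpy as np

def events_to_point_cloud(events, width, height):
    """Map raw events (columns: t, x, y, polarity) to an (N, 4) point cloud
    with spatial coordinates and timestamps scaled to [0, 1]."""
    t, x, y, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    t_norm = (t - t.min()) / (t.max() - t.min() + 1e-12)
    # Each event remains a sparse, asynchronous sample; no frame accumulation.
    return np.stack([x / width, y / height, t_norm, p], axis=1)
```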
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.