Tracking Particles Ejected From Active Asteroid Bennu With Event-Based Vision
- URL: http://arxiv.org/abs/2309.06819v1
- Date: Wed, 13 Sep 2023 09:07:42 GMT
- Title: Tracking Particles Ejected From Active Asteroid Bennu With Event-Based Vision
- Authors: Loïc J. Azzalini and Dario Izzo
- Abstract summary: The OSIRIS-REx spacecraft relied on the analysis of images captured by onboard navigation cameras to detect particle ejection events.
This work proposes an event-based solution dedicated to the detection and tracking of centimetre-sized particles.
- Score: 6.464577943887317
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Early detection and tracking of ejecta in the vicinity of small solar system
bodies is crucial to guarantee spacecraft safety and support scientific
observation. During the visit of active asteroid Bennu, the OSIRIS-REx
spacecraft relied on the analysis of images captured by onboard navigation
cameras to detect particle ejection events, which ultimately became one of the
mission's scientific highlights. To increase the scientific return of similar
time-constrained missions, this work proposes an event-based solution that is
dedicated to the detection and tracking of centimetre-sized particles. Unlike a
standard frame-based camera, the pixels of an event-based camera independently
trigger events indicating whether the scene brightness has increased or
decreased at that time and location in the sensor plane. As a result of the
sparse and asynchronous spatiotemporal output, event cameras combine very high
dynamic range and temporal resolution with low power consumption, which could
complement existing onboard imaging techniques. This paper motivates the use of
a scientific event camera by reconstructing the particle ejection episodes
reported by the OSIRIS-REx mission in a photorealistic scene generator and, in
turn, simulating event-based observations. The resulting streams of
spatiotemporal data support future work on event-based multi-object tracking.
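To make the sensing principle above concrete, the following Python sketch converts two consecutive rendered frames into a stream of polarity events by thresholding per-pixel log-intensity changes. It is a minimal illustration, not the paper's simulator: the function name, the 0.15 contrast threshold, and the linear interpolation of timestamps are all assumptions.

```python
import numpy as np

def frames_to_events(prev_frame, next_frame, t_prev, t_next, threshold=0.15):
    """Idealised event-camera model (illustrative only): emit an event
    wherever the per-pixel log-intensity change between two frames
    crosses the contrast threshold. Returns rows of (t, x, y, polarity)."""
    eps = 1e-6  # avoid log(0) on dark pixels
    delta = np.log(next_frame + eps) - np.log(prev_frame + eps)

    # Threshold crossings per pixel; large changes emit several events.
    n_cross = np.floor(np.abs(delta) / threshold).astype(int)
    events = []
    for y, x in zip(*np.nonzero(n_cross)):
        pol = 1.0 if delta[y, x] > 0 else -1.0
        for k in range(1, n_cross[y, x] + 1):
            # Linearly interpolate a timestamp per crossing (assumption).
            frac = k * threshold / abs(delta[y, x])
            events.append((t_prev + frac * (t_next - t_prev),
                           float(x), float(y), pol))
    events.sort(key=lambda e: e[0])
    return np.array(events, dtype=np.float64)
```

Feeding successive frames from a photorealistic scene generator through such a function yields the kind of sparse, asynchronous spatiotemporal stream the paper studies.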
Related papers
- Research, Applications and Prospects of Event-Based Pedestrian Detection: A Survey [10.494414329120909]
Event-based cameras, inspired by the biological retina, have evolved into cutting-edge sensors distinguished by their minimal power requirements, negligible latency, superior temporal resolution, and expansive dynamic range.
Event-based cameras address the limitations of conventional frame-based sensors by eschewing extraneous data transmission and obviating motion blur in high-speed imaging scenarios.
This paper offers an exhaustive review of research and applications particularly in the autonomous driving context.
arXiv Detail & Related papers (2024-07-05T06:17:00Z)
- Event-based Structure-from-Orbit [23.97673114572094]
Certain applications in robotics and vision-based navigation require 3D perception of an object undergoing circular or spinning motion in front of a static camera.
We propose event-based structure-from-orbit (eSfO), where the aim is to reconstruct the 3D structure of a fast-spinning object observed from a static event camera.
arXiv Detail & Related papers (2024-05-10T03:02:03Z)
- SpikeMOT: Event-based Multi-Object Tracking with Sparse Motion Features [52.213656737672935]
SpikeMOT is an event-based multi-object tracker.
SpikeMOT uses spiking neural networks to extract sparse spatiotemporal features from event streams associated with objects.
arXiv Detail & Related papers (2023-09-29T05:13:43Z)
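The spiking-feature idea in the SpikeMOT entry above can be illustrated at its smallest scale with a single leaky integrate-and-fire (LIF) neuron driven by an event train. This is a generic sketch under stated assumptions; the time constant, threshold, step size, and hard reset are illustrative choices, not SpikeMOT's actual design.

```python
import math

def lif_spike_times(event_times, tau=0.02, v_thresh=1.0, v_step=0.4):
    """Leaky integrate-and-fire neuron (generic sketch): the membrane
    potential decays exponentially with time constant tau, each input
    event adds v_step, and a spike fires (with hard reset to zero)
    whenever the potential reaches v_thresh. Returns the spike times."""
    v, t_last, spikes = 0.0, None, []
    for t in sorted(event_times):
        if t_last is not None:
            v *= math.exp(-(t - t_last) / tau)  # leak since last input
        v += v_step
        if v >= v_thresh:
            spikes.append(t)
            v = 0.0  # hard reset after firing
        t_last = t
    return spikes
```

A dense burst of events from a moving object drives the potential past threshold, while isolated noise events decay away, which is the intuition behind extracting sparse motion features with spiking units.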
- On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing [69.34740063574921]
This paper presents a methodology for generating event-based vision datasets from optimal landing trajectories.
We construct sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility (PANGU).
We demonstrate that the pipeline can generate realistic event-based representations of surface features by constructing a dataset of 500 trajectories.
arXiv Detail & Related papers (2023-08-01T09:14:20Z)
- Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
This survey reviews event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
It categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z)
- A Temporal Densely Connected Recurrent Network for Event-based Human Pose Estimation [24.367222637492787]
Event cameras are emerging bio-inspired vision sensors that report per-pixel brightness changes asynchronously.
This paper proposes a novel densely connected recurrent architecture to address the problem of incomplete information.
With this recurrent architecture, we can explicitly model not only sequential but also non-sequential geometric consistency across time steps.
arXiv Detail & Related papers (2022-09-15T04:08:18Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z) - Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We develop a method to identify independently moving objects acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z) - End-to-end Learning of Object Motion Estimation from Retinal Events for
Event-based Object Tracking [35.95703377642108]
We propose a novel deep neural network to learn and regress a parametric object-level motion/transform model for event-based object tracking.
To achieve this goal, we propose a synchronous Time-Surface with Linear Time Decay (TSLTD) representation; a generic sketch of a linearly decaying time surface appears after this list.
We feed the sequence of TSLTD frames to a novel Retinal Motion Regression Network (RMRNet) to perform an end-to-end 5-DoF object motion regression.
arXiv Detail & Related papers (2020-02-14T08:19:50Z)
- Asynchronous Tracking-by-Detection on Adaptive Time Surfaces for Event-based Object Tracking [87.0297771292994]
We propose an Event-based Tracking-by-Detection (ETD) method for generic bounding box-based object tracking.
To achieve this goal, we present an Adaptive Time-Surface with Linear Time Decay (ATSLTD) event-to-frame conversion algorithm.
We compare the proposed ETD method with seven popular object tracking methods based on conventional cameras or event cameras, and with two variants of ETD.
arXiv Detail & Related papers (2020-02-13T15:58:31Z)
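As referenced in the TSLTD entry above, time surfaces with linear time decay turn an asynchronous event stream into frame-like tensors that conventional networks can consume. The sketch below captures the general idea only; the two-channel polarity layout and the per-pixel maximum are assumptions, not the exact formulation of the TSLTD or ATSLTD papers.

```python
import numpy as np

def linear_decay_time_surface(events, t_now, window, shape):
    """Render events into a time surface with linear time decay
    (generic sketch): each pixel holds a value decaying linearly from
    1 (just fired) to 0 (fired `window` seconds ago), split by polarity.

    events: array of (t, x, y, polarity) rows, polarity in {-1, +1}.
    Returns a (H, W, 2) array: channel 0 = negative, channel 1 = positive."""
    surface = np.zeros((*shape, 2), dtype=np.float64)
    recent = events[events[:, 0] >= t_now - window]
    for t, x, y, pol in recent:
        decay = 1.0 - (t_now - t) / window  # linear decay in [0, 1]
        ch = 1 if pol > 0 else 0
        # Keep the strongest (most recent) contribution per pixel.
        surface[int(y), int(x), ch] = max(surface[int(y), int(x), ch], decay)
    return surface
```

Stacking such surfaces at a fixed rate converts the event stream into a sequence of frames, which is how tracking-by-detection methods like ETD can reuse frame-based detectors.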