A Preliminary Research on Space Situational Awareness Based on Event
Cameras
- URL: http://arxiv.org/abs/2203.13093v2
- Date: Fri, 25 Mar 2022 02:50:58 GMT
- Title: A Preliminary Research on Space Situational Awareness Based on Event
Cameras
- Authors: Kun Xiao, Pengju Li, Guohui Wang, Zhi Li, Yi Chen, Yongfeng Xie,
Yuqiang Fang
- Abstract summary: The event camera is a new type of sensor that differs from traditional cameras.
An event is triggered by a change in the brightness incident on a pixel.
Compared with traditional cameras, event cameras have the advantages of high temporal resolution, low latency, high dynamic range, low bandwidth, and low power consumption.
- Score: 8.27218838055049
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: The event camera is a new type of sensor that differs fundamentally from
traditional cameras: each pixel is triggered asynchronously, firing an event whenever
the brightness incident on it changes. If the increase or decrease exceeds a certain
threshold, an event is output. Compared with traditional cameras, event cameras offer
high temporal resolution, low latency, high dynamic range, low bandwidth, and low
power consumption. We carried out a series of observation experiments in a simulated
space lighting environment. The experimental results show that the event camera can
fully exploit these advantages in space situational awareness. This article first
introduces the basic principles of the event camera, then analyzes its advantages and
disadvantages, then describes the observation experiments and analyzes their results,
and finally presents a workflow for space situational awareness based on event cameras.
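To make the triggering principle concrete, here is a minimal sketch (in Python with NumPy; the log-intensity model and the specific threshold value are common assumptions in the event camera literature, not parameters taken from this paper) that simulates per-pixel event generation from a sequence of brightness frames:

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.2):
    """Simulate per-pixel event generation from brightness frames.

    An event (t, x, y, polarity) is emitted whenever a pixel's
    log-brightness has changed by at least `threshold` since the
    last event at that pixel. Real sensors fire asynchronously
    between frames; sampling at frame times is a simplification.
    """
    eps = 1e-6                              # avoid log(0)
    ref = np.log(frames[0] + eps)           # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            events.append((t, int(x), int(y), 1 if diff[y, x] > 0 else -1))
            ref[y, x] = log_i[y, x]         # reset reference after firing
    return events
```

Positive events indicate brightening and negative events darkening; a target crossing an otherwise static star field would trip small clusters of such events against a silent background.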
Related papers
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z)
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z)
- Microsaccade-inspired Event Camera for Robotics [42.27082276343167]
We design an event-based perception system capable of simultaneously maintaining low reaction time and stable texture.
The geometrical optics of the rotating wedge prism allows for algorithmic compensation of the additional rotational motion.
Various real-world experiments demonstrate the potential of the system to facilitate robotics perception both for low-level and high-level vision tasks.
arXiv Detail & Related papers (2024-05-28T02:49:46Z)
- Deep Event Visual Odometry [40.57142632274148]
Event cameras offer the exciting possibility of tracking the camera's pose during high-speed motion.
Existing event-based monocular visual odometry approaches demonstrate limited performance on recent benchmarks.
We present Deep Event VO (DEVO), the first monocular event-only system with strong performance on a large number of real-world benchmarks.
arXiv Detail & Related papers (2023-12-15T14:00:00Z)
- Event-based Camera Tracker by $\nabla$t NeRF [11.572930535988325]
We show that we can recover the camera pose by minimizing the error between sparse events and the temporal gradient of the scene represented as a neural radiance field (NeRF); an illustrative form of this objective is sketched after this list.
We propose an event-based camera pose tracking framework called TeGRA which realizes the pose update by using sparse event observations.
arXiv Detail & Related papers (2023-04-07T16:03:21Z)
- Are High-Resolution Event Cameras Really Needed? [62.70541164894224]
In low-illumination conditions and at high speeds, low-resolution cameras can outperform high-resolution ones, while requiring a significantly lower bandwidth.
We provide both empirical and theoretical evidence for this claim, which indicates that high-resolution event cameras exhibit higher per-pixel event rates.
In most cases, high-resolution event cameras show lower task performance than lower-resolution sensors under these conditions.
arXiv Detail & Related papers (2022-03-28T12:06:20Z)
- E$^2$(GO)MOTION: Motion Augmented Event Stream for Egocentric Action Recognition [21.199869051111367]
Event cameras capture pixel-level intensity changes in the form of "events".
N-EPIC-Kitchens is the first event-based camera extension of the large-scale EPIC-Kitchens dataset.
We show that event data provides a comparable performance to RGB and optical flow, yet without any additional flow computation at deploy time.
arXiv Detail & Related papers (2021-12-07T09:43:08Z)
- Research on Event Accumulator Settings for Event-Based SLAM [6.830610030874817]
Event cameras have the advantages of high dynamic range and no motion blur.
We conduct research on how to accumulate event frames to achieve better event-based SLAM performance (a minimal accumulation sketch follows this list).
Experiment results show that our method can achieve better performance in most sequences compared with the state-of-the-art event frame based SLAM algorithm.
arXiv Detail & Related papers (2021-12-01T11:35:17Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors that measure per-pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting to brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z)
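As referenced from the $\nabla$t NeRF entry above: a minimal sketch of what such an event-to-temporal-gradient objective could look like, with notation that is assumed here for illustration rather than taken from that paper. The pose $T$ is chosen so that observed events match the temporal gradient of the rendered radiance:

$$\hat{T} = \arg\min_{T} \sum_{k} \left( e_k - \frac{\partial}{\partial t} L\big(\mathbf{r}(u_k; T),\, t_k\big) \right)^{2},$$

where $e_k$ is the signed brightness change reported by the $k$-th event, $u_k$ and $t_k$ are its pixel and timestamp, and $L(\mathbf{r}(u_k; T), t_k)$ is the radiance rendered by the NeRF along the ray cast through $u_k$ under pose $T$.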
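As referenced from the event accumulator entry above: accumulation bins events into frames before a frame-based SLAM pipeline consumes them. A minimal fixed-time-window accumulator is sketched below (assuming NumPy, time-ordered events in the (t, x, y, polarity) format produced by the generator sketched earlier; the 10 ms window is an illustrative choice, not a setting from that paper):

```python
import numpy as np

def accumulate_event_frames(events, height, width, window=0.01):
    """Sum polarity-weighted events into frames of `window` seconds.

    `events` must be time-ordered (t, x, y, polarity) tuples.
    """
    frames = []
    frame = np.zeros((height, width), dtype=np.float32)
    if not events:
        return frames
    window_end = events[0][0] + window
    for t, x, y, polarity in events:
        while t >= window_end:              # close finished windows
            frames.append(frame)
            frame = np.zeros((height, width), dtype=np.float32)
            window_end += window
        frame[y, x] += polarity             # +1 brightening, -1 darkening
    frames.append(frame)                    # flush the last window
    return frames
```

Shorter windows preserve temporal resolution but yield sparser frames; the trade-off between the two is exactly the kind of setting that paper studies.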
This list is automatically generated from the titles and abstracts of the papers on this site.