EvMAPPER: High Altitude Orthomapping with Event Cameras
- URL: http://arxiv.org/abs/2409.18120v1
- Date: Thu, 26 Sep 2024 17:57:15 GMT
- Title: EvMAPPER: High Altitude Orthomapping with Event Cameras
- Authors: Fernando Cladera, Kenneth Chaney, M. Ani Hsieh, Camillo J. Taylor, Vijay Kumar
- Abstract summary: This work introduces the first orthomosaic approach using event cameras.
In contrast to existing methods relying only on CMOS cameras, our approach enables map generation even in challenging light conditions.
- Score: 58.86453514045072
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traditionally, unmanned aerial vehicles (UAVs) rely on CMOS-based cameras to collect images of the world below. One of the most successful applications of UAVs is to generate orthomosaics or orthomaps, in which a series of images is stitched together into a larger map. However, the use of CMOS-based cameras with global or rolling shutters means that orthomaps are vulnerable to challenging light conditions, motion blur, and high-speed motion of independently moving objects under the camera. Event cameras are less sensitive to these issues, as their pixels trigger asynchronously on brightness changes. This work introduces the first orthomosaic approach using event cameras. In contrast to existing methods relying only on CMOS cameras, our approach enables map generation even in challenging light conditions, including direct sunlight and after sunset.
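The asynchronous triggering the abstract describes follows the standard event-generation model: a pixel fires an event whenever its log-intensity changes by a contrast threshold since the pixel's last event. A minimal sketch of that model (illustrative names and a simplified one-event-per-crossing rule, not the paper's code):

```python
import numpy as np

def generate_events(frames, timestamps, contrast_threshold=0.2):
    """Idealized event-camera model: a pixel emits an event whenever its
    log-intensity has drifted past the contrast threshold since the last
    event at that pixel. Simplified to one event per threshold crossing."""
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)  # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_now = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_now - log_ref
        # Pixels whose log-intensity moved past the threshold fire an event.
        ys, xs = np.nonzero(np.abs(diff) >= contrast_threshold)
        for y, x in zip(ys, xs):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
            log_ref[y, x] = log_now[y, x]  # reset the reference where an event fired
    return events  # list of (t, x, y, polarity), already time-ordered
```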
Related papers
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
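The abstract names the loss without defining it, so the following is only a guess at its shape: a plain total-variation penalty whose scale is normalized by the event contrast threshold. Every name below is hypothetical, not the paper's formulation:

```python
import torch

def tv_loss_threshold_normalized(img, contrast_threshold=0.2):
    """Hypothetical sketch of a total-variation regularizer normalized by
    the event-camera contrast threshold, loosely following the loss *name*
    in the Deblur e-NeRF abstract; the paper's exact form may differ.
    img: (H, W) predicted log-radiance patch."""
    dx = img[:, 1:] - img[:, :-1]   # horizontal finite differences
    dy = img[1:, :] - img[:-1, :]   # vertical finite differences
    tv = dx.abs().mean() + dy.abs().mean()
    return tv / contrast_threshold  # express the penalty in "threshold units"
```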
arXiv Detail & Related papers (2024-09-26T15:57:20Z)
- Microsaccade-inspired Event Camera for Robotics [42.27082276343167]
We design an event-based perception system capable of simultaneously maintaining low reaction time and stable texture.
The geometrical optics of the rotating wedge prism allows for algorithmic compensation of the additional rotational motion.
Various real-world experiments demonstrate the potential of the system to facilitate robotics perception both for low-level and high-level vision tasks.
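One way to read the "algorithmic compensation of the additional rotational motion": a spinning wedge prism shifts the image along a known circle, so that shift can be subtracted from each event's coordinates. A sketch under that circular-shift assumption (the names and the model are illustrative, not the paper's optics):

```python
import numpy as np

def compensate_wedge_motion(events, omega, radius_px, phase=0.0):
    """Remove the circular image shift induced by a wedge prism spinning
    at angular rate omega (rad/s), assuming it deflects the line of sight
    along a circle of radius_px pixels. events: (N, 4) rows of (t, x, y, p)."""
    t = events[:, 0]
    theta = omega * t + phase              # prism angle at each event time
    out = events.copy()
    out[:, 1] -= radius_px * np.cos(theta)  # undo the known circular shift in x
    out[:, 2] -= radius_px * np.sin(theta)  # ... and in y
    return out
```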
arXiv Detail & Related papers (2024-05-28T02:49:46Z)
- A Data-Driven Approach for Mitigating Dark Current Noise and Bad Pixels in Complementary Metal Oxide Semiconductor Cameras for Space-based Telescopes [2.4489471766462625]
We introduce a data-driven framework for mitigating dark current noise and bad pixels for CMOS cameras.
Our approach involves two key steps: pixel clustering and function fitting.
Results show a considerable improvement in the detection efficiency of space-based telescopes.
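A toy rendering of the two-step recipe above (pixel clustering, then function fitting); every concrete choice here (k-means, a linear dark-current model, the outlier rule) is an assumption for illustration rather than taken from the paper:

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_dark_current(dark_frames, exposures, n_clusters=8, bad_sigma=5.0):
    """dark_frames: (N, H, W) dark exposures; exposures: (N,) seconds.
    Step 1: cluster pixels by their dark-current response curves.
    Step 2: fit a per-cluster model and flag outlier pixels as bad."""
    n, h, w = dark_frames.shape
    responses = dark_frames.reshape(n, -1).T          # one response curve per pixel
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(responses)
    models, bad = {}, np.zeros(h * w, dtype=bool)
    for c in range(n_clusters):
        members = responses[labels == c]
        # Fit dark signal ~ a * exposure + b to the cluster mean curve.
        a, b = np.polyfit(exposures, members.mean(axis=0), 1)
        models[c] = (a, b)
        # Pixels far from their cluster's model are flagged as bad pixels.
        resid = members - (a * exposures + b)
        rms = np.sqrt((resid ** 2).mean(axis=1))
        bad[np.flatnonzero(labels == c)[rms > bad_sigma * rms.mean()]] = True
    return models, labels.reshape(h, w), bad.reshape(h, w)
```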
arXiv Detail & Related papers (2024-03-15T11:15:06Z)
- Panoramas from Photons [22.437940699523082]
We present a method capable of estimating extreme scene motion under challenging conditions, such as low light or high dynamic range.
Our method relies on grouping and aggregating frames after-the-fact, in a stratified manner.
We demonstrate the creation of high-quality panoramas under fast motion and extremely low light, and super-resolution results using a custom single-photon camera prototype.
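The "grouping and aggregating frames after-the-fact, in a stratified manner" step above can be pictured as splitting the binary photon frames into short temporal strata and averaging within each, so every aggregate sees little motion. A minimal sketch under that reading (the paper's actual pipeline also aligns and merges the aggregates):

```python
import numpy as np

def stratified_aggregate(binary_frames, n_groups=16):
    """binary_frames: (N, H, W) array of 0/1 photon detections.
    Split the burst into temporal strata and average within each stratum,
    trading noise for motion blur inside each short group."""
    groups = np.array_split(binary_frames, n_groups, axis=0)
    # Each aggregate is a low-noise, low-blur intermediate image; a real
    # pipeline would register these before merging them into a panorama.
    return np.stack([g.mean(axis=0) for g in groups])
```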
arXiv Detail & Related papers (2023-09-07T16:07:31Z)
- SmartMocap: Joint Estimation of Human and Camera Motion using Uncalibrated RGB Cameras [49.110201064166915]
Markerless human motion capture (mocap) from multiple RGB cameras is a widely studied problem.
Existing methods either need calibrated cameras or calibrate them relative to a static camera, which acts as the reference frame for the mocap system.
We propose a mocap method which uses multiple static and moving extrinsically uncalibrated RGB cameras.
arXiv Detail & Related papers (2022-09-28T08:21:04Z)
- Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and highly accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z)
- Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
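A common way to pose event-based motion estimation, which this paper solves to global optimality, is contrast maximization: warp events by a candidate motion and score how sharp the resulting event image is. A sketch using a simple 2D translational flow as a stand-in for the paper's motion model:

```python
import numpy as np

def contrast_of_warp(events, velocity, shape):
    """events: (N, 4) time-sorted rows of (t, x, y, p); velocity: (vx, vy)
    in px/s; shape: (H, W). Returns the variance of the warped event image,
    which the correct motion maximizes by sharpening edges."""
    t, x, y = events[:, 0], events[:, 1], events[:, 2]
    t0 = t[0]
    # Warp every event back to the reference time along the candidate flow.
    xw = np.round(x - velocity[0] * (t - t0)).astype(int)
    yw = np.round(y - velocity[1] * (t - t0)).astype(int)
    ok = (0 <= xw) & (xw < shape[1]) & (0 <= yw) & (yw < shape[0])
    img = np.zeros(shape)
    np.add.at(img, (yw[ok], xw[ok]), 1.0)  # accumulate warped events
    return img.var()
```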
arXiv Detail & Related papers (2022-03-08T08:24:22Z)
- Asynchronous Multi-View SLAM [78.49842639404413]
Existing multi-camera SLAM systems assume synchronized shutters for all cameras, which is often not the case in practice.
Our framework integrates a continuous-time motion model to relate information across asynchronous multi-frames during tracking, local mapping, and loop closing.
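A minimal continuous-time pose model in the spirit of the abstract (the paper's actual parameterization may differ): interpolate keyframe rotations and translations so that any asynchronous camera timestamp can be assigned a pose:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def pose_at(query_times, key_times, key_rots, key_trans):
    """key_rots: Rotation object holding one rotation per keyframe;
    key_trans: (K, 3) keyframe translations. Returns interpolated
    rotations and translations at the (possibly asynchronous) query times."""
    slerp = Slerp(key_times, key_rots)   # spherical interpolation for rotation
    rots = slerp(query_times)
    # Piecewise-linear interpolation, per axis, for translation.
    trans = np.stack([np.interp(query_times, key_times, key_trans[:, i])
                      for i in range(3)], axis=-1)
    return rots, trans

# Example: rots, trans = pose_at([0.5], [0.0, 1.0], Rotation.random(2), np.zeros((2, 3)))
```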
arXiv Detail & Related papers (2021-01-17T00:50:01Z)
- Minimal Solutions for Panoramic Stitching Given Gravity Prior [53.047330182598124]
We propose new minimal solutions to panoramic image stitching of images taken by cameras with coinciding optical centers.
We consider four practical camera configurations, assuming unknown fixed or varying focal length with or without radial distortion.
The solvers are tested both on synthetic scenes and on more than 500k real image pairs from the Sun360 dataset and from scenes captured by us using two smartphones equipped with IMUs.
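For cameras with coinciding optical centers, image pairs are related by the classical rotation-induced homography H = K2 R K1^{-1}; the paper's contribution is new minimal solvers for estimating it under a gravity prior. The underlying model, for reference (this states the textbook relation, not the new solvers):

```python
import numpy as np

def rotation_homography(K1, K2, R):
    """Homography mapping pixels of camera 1 to camera 2 when both cameras
    share an optical center and differ by rotation R, with intrinsics K1, K2."""
    return K2 @ R @ np.linalg.inv(K1)

def warp_point(H, x, y):
    """Apply a homography to a pixel (x, y) and dehomogenize."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```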
arXiv Detail & Related papers (2020-12-01T13:17:36Z)