Synthetic Aperture Sensing for Occlusion Removal with Drone Swarms
- URL: http://arxiv.org/abs/2212.14692v1
- Date: Fri, 30 Dec 2022 13:19:15 GMT
- Title: Synthetic Aperture Sensing for Occlusion Removal with Drone Swarms
- Authors: Rakesh John Amala Arokia Nathan, Indrajit Kurmi and Oliver Bimber
- Abstract summary: We demonstrate how efficient autonomous drone swarms can be in detecting and tracking occluded targets in densely forested areas.
Exploration and optimization of local viewing conditions, such as occlusion density and target view obliqueness, provide much faster and much more reliable results than previous, blind sampling strategies.
- Score: 4.640835690336653
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We demonstrate how efficient autonomous drone swarms can be in detecting and
tracking occluded targets in densely forested areas, such as lost people during
search and rescue missions. Exploration and optimization of local viewing
conditions, such as occlusion density and target view obliqueness, provide much
faster and much more reliable results than previous, blind sampling strategies
that are based on pre-defined waypoints. An adapted real-time particle swarm
optimization and a new objective function are presented that are able to deal
with dynamic and highly random through-foliage conditions. Synthetic aperture
sensing is our fundamental sampling principle, and drone swarms are employed to
approximate the optical signals of extremely wide and adaptable airborne
lenses.
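The adapted real-time particle swarm optimization mentioned in the abstract can be illustrated with a minimal, generic PSO sketch. The objective function below is a hypothetical stand-in (a toy 2D quadratic); the paper's actual objective combines occlusion density and target view obliqueness and is not reproduced here.

```python
import random

def objective(pose):
    """Hypothetical visibility score: lower is better (toy landscape, optimum at (3, -1))."""
    x, y = pose
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def pso(n_particles=20, n_iters=100, bounds=(-10.0, 10.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard particle swarm optimization over a 2D search space."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi), rng.uniform(lo, hi)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the paper's setting, each "particle" would correspond to a candidate swarm sampling pose evaluated in real time against through-foliage viewing conditions, rather than a point in a static toy landscape.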
Related papers
- Vision-Based Detection of Uncooperative Targets and Components on Small Satellites [6.999319023465766]
Space debris and inactive satellites pose a threat to the safety and integrity of operational spacecraft.
Recent advancements in computer vision models can be used to improve upon existing methods for tracking such uncooperative targets.
This paper introduces an autonomous detection model designed to identify and monitor these objects using learning and computer vision.
arXiv Detail & Related papers (2024-08-22T02:48:13Z)
- An Autonomous Drone Swarm for Detecting and Tracking Anomalies among Dense Vegetation [3.6394530599964026]
We show that detecting and tracking heavily occluded targets with swarms of drones is practically feasible.
In our real-life field experiments with a swarm of six drones, we achieved an average positional accuracy of 0.39 m with an average precision of 93.2%.
We show that sensor noise can effectively be included in the synthetic aperture image integration process.
arXiv Detail & Related papers (2024-07-15T14:31:21Z)
- OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising [49.86409475232849]
Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
arXiv Detail & Related papers (2024-04-02T18:30:29Z)
- Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system well exploits the signals of light-weight ToF sensors and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z)
- ScatterNeRF: Seeing Through Fog with Physically-Based Inverse Neural Rendering [83.75284107397003]
We introduce ScatterNeRF, a neural rendering method which renders scenes and decomposes the fog-free background.
We propose a disentangled representation for the scattering volume and the scene objects, and learn the scene reconstruction with physics-inspired losses.
We validate our method by capturing multi-view In-the-Wild data and controlled captures in a large-scale fog chamber.
arXiv Detail & Related papers (2023-05-03T13:24:06Z)
- Inverse Airborne Optical Sectioning [4.640835690336653]
Inverse Airborne Optical Sectioning (IAOS) is an optical analogy to Inverse Synthetic Aperture Radar (ISAR).
Moving targets, such as walking people, that are heavily occluded by vegetation can be made visible and tracked with a stationary optical sensor.
arXiv Detail & Related papers (2022-07-27T07:57:24Z)
- Vision in adverse weather: Augmentation using CycleGANs with various object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach of using synthesised adverse condition datasets in autonomous racing (generated using CycleGAN) to improve the performance of four out of five state-of-the-art detectors.
arXiv Detail & Related papers (2022-01-10T10:02:40Z)
- Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
- An Autonomous Drone for Search and Rescue in Forests using Airborne Optical Sectioning [0.0]
We present a first prototype that finds people fully autonomously in densely occluded forests.
In the course of 17 field experiments conducted over various forest types, our drone found 38 out of 42 hidden persons.
Deep-learning-based person classification is unaffected by sparse and error-prone sampling within one-dimensional synthetic apertures.
arXiv Detail & Related papers (2021-05-10T13:05:22Z)
- Fast Automatic Visibility Optimization for Thermal Synthetic Aperture Visualization [7.133136338850781]
We prove that the visibility of targets in thermal integral images is proportional to the variance of the targets' image.
Our findings have the potential to enable fully autonomous search and rescue operations with camera drones.
arXiv Detail & Related papers (2020-05-08T14:28:03Z)
- Temporal Sparse Adversarial Attack on Sequence-based Gait Recognition [56.844587127848854]
We demonstrate that the state-of-the-art gait recognition model is vulnerable to such attacks.
We employ a generative adversarial network based architecture to semantically generate adversarial high-quality gait silhouettes or video frames.
The experimental results show that if only one-fortieth of the frames are attacked, the accuracy of the target model drops dramatically.
arXiv Detail & Related papers (2020-02-22T10:08:42Z)
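The synthetic aperture sensing principle that underlies several of the papers above can be sketched as simple pixel-wise integration: averaging many registered views suppresses random occluders (which block different pixels in each view), while the aligned target signal is reinforced. The occlusion model and values below are illustrative assumptions, not the papers' actual imaging pipeline.

```python
import random

def integrate(images):
    """Pixel-wise mean of registered images (the synthetic aperture integral)."""
    n = len(images)
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[r][c] for img in images) / n for c in range(w)]
            for r in range(h)]

def simulate_view(target, occlusion_prob, rng):
    """One single-drone view: each pixel is independently blocked
    (set to 0.0, e.g. by foliage) with probability occlusion_prob."""
    return [[0.0 if rng.random() < occlusion_prob else v for v in row]
            for row in target]
```

With a uniform target of intensity 1.0 and 50% random occlusion per view, each integrated pixel converges toward 0.5 as views accumulate, so the target emerges uniformly from the occlusion noise; a single view, by contrast, loses half of its pixels outright.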
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.