Drone Detection and Tracking in Real-Time by Fusion of Different Sensing
Modalities
- URL: http://arxiv.org/abs/2207.01927v1
- Date: Tue, 5 Jul 2022 10:00:58 GMT
- Title: Drone Detection and Tracking in Real-Time by Fusion of Different Sensing
Modalities
- Authors: Fredrik Svanström, Fernando Alonso-Fernandez, Cristofer Englund
- Abstract summary: We design and evaluate a multi-sensor drone detection system.
Our solution integrates a fish-eye camera as well to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera is shown to be a feasible solution as good as the video camera, even if the camera employed here has a lower resolution.
- Score: 66.4525391417921
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatic detection of flying drones is a key issue, since their
presence, especially if unauthorized, can create risky situations or compromise
security. Here, we design and evaluate a multi-sensor drone detection system.
In conjunction with common video cameras and microphone sensors, we explore the
use of thermal infrared cameras, pointed out as a feasible and promising
solution that is scarcely addressed in the related literature. Our solution
also integrates a fish-eye camera to monitor a wider part of the sky and steer
the other cameras towards objects of interest. The sensing solutions are
complemented with an ADS-B receiver, a GPS receiver, and a radar module,
although the latter has not been included in our final deployment due to its
limited detection range. The thermal camera is shown to be a feasible solution
as good as the video camera, even though the camera employed here has a lower
resolution. Two other novelties of our work are the creation of a new public
dataset of multi-sensor annotated data that expands the number of classes in
comparison to existing ones, and the study of detector performance as a
function of the sensor-to-target distance. Sensor fusion is also explored,
showing that the system can be made more robust in this way, mitigating false
detections of the individual sensors.
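The abstract states that fusing sensors makes the system more robust by mitigating false detections of individual sensors. A minimal voting-based sketch of this idea (our own illustration, not the authors' actual fusion method; the `fuse_detections` function and the sensor names are hypothetical):

```python
# Illustrative late-fusion sketch: each sensor reports whether it
# currently detects a drone, and a simple vote threshold suppresses
# false alarms raised by any single sensor on its own.

def fuse_detections(sensor_votes: dict[str, bool], min_votes: int = 2) -> bool:
    """Return True if at least `min_votes` sensors agree on a detection."""
    return sum(sensor_votes.values()) >= min_votes

# A lone false alarm from the microphone is suppressed...
print(fuse_detections({"video": False, "thermal": False, "audio": True}))   # False

# ...while agreement between two sensors confirms a detection.
print(fuse_detections({"video": True, "thermal": True, "audio": False}))    # True
```

In practice such fusion would operate on per-frame detections with confidence scores rather than booleans, but the voting principle is the same: requiring corroboration across modalities trades a little sensitivity for far fewer false positives.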
Related papers
- CamLoPA: A Hidden Wireless Camera Localization Framework via Signal Propagation Path Analysis [59.86280992504629]
CamLoPA is a training-free wireless camera detection and localization framework.
It operates with minimal activity space constraints using low-cost commercial-off-the-shelf (COTS) devices.
It achieves 95.37% snooping camera detection accuracy and an average localization error of 17.23 under significantly reduced activity-space requirements.
arXiv Detail & Related papers (2024-09-23T16:23:50Z) - Cross-Dataset Experimental Study of Radar-Camera Fusion in Bird's-Eye
View [12.723455775659414]
Radar and camera fusion systems have the potential to provide a highly robust and reliable perception system.
Recent advances in camera-based object detection offer new radar-camera fusion possibilities with bird's eye view feature maps.
We propose a novel and flexible fusion network and evaluate its performance on two datasets.
arXiv Detail & Related papers (2023-09-27T08:02:58Z) - Multi-Modal 3D Object Detection by Box Matching [109.43430123791684]
We propose a novel Fusion network by Box Matching (FBMNet) for multi-modal 3D detection.
With the learned assignments between 3D and 2D object proposals, the fusion for detection can be effectively performed by combining their ROI features.
arXiv Detail & Related papers (2023-05-12T18:08:51Z) - CramNet: Camera-Radar Fusion with Ray-Constrained Cross-Attention for
Robust 3D Object Detection [12.557361522985898]
We propose a camera-radar matching network CramNet to fuse the sensor readings from camera and radar in a joint 3D space.
Our method supports training with sensor modality dropout, which leads to robust 3D object detection, even when a camera or radar sensor suddenly malfunctions on a vehicle.
arXiv Detail & Related papers (2022-10-17T17:18:47Z) - Extrinsic Camera Calibration with Semantic Segmentation [60.330549990863624]
We present an extrinsic camera calibration approach that automatizes the parameter estimation by utilizing semantic segmentation information.
Our approach relies on a coarse initial measurement of the camera pose and builds on lidar sensors mounted on a vehicle.
We evaluate our method on simulated and real-world data to demonstrate low error measurements in the calibration results.
arXiv Detail & Related papers (2022-08-08T07:25:03Z) - Emergent Visual Sensors for Autonomous Vehicles [3.3227094421785344]
We review the principles of four novel image sensors: infrared cameras, range-gated cameras, polarization cameras, and event cameras.
Their comparative advantages, existing or potential applications, and corresponding data processing algorithms are presented.
arXiv Detail & Related papers (2022-05-19T08:29:30Z) - A dataset for multi-sensor drone detection [67.75999072448555]
The use of small and remotely controlled unmanned aerial vehicles (UAVs) has increased in recent years.
Most studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the dataset.
We contribute with an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files.
arXiv Detail & Related papers (2021-11-02T20:52:03Z) - Radar Voxel Fusion for 3D Object Detection [0.0]
This paper develops a low-level sensor fusion network for 3D object detection.
The radar sensor fusion proves especially beneficial in inclement conditions such as rain and night scenes.
arXiv Detail & Related papers (2021-06-26T20:34:12Z) - Real-Time Drone Detection and Tracking With Visible, Thermal and
Acoustic Sensors [66.4525391417921]
A thermal infrared camera is shown to be a feasible solution to the drone detection task.
The detector performance as a function of the sensor-to-target distance is also investigated.
A novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented.
arXiv Detail & Related papers (2020-07-14T23:06:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.