Multistatic-Radar RCS-Signature Recognition of Aerial Vehicles: A Bayesian Fusion Approach
- URL: http://arxiv.org/abs/2402.17987v3
- Date: Fri, 16 Aug 2024 01:37:41 GMT
- Title: Multistatic-Radar RCS-Signature Recognition of Aerial Vehicles: A Bayesian Fusion Approach
- Authors: Michael Potter, Murat Akcakaya, Marius Necsoiu, Gunar Schirner, Deniz Erdogmus, Tales Imbiriba
- Abstract summary: Radar Automated Target Recognition (RATR) for Unmanned Aerial Vehicles (UAVs) involves transmitting Electromagnetic Waves (EMWs) and performing target type recognition on the received radar echo.
Previous studies highlighted the advantages of multistatic radar configurations over monostatic ones in RATR.
We propose a fully Bayesian RATR framework employing Optimal Bayesian Fusion (OBF) to aggregate classification probability vectors from multiple radars.
- Score: 10.908489565519211
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Radar Automated Target Recognition (RATR) for Unmanned Aerial Vehicles (UAVs) involves transmitting Electromagnetic Waves (EMWs) and performing target type recognition on the received radar echo, a task crucial for defense and aerospace applications. Previous studies highlighted the advantages of multistatic radar configurations over monostatic ones in RATR. However, fusion methods in multistatic radar configurations often combine classification vectors from individual radars suboptimally. To address this, we propose a fully Bayesian RATR framework employing Optimal Bayesian Fusion (OBF) to aggregate classification probability vectors from multiple radars. OBF, based on expected 0-1 loss, updates a Recursive Bayesian Classification (RBC) posterior distribution over the target UAV type, conditioned on historical observations across multiple time steps. We evaluate the approach using simulated random walk trajectories for seven drones, correlating target aspect angles to Radar Cross Section (RCS) measurements taken in an anechoic chamber. Our empirical results demonstrate that OBF integrated with RBC significantly enhances classification accuracy compared with single-radar Automated Target Recognition (ATR) systems and suboptimal fusion methods.
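The abstract names the ingredients (per-radar classification probability vectors, a recursive posterior over target type, a decision under expected 0-1 loss) but not the update equations. Below is a minimal sketch of such a recursion, assuming the per-radar outputs can be treated as conditionally independent likelihoods; all names and the toy data are illustrative, not taken from the paper.

```python
import numpy as np

def rbc_fuse(prior, radar_probs, eps=1e-12):
    """One recursive Bayesian update fusing per-radar class-probability vectors.

    prior       : (C,) posterior over C target classes from the previous step
    radar_probs : (R, C) per-radar classification probability vectors for the
                  current observation, assumed conditionally independent
    """
    # Under conditional independence the joint likelihood is the product of
    # the per-radar class probabilities; work in log space for stability.
    log_post = np.log(prior + eps) + np.log(radar_probs + eps).sum(axis=0)
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

def classify_trajectory(radar_prob_seq, num_classes):
    """Run the recursion over a trajectory and return the class that
    minimizes expected 0-1 loss, i.e. the MAP class."""
    posterior = np.full(num_classes, 1.0 / num_classes)  # uniform prior
    for radar_probs in radar_prob_seq:                   # one (R, C) block per time step
        posterior = rbc_fuse(posterior, radar_probs)
    return posterior.argmax(), posterior

# Toy usage: 3 radars, 7 drone classes, 10 time steps of soft classifier outputs.
rng = np.random.default_rng(0)
seq = rng.dirichlet(np.ones(7), size=(10, 3))            # (T, R, C)
label, posterior = classify_trajectory(seq, num_classes=7)
```

The argmax is the Bayes decision under 0-1 loss; the paper's OBF derives the optimal fusion rule, which this naive product-of-probabilities recursion only approximates.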
Related papers
- radarODE: An ODE-Embedded Deep Learning Model for Contactless ECG Reconstruction from Millimeter-Wave Radar [16.52097542165782]
A novel deep learning framework called radarODE is designed to fuse the temporal and morphological features extracted from radar signals and generate ECG.
radarODE outperforms the benchmark in missed detection rate, root mean square error, and Pearson correlation coefficient, with improvements of 9%, 16%, and 19%, respectively.
arXiv Detail & Related papers (2024-08-03T06:07:15Z)
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z)
- Towards Dense and Accurate Radar Perception Via Efficient Cross-Modal Diffusion Model [4.269423698485249]
This paper proposes a novel approach to dense and accurate mmWave radar point cloud construction via cross-modal learning.
Specifically, we introduce diffusion models, which possess state-of-the-art performance in generative modeling, to predict LiDAR-like point clouds from paired raw radar data.
We validate the proposed method through extensive benchmark comparisons and real-world experiments, demonstrating its superior performance and generalization ability.
arXiv Detail & Related papers (2024-03-13T12:20:20Z)
- Multi-stage Learning for Radar Pulse Activity Segmentation [51.781832424705094]
Radio signal recognition is a crucial function in electronic warfare.
Precise identification and localisation of radar pulse activities are required by electronic warfare systems.
Deep learning-based approaches to radar pulse activity recognition remain largely underexplored.
arXiv Detail & Related papers (2023-12-15T01:56:27Z)
- Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion [74.84019379368807]
We propose a novel method named EchoFusion to skip the existing radar signal processing pipeline.
Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors.
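The entry describes a query-then-fuse pipeline; a minimal illustration of BEV queries cross-attending to radar spectrum features follows. This is not the authors' EchoFusion code: the grid size, feature dimension, and class name are assumptions.

```python
import torch
import torch.nn as nn

class BEVQueryFusion(nn.Module):
    """Learned BEV queries attend to flattened radar spectrum features,
    producing BEV features that can then be fused with other sensors."""

    def __init__(self, bev_grid=100, dim=256, heads=8):
        super().__init__()
        self.bev_queries = nn.Parameter(torch.randn(bev_grid * bev_grid, dim))
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, radar_feats):
        # radar_feats: (B, N, dim) flattened range-azimuth spectrum features
        b = radar_feats.shape[0]
        q = self.bev_queries.unsqueeze(0).expand(b, -1, -1)  # (B, Q, dim)
        bev, _ = self.attn(q, radar_feats, radar_feats)      # queries pull from spectrum
        return bev                                           # (B, Q, dim)
```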
arXiv Detail & Related papers (2023-07-31T09:53:50Z)
- Semantic Segmentation of Radar Detections using Convolutions on Point Clouds [59.45414406974091]
We introduce a deep-learning based method that applies convolutions to radar detection point clouds.
We adapt this algorithm to radar-specific properties through distance-dependent clustering and pre-processing of input point clouds.
Our network outperforms state-of-the-art approaches that are based on PointNet++ on the task of semantic segmentation of radar point clouds.
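The entry names distance-dependent clustering as the radar-specific pre-processing step but gives no details. One plausible reading, sketched here under the assumption that the clustering radius should grow with range as detections thin out, is a per-range-bin DBSCAN:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def distance_dependent_clusters(points, base_eps=0.5, eps_per_meter=0.02,
                                min_samples=3, bin_width=10.0):
    """Cluster 2-D radar detections (meters, sensor-centric x/y) with a
    neighborhood radius that grows with range, approximated by running
    DBSCAN per range bin with a bin-specific eps. Returns (N,) labels,
    -1 for noise, unique across bins."""
    ranges = np.linalg.norm(points, axis=1)
    labels = np.full(len(points), -1, dtype=int)
    next_label = 0
    lo = 0.0
    while lo <= ranges.max():
        mask = (ranges >= lo) & (ranges < lo + bin_width)
        if mask.sum() >= min_samples:
            eps = base_eps + eps_per_meter * (lo + bin_width / 2)  # grows with range
            fit = DBSCAN(eps=eps, min_samples=min_samples).fit(points[mask])
            keep = fit.labels_ >= 0
            labels[np.where(mask)[0][keep]] = fit.labels_[keep] + next_label
            next_label = labels.max() + 1
        lo += bin_width
    return labels
```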
arXiv Detail & Related papers (2023-05-22T07:09:35Z)
- Identifying Coordination in a Cognitive Radar Network -- A Multi-Objective Inverse Reinforcement Learning Approach [30.65529797672378]
This paper provides a novel multi-objective inverse reinforcement learning approach for detecting coordination among radars.
It also applies to more general problems of inverse detection and learning of multi-objective optimizing systems.
arXiv Detail & Related papers (2022-11-13T17:27:39Z)
- RaLiBEV: Radar and LiDAR BEV Fusion Learning for Anchor Box Free Object Detection Systems [13.046347364043594]
In autonomous driving, LiDAR and radar are crucial for environmental perception.
Recent state-of-the-art works reveal that the fusion of radar and LiDAR can lead to robust detection in adverse weather.
We propose a bird's-eye view fusion learning-based anchor box-free object detection system.
arXiv Detail & Related papers (2022-11-11T10:24:42Z)
- Waveform Selection for Radar Tracking in Target Channels With Memory via Universal Learning [14.796960833031724]
Adapting the radar's waveform using partial information about the state of the scene has been shown to provide performance benefits in many practical scenarios.
This work examines a radar system which builds a compressed model of the radar-environment interface in the form of a context-tree.
The proposed approach is tested in a simulation study, and is shown to provide tracking performance improvements over two state-of-the-art waveform selection schemes.
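The summary does not describe the context-tree itself. A plain frequency-counting variant over quantized channel states conveys the idea; the cost model and every name below are illustrative assumptions, not the paper's construction.

```python
from collections import defaultdict

class ContextTree:
    """Counts next-state frequencies for every suffix of the recent history
    and predicts from the longest suffix that has been seen before."""

    def __init__(self, depth=3, num_states=4):
        self.depth, self.num_states = depth, num_states
        self.counts = defaultdict(lambda: [0] * num_states)

    def update(self, history, next_state):
        for d in range(1, self.depth + 1):
            self.counts[tuple(history[-d:])][next_state] += 1

    def predict(self, history):
        for d in range(self.depth, 0, -1):               # longest context first
            ctx = tuple(history[-d:])
            if ctx in self.counts:
                c = self.counts[ctx]
                return [x / sum(c) for x in c]
        return [1 / self.num_states] * self.num_states   # uniform fallback

def select_waveform(tree, history, cost):
    """Pick the waveform minimizing expected tracking cost, where cost[w][s]
    is an (assumed known) cost of transmitting waveform w in channel state s."""
    p = tree.predict(history)
    return min(range(len(cost)),
               key=lambda w: sum(pw * cw for pw, cw in zip(p, cost[w])))
```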
arXiv Detail & Related papers (2021-08-02T21:27:56Z)
- Automotive Radar Interference Mitigation with Unfolded Robust PCA based on Residual Overcomplete Auto-Encoder Blocks [88.46770122522697]
In autonomous driving, radar systems play an important role in detecting targets such as other vehicles on the road.
Deep learning methods for automotive radar interference mitigation can successfully estimate the amplitude of targets, but fail to recover the phase of the respective targets.
We propose an efficient and effective technique that is able to estimate both amplitude and phase in the presence of interference.
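Unfolding replaces the iterations of a classical solver with learned blocks. The classical complex-valued robust PCA that would be unfolded, in a simplified alternating form where complex soft-thresholding shrinks magnitudes while preserving target phase (all parameters illustrative), looks roughly like:

```python
import numpy as np

def soft_complex(x, tau):
    """Complex soft-thresholding: shrink the magnitude, keep the phase."""
    mag = np.abs(x)
    return np.where(mag > tau, (1 - tau / np.maximum(mag, 1e-12)) * x, 0)

def robust_pca(M, lam=None, mu=1.0, iters=100):
    """Split a complex range-Doppler matrix M into low-rank interference L
    and sparse targets S (amplitude AND phase) by alternating minimization."""
    if lam is None:
        lam = 1.0 / np.sqrt(max(M.shape))
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        # Low-rank step: singular-value thresholding of the residual.
        U, sig, Vh = np.linalg.svd(M - S, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0)) @ Vh
        # Sparse step: complex soft-thresholding keeps target phase.
        S = soft_complex(M - L, lam / mu)
    return L, S
```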
arXiv Detail & Related papers (2020-10-14T09:41:06Z)
- Depth Estimation from Monocular Images and Sparse Radar Data [93.70524512061318]
In this paper, we explore the possibility of achieving a more accurate depth estimation by fusing monocular images and Radar points using a deep neural network.
We find that the noise in Radar measurements is one of the main reasons existing fusion methods cannot be applied directly.
The experiments are conducted on the nuScenes dataset, one of the first datasets featuring Camera, Radar, and LiDAR recordings in diverse scenes and weather conditions.
arXiv Detail & Related papers (2020-09-30T19:01:33Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences arising from its use.