There and Back Again: Learning to Simulate Radar Data for Real-World Applications
- URL: http://arxiv.org/abs/2011.14389v1
- Date: Sun, 29 Nov 2020 15:49:23 GMT
- Title: There and Back Again: Learning to Simulate Radar Data for Real-World Applications
- Authors: Rob Weston, Oiwi Parker Jones and Ingmar Posner
- Abstract summary: We learn a radar sensor model capable of synthesising faithful radar observations based on simulated elevation maps.
We adopt an adversarial approach to learning a forward sensor model from unaligned radar examples.
We demonstrate the efficacy of our approach by evaluating a down-stream segmentation model trained purely on simulated data in a real-world deployment.
- Score: 21.995474023869388
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Simulating realistic radar data has the potential to significantly accelerate
the development of data-driven approaches to radar processing. However, it is
fraught with difficulty due to the notoriously complex image formation process.
Here we propose to learn a radar sensor model capable of synthesising faithful
radar observations based on simulated elevation maps. In particular, we adopt
an adversarial approach to learning a forward sensor model from unaligned radar
examples. In addition, modelling the backward model encourages the output to
remain aligned to the world state through a cyclical consistency criterion. The
backward model is further constrained to predict elevation maps from real radar
data that are grounded by partial measurements obtained from corresponding
lidar scans. Both models are trained in a joint optimisation. We demonstrate
the efficacy of our approach by evaluating a down-stream segmentation model
trained purely on simulated data in a real-world deployment. This achieves
performance within four percentage points of the same model trained entirely on
real data.
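The abstract describes two training signals for the backward model: a cyclical consistency criterion (the backward model should invert the forward sensor model) and a grounding term that compares predicted elevation against sparse lidar measurements. The sketch below is purely illustrative and is not the authors' implementation: the models are stand-in callables (in the paper they are deep networks trained jointly with adversarial losses on unaligned radar examples), and the loss forms shown are plain L1 reconstructions over toy lists.

```python
# Illustrative sketch (assumed, not from the paper's code) of the two
# losses described in the abstract, written over plain Python lists:
#   1. Cyclical consistency: forward model G (elevation map -> radar) and
#      backward model F (radar -> elevation map) should invert each other,
#      penalised as || F(G(m)) - m ||_1 .
#   2. Partial grounding: F's elevation predictions on real radar are
#      compared to lidar heights only where a lidar return exists.

def l1(a, b):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def cycle_loss(elevation_map, forward_model, backward_model):
    """|| F(G(m)) - m ||_1 : the round trip must reproduce the input map."""
    return l1(backward_model(forward_model(elevation_map)), elevation_map)

def grounded_loss(pred_elevation, lidar_heights, valid_mask):
    """Masked L1: compare predictions to lidar only where a return exists."""
    pairs = [(p, h) for p, h, v
             in zip(pred_elevation, lidar_heights, valid_mask) if v]
    if not pairs:
        return 0.0
    return sum(abs(p - h) for p, h in pairs) / len(pairs)

# Toy example: identity stand-ins for G and F give zero cycle loss, and
# the grounding loss only sees the two cells where the lidar mask is set.
identity = lambda xs: list(xs)
elevation = [0.0, 1.2, 0.7, 2.5]
print(cycle_loss(elevation, identity, identity))                    # 0.0
print(grounded_loss([0.5, 1.0, 0.9], [0.0, 1.0, 0.0], [1, 1, 0]))  # 0.25
```

In the paper both models are optimised jointly, so these terms would be summed with the adversarial objectives rather than used in isolation.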
Related papers
- RadSimReal: Bridging the Gap Between Synthetic and Real Data in Radar Object Detection With Simulation [6.0158981171030685]
RadSimReal is an innovative physical radar simulation capable of generating synthetic radar images with accompanying annotations.
Our findings demonstrate that training object detection models on RadSimReal data achieves performance levels comparable to models trained and tested on real data from the same dataset.
This innovative tool has the potential to advance the development of computer vision algorithms for radar-based autonomous driving applications.
arXiv Detail & Related papers (2024-04-28T11:55:50Z)
- Radio Map Estimation -- An Open Dataset with Directive Transmitter Antennas and Initial Experiments [49.61405888107356]
We release a dataset of simulated path loss radio maps together with realistic city maps from real-world locations and aerial images from open datasources.
Initial experiments regarding model architectures, input feature design and estimation of radio maps from aerial images are presented.
arXiv Detail & Related papers (2024-01-12T14:56:45Z)
- Pre-training on Synthetic Driving Data for Trajectory Prediction [61.520225216107306]
We propose a pipeline-level solution to mitigate the issue of data scarcity in trajectory forecasting.
We adopt HD map augmentation and trajectory synthesis for generating driving data, and then we learn representations by pre-training on them.
We conduct extensive experiments to demonstrate the effectiveness of our data expansion and pre-training strategies.
arXiv Detail & Related papers (2023-09-18T19:49:22Z)
- Diffusion Models for Interferometric Satellite Aperture Radar [73.01013149014865]
Probabilistic Diffusion Models (PDMs) have recently emerged as a very promising class of generative models.
Here, we leverage PDMs to generate several radar-based satellite image datasets.
We show that PDMs succeed in generating images with complex and realistic structures, but that sampling time remains an issue.
arXiv Detail & Related papers (2023-08-31T16:26:17Z)
- Super-Resolution Radar Imaging with Sparse Arrays Using a Deep Neural Network Trained with Enhanced Virtual Data [0.4640835690336652]
This paper introduces a deep neural network (DNN) based method capable of processing radar data from extremely thinned radar apertures.
The proposed DNN processing can provide both aliasing-free radar imaging and super-resolution.
It simultaneously delivers nearly the same resolution and image quality as would be achieved with a fully occupied array.
arXiv Detail & Related papers (2023-06-16T13:37:47Z)
- Semantic Segmentation of Radar Detections using Convolutions on Point Clouds [59.45414406974091]
We introduce a deep-learning based method to convolve radar detections into point clouds.
We adapt this algorithm to radar-specific properties through distance-dependent clustering and pre-processing of input point clouds.
Our network outperforms state-of-the-art approaches that are based on PointNet++ on the task of semantic segmentation of radar point clouds.
arXiv Detail & Related papers (2023-05-22T07:09:35Z)
- RadarFormer: Lightweight and Accurate Real-Time Radar Object Detection Model [13.214257841152033]
Radar-centric data sets receive comparatively little attention in the development of deep learning techniques for radar perception.
We propose a transformer-based model, named RadarFormer, that utilizes state-of-the-art developments in vision deep learning.
Our model also introduces a channel-chirp-time merging module that reduces model size and complexity by more than 10 times without compromising accuracy.
arXiv Detail & Related papers (2023-04-17T17:07:35Z)
- Learning to Simulate Realistic LiDARs [66.7519667383175]
We introduce a pipeline for data-driven simulation of a realistic LiDAR sensor.
We show that our model can learn to encode realistic effects such as dropped points on transparent surfaces.
We use our technique to learn models of two distinct LiDAR sensors and use them to improve simulated LiDAR data accordingly.
arXiv Detail & Related papers (2022-09-22T13:12:54Z)
- Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
- A Sensitivity Analysis Approach for Evaluating a Radar Simulation for Virtual Testing of Autonomous Driving Functions [0.0]
We introduce a sensitivity analysis approach for developing and evaluating a radar simulation.
A modular radar system simulation is presented and parameterized to conduct a sensitivity analysis.
We compare the output from the radar model to real driving measurements to ensure a realistic model behavior.
arXiv Detail & Related papers (2020-08-06T15:51:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.