DiffSR: Learning Radar Reflectivity Synthesis via Diffusion Model from Satellite Observations
- URL: http://arxiv.org/abs/2411.06714v1
- Date: Mon, 11 Nov 2024 04:50:34 GMT
- Title: DiffSR: Learning Radar Reflectivity Synthesis via Diffusion Model from Satellite Observations
- Authors: Xuming He, Zhiwang Zhou, Wenlong Zhang, Xiangyu Zhao, Hao Chen, Shiqi Chen, Lei Bai,
- Abstract summary: We propose a two-stage diffusion-based method called DiffSR to generate high-frequency details and high-value areas.
Our method achieves state-of-the-art (SOTA) results, demonstrating the ability to generate high-frequency details and high-value areas.
- Score: 42.635670495018964
- License:
- Abstract: Weather radar data synthesis can fill in data for areas where ground observations are missing. Existing methods often employ reconstruction-based approaches with MSE loss to reconstruct radar data from satellite observations. However, such methods lead to over-smoothing, which hinders the generation of high-frequency details or high-value observation areas associated with convective weather. To address this issue, we propose a two-stage diffusion-based method called DiffSR. We first pre-train a reconstruction model on global-scale data to obtain a radar estimate, and then synthesize radar reflectivity by combining the radar estimate with satellite data as conditions for the diffusion model. Extensive experiments show that our method achieves state-of-the-art (SOTA) results, demonstrating the ability to generate high-frequency details and high-value areas.
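A minimal sketch may help make the two-stage pipeline described above concrete: a pre-trained reconstruction network maps satellite channels to a coarse radar estimate, and a diffusion denoiser is then trained with that estimate and the satellite data concatenated as conditioning. The module names, channel counts, network depths, and DDPM schedule below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a DiffSR-style two-stage conditioning (PyTorch).
import torch
import torch.nn as nn

class ReconstructionNet(nn.Module):
    """Stage 1: coarse radar estimate from satellite channels (MSE pre-training)."""
    def __init__(self, sat_channels=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(sat_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, sat):
        return self.net(sat)

class ConditionalDenoiser(nn.Module):
    """Stage 2: noise predictor conditioned on the radar estimate and satellite data."""
    def __init__(self, sat_channels=8):
        super().__init__()
        # input = noisy radar (1) + radar estimate (1) + satellite channels
        self.net = nn.Sequential(
            nn.Conv2d(2 + sat_channels, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 1, 3, padding=1),
        )

    def forward(self, x_t, radar_est, sat, t):
        # the timestep t is omitted from this toy conditioning for brevity
        return self.net(torch.cat([x_t, radar_est, sat], dim=1))

# --- one DDPM-style training step (epsilon prediction) on placeholder data ---
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

recon = ReconstructionNet()            # assumed pre-trained on global-scale data
denoiser = ConditionalDenoiser()
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-4)

sat = torch.randn(4, 8, 64, 64)        # satellite observations (placeholder)
radar = torch.randn(4, 1, 64, 64)      # ground-truth radar reflectivity (placeholder)

with torch.no_grad():
    radar_est = recon(sat)             # stage-1 estimate, used only as a condition

t = torch.randint(0, T, (4,))
noise = torch.randn_like(radar)
a_bar = alphas_bar[t].view(-1, 1, 1, 1)
x_t = a_bar.sqrt() * radar + (1 - a_bar).sqrt() * noise   # forward diffusion

opt.zero_grad()
loss = nn.functional.mse_loss(denoiser(x_t, radar_est, sat, t), noise)
loss.backward()
opt.step()
```

Conditioning the denoiser on the stage-1 estimate means the diffusion model mainly has to add the high-frequency, high-value structure that an MSE-trained reconstruction smooths away, which is the rationale the abstract gives for the two-stage design.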
Related papers
- SRViT: Vision Transformers for Estimating Radar Reflectivity from Satellite Observations at Scale [0.7499722271664147]
We introduce a transformer-based neural network to generate high-resolution (3km) synthetic radar reflectivity fields at scale from geostationary satellite imagery.
This work aims to enhance short-term convective-scale forecasts of high-impact weather events and aid in data assimilation for numerical weather prediction over the United States.
arXiv Detail & Related papers (2024-06-20T20:40:50Z) - Generative Data Assimilation of Sparse Weather Station Observations at Kilometer Scales [5.453657018459705]
We demonstrate the viability of score-based data assimilation in the context of realistically complex km-scale weather.
Incorporating observations from 40 weather stations yields 10% lower RMSEs on left-out stations (a minimal guided-sampling sketch appears after this list).
The time is ripe to explore extensions that combine increasingly ambitious regional state generators with an increasing set of in situ, ground-based, and satellite remote sensing data streams.
arXiv Detail & Related papers (2024-06-19T10:28:11Z) - Towards Dense and Accurate Radar Perception Via Efficient Cross-Modal Diffusion Model [4.269423698485249]
This paper proposes a novel approach to dense and accurate mmWave radar point cloud construction via cross-modal learning.
Specifically, we introduce diffusion models, which achieve state-of-the-art performance in generative modeling, to predict LiDAR-like point clouds from paired raw radar data.
We validate the proposed method through extensive benchmark comparisons and real-world experiments, demonstrating its superior performance and generalization ability.
arXiv Detail & Related papers (2024-03-13T12:20:20Z) - Observation-Guided Meteorological Field Downscaling at Station Scale: A Benchmark and a New Method [66.80344502790231]
We extend meteorological downscaling to arbitrary scattered station scales and establish a new benchmark and dataset.
Inspired by data assimilation techniques, we integrate observational data into the downscaling process, providing multi-scale observational priors.
Our proposed method outperforms other specially designed baseline models on multiple surface variables.
arXiv Detail & Related papers (2024-01-22T14:02:56Z) - Diffusion Models for Interferometric Satellite Aperture Radar [73.01013149014865]
Probabilistic Diffusion Models (PDMs) have recently emerged as a very promising class of generative models.
Here, we leverage PDMs to generate several radar-based satellite image datasets.
We show that PDMs succeed in generating images with complex and realistic structures, but that sampling time remains an issue.
arXiv Detail & Related papers (2023-08-31T16:26:17Z) - Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow, and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z) - There and Back Again: Learning to Simulate Radar Data for Real-World Applications [21.995474023869388]
We learn a radar sensor model capable of synthesising faithful radar observations based on simulated elevation maps.
We adopt an adversarial approach to learning a forward sensor model from unaligned radar examples.
We demonstrate the efficacy of our approach by evaluating a down-stream segmentation model trained purely on simulated data in a real-world deployment.
arXiv Detail & Related papers (2020-11-29T15:49:23Z) - LiRaNet: End-to-End Trajectory Prediction using Spatio-Temporal Radar Fusion [52.59664614744447]
We present LiRaNet, a novel end-to-end trajectory prediction method that utilizes radar sensor information along with widely used lidar data and high-definition (HD) maps.
Automotive radar provides rich, complementary information, allowing for longer-range vehicle detection as well as instantaneous velocity measurements.
arXiv Detail & Related papers (2020-10-02T00:13:00Z) - Depth Estimation from Monocular Images and Sparse Radar Data [93.70524512061318]
In this paper, we explore the possibility of achieving a more accurate depth estimation by fusing monocular images and Radar points using a deep neural network.
We find that the noise in Radar measurements is one of the main reasons preventing the direct application of existing fusion methods.
The experiments are conducted on the nuScenes dataset, one of the first datasets to feature Camera, Radar, and LiDAR recordings in diverse scenes and weather conditions.
arXiv Detail & Related papers (2020-09-30T19:01:33Z)
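For the score-based data assimilation entry above, the sketch below (referenced there) shows one simple way to guide a diffusion sampler with sparse station observations: a DDPM-style reverse process whose intermediate states are nudged toward observed values through a station mask. The denoiser interface, noise schedule, and guidance weight are assumptions for illustration, not that paper's method.

```python
# Hypothetical observation-guided DDPM sampler for sparse station data (PyTorch).
import torch

def guided_sample(denoiser, obs, mask, shape, T=1000, guidance=1.0):
    """Sample a weather-state field nudged toward sparse observations.

    obs:  observed values placed on the model grid (zeros where unobserved)
    mask: 1.0 at station locations, 0.0 elsewhere
    """
    betas = torch.linspace(1e-4, 0.02, T)
    alphas = 1.0 - betas
    alphas_bar = torch.cumprod(alphas, dim=0)

    x = torch.randn(shape)                      # start from pure noise
    for t in reversed(range(T)):
        eps = denoiser(x, t)                    # predicted noise at step t
        a, a_bar = alphas[t], alphas_bar[t]
        x = (x - (1 - a) / (1 - a_bar).sqrt() * eps) / a.sqrt()   # DDPM mean update
        x = x + guidance * mask * (obs - x)     # pull toward station observations
        if t > 0:
            x = x + betas[t].sqrt() * torch.randn_like(x)         # re-noise
    return x

# toy usage with a placeholder denoiser
field = guided_sample(lambda x, t: torch.zeros_like(x),
                      obs=torch.zeros(1, 1, 32, 32),
                      mask=torch.zeros(1, 1, 32, 32),
                      shape=(1, 1, 32, 32), T=50)
```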
This list is automatically generated from the titles and abstracts of the papers on this site.