Physical Scales Matter: The Role of Receptive Fields and Advection in Satellite-Based Thunderstorm Nowcasting with Convolutional Neural Networks
- URL: http://arxiv.org/abs/2504.09994v2
- Date: Thu, 07 Aug 2025 09:12:40 GMT
- Title: Physical Scales Matter: The Role of Receptive Fields and Advection in Satellite-Based Thunderstorm Nowcasting with Convolutional Neural Networks
- Authors: Christoph Metzl, Kianusch Vahid Yousefnia, Richard Müller, Virginia Poli, Miria Celano, Tobias Bölle
- Abstract summary: Recent work indicates that incorporating advection into the Machine Learning value chain has improved skill for radar-based precipitation nowcasts. This study investigates the generality by probing the approach on satellite-based thunderstorm nowcasts for the first time. In essence, advection guarantees that thunderstorm patterns relevant for nowcasting are contained in the receptive field at long forecast times.
- Score: 0.43981305860983705
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The focus of nowcasting development is transitioning from physically motivated advection methods to purely data-driven Machine Learning (ML) approaches. Nevertheless, recent work indicates that incorporating advection into the ML value chain has improved skill for radar-based precipitation nowcasts. However, the generality of this approach and the underlying causes remain unexplored. This study investigates the generality by probing the approach on satellite-based thunderstorm nowcasts for the first time. Resorting to a scale argument, we then put forth an explanation of when and why skill improvements can be expected. In essence, advection guarantees that thunderstorm patterns relevant for nowcasting are contained in the receptive field at long forecast times. To test our hypotheses, we train ResU-Nets solving segmentation tasks with lightning observations as ground truth. The inputs of the Baseline Neural Network (BNN) are short time series of multispectral satellite imagery and lightning observations, whereas the Advection-Informed Neural Network (AINN) additionally receives the Lagrangian persistence nowcast of all input channels at the desired forecast time. Overall, we find only a minor skill improvement of the AINN over the BNN when considering fully averaged scores. However, assessing skill conditioned on forecast time and advection speed, we demonstrate that our scale argument correctly predicts the onset of skill improvement of the AINN over the BNN after 2h forecast time. We confirm that, generally, advection becomes gradually more important with longer forecast times and higher advection speeds. Our work accentuates the importance of considering and incorporating the underlying physical scales when designing ML-based forecasting models.
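The scale argument above can be made concrete with a small sketch: at lead time t, the pattern relevant for a target pixel originates a distance v·t upstream, and a non-advected CNN can only exploit it while that distance fits inside its receptive field. The numbers below (50 km/h advection, 100 km receptive-field radius) are illustrative assumptions, not values from the paper:

```python
def advection_displacement_km(speed_kmh: float, lead_time_h: float) -> float:
    """Distance a thunderstorm pattern travels during the forecast horizon."""
    return speed_kmh * lead_time_h

def within_receptive_field(speed_kmh: float, lead_time_h: float,
                           rf_radius_km: float) -> bool:
    """True if the relevant pattern is still visible to a non-advected CNN."""
    return advection_displacement_km(speed_kmh, lead_time_h) <= rf_radius_km

# With these illustrative numbers the baseline network loses the relevant
# signal beyond 2 h, which is where advected inputs start to pay off.
print(within_receptive_field(50.0, 1.0, 100.0))  # True: pattern in view
print(within_receptive_field(50.0, 3.0, 100.0))  # False: outside receptive field
```

Advecting the inputs (as the AINN does) shifts the relevant pattern back toward the target pixel, removing this geometric limit regardless of lead time.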
Related papers
- MAD-SmaAt-GNet: A Multimodal Advection-Guided Neural Network for Precipitation Nowcasting [2.0912407740405903]
Deep learning models have shown strong potential for precipitation nowcasting, offering both accuracy and computational efficiency. This paper introduces the Multimodal Advection-Guided Small Attention GNet (MAD-SmaAt-GNet). MAD-SmaAt-GNet reduces the mean squared error (MSE) by 8.9% compared with the baseline SmaAt-UNet for four-step precipitation forecasting up to four hours ahead.
arXiv Detail & Related papers (2026-03-03T10:32:15Z) - Accuracy Law for the Future of Deep Time Series Forecasting [65.46625911002202]
Time series forecasting inherently faces a non-zero error lower bound due to its partially observable and uncertain nature. This paper focuses on a fundamental question: how to estimate the performance upper bound of deep time series forecasting. Based on rigorous statistical tests of over 2,800 newly trained deep forecasters, we discover a significant exponential relationship between the minimum forecasting error of deep models and the complexity of window-wise series patterns.
arXiv Detail & Related papers (2025-10-03T05:18:47Z) - PredNext: Explicit Cross-View Temporal Prediction for Unsupervised Learning in Spiking Neural Networks [70.1286354746363]
Spiking Neural Networks (SNNs) offer a natural platform for unsupervised representation learning. Current unsupervised SNNs employ shallow architectures or localized plasticity rules, limiting their ability to model long-range temporal dependencies. We propose PredNext, which explicitly models temporal relationships through cross-view future Step Prediction and Clip Prediction.
arXiv Detail & Related papers (2025-09-29T14:27:58Z) - Self-supervised Spatial-Temporal Learner for Precipitation Nowcasting [5.365086662531667]
Short-term prediction of weather is essential for making timely and weather-dependent decisions. In this work, we leverage the benefits of self-supervised learning and integrate it with spatial-temporal learning to develop a novel model, SpaT-SparK.
arXiv Detail & Related papers (2024-12-20T14:09:36Z) - Temporal Reversal Regularization for Spiking Neural Networks: Hybrid Spatio-Temporal Invariance for Generalization [3.7748662901422807]
Spiking neural networks (SNNs) have received widespread attention as an ultra-low power computing paradigm. Recent studies have shown that SNNs suffer from severe overfitting, which limits their generalization performance. We propose a simple yet effective Temporal Reversal Regularization to mitigate overfitting during training and facilitate generalization of SNNs.
arXiv Detail & Related papers (2024-08-17T06:23:38Z) - Generating Fine-Grained Causality in Climate Time Series Data for Forecasting and Anomaly Detection [67.40407388422514]
We design a conceptual fine-grained causal model named TBN Granger Causality.
Second, we propose an end-to-end deep generative model called TacSas, which discovers TBN Granger Causality in a generative manner.
We test TacSas on climate benchmark ERA5 for climate forecasting and the extreme weather benchmark of NOAA for extreme weather alerts.
arXiv Detail & Related papers (2024-08-08T06:47:21Z) - Bayesian Neural Networks with Domain Knowledge Priors [52.80929437592308]
We propose a framework for integrating general forms of domain knowledge into a BNN prior.
We show that BNNs using our proposed domain knowledge priors outperform those with standard priors.
arXiv Detail & Related papers (2024-02-20T22:34:53Z) - Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z) - Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z) - Streaming Motion Forecasting for Autonomous Driving [71.7468645504988]
We introduce a benchmark that queries future trajectories on streaming data, which we refer to as "streaming forecasting".
Our benchmark inherently captures the disappearance and re-appearance of agents, which is a safety-critical problem yet overlooked by snapshot-based benchmarks.
We propose a plug-and-play meta-algorithm called "Predictive Streamer" that can adapt any snapshot-based forecaster into a streaming forecaster.
arXiv Detail & Related papers (2023-10-02T17:13:16Z) - Single-shot Bayesian approximation for neural networks [0.0]
Deep neural networks (NNs) are known for their high-prediction performances.
NNs are prone to yield unreliable predictions when encountering completely new situations without indicating their uncertainty.
We present a single-shot MC dropout approximation that preserves the advantages of BNNs while being as fast as NNs.
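For context, vanilla MC dropout estimates predictive uncertainty with many stochastic forward passes, which is the sampling loop the single-shot approximation replaces with an analytic pass. A minimal sketch of the sampling baseline on a toy linear model (an illustrative assumption, not the paper's implementation):

```python
import numpy as np

def mc_dropout_predict(weights, x, keep_prob=0.9, n_samples=500, seed=0):
    """Vanilla MC dropout: average over stochastic forward passes.

    The single-shot approximation avoids this loop by propagating the
    dropout mean and variance analytically through the network.
    """
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_samples):
        mask = rng.random(x.shape) < keep_prob      # Bernoulli dropout mask
        preds.append(weights @ (x * mask) / keep_prob)  # inverted dropout
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)    # predictive mean and spread
```

The n_samples forward passes are what make MC dropout slow at inference time; collapsing them into one pass is what makes the single-shot variant "as fast as NNs".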
arXiv Detail & Related papers (2023-08-24T13:40:36Z) - Deep Learning for Day Forecasts from Sparse Observations [60.041805328514876]
Deep neural networks offer an alternative paradigm for modeling weather conditions.
MetNet-3 learns from both dense and sparse data sensors and makes predictions up to 24 hours ahead for precipitation, wind, temperature and dew point.
MetNet-3 has a high temporal and spatial resolution, respectively, up to 2 minutes and 1 km as well as a low operational latency.
arXiv Detail & Related papers (2023-06-06T07:07:54Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Uncovering the Missing Pattern: Unified Framework Towards Trajectory Imputation and Prediction [60.60223171143206]
Trajectory prediction is a crucial undertaking in understanding entity movement or human behavior from observed sequences.
Current methods often assume that the observed sequences are complete while ignoring the potential for missing values.
This paper presents a unified framework, the Graph-based Conditional Variational Recurrent Neural Network (GC-VRNN), which can perform trajectory imputation and prediction simultaneously.
arXiv Detail & Related papers (2023-03-28T14:27:27Z) - Forecasting large-scale circulation regimes using deformable convolutional neural networks and global spatiotemporal climate data [86.1450118623908]
We investigate a supervised machine learning approach based on deformable convolutional neural networks (deCNNs)
We forecast the North Atlantic-European weather regimes during extended boreal winter for 1 to 15 days into the future.
Due to its wider field of view, the deCNN also achieves considerably better performance than regular convolutional neural networks at lead times beyond 5-6 days.
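The "wider field of view" claim can be made concrete with the standard receptive-field recursion for a stack of convolutions (a generic sketch, not tied to the deCNN architecture, which additionally learns its sampling offsets):

```python
def receptive_field(layers):
    """Receptive field (in input pixels) of a stack of conv layers.

    Each layer is a (kernel_size, stride) pair.
    """
    rf, jump = 1, 1  # jump = distance between adjacent output pixels in the input
    for k, s in layers:
        rf += (k - 1) * jump
        jump *= s
    return rf

# Three 3x3 stride-1 convs see only 7 pixels; striding grows the field fast.
print(receptive_field([(3, 1), (3, 1), (3, 1)]))  # 7
print(receptive_field([(3, 2), (3, 2), (3, 2)]))  # 15
```

The same recursion underlies the scale argument in the main paper above: a pattern displaced farther than this radius is invisible to the network.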
arXiv Detail & Related papers (2022-02-10T11:37:00Z) - Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer [77.78479877473899]
We design a spatial-temporal-fusion BNN for efficiently scaling BNNs to large models.
Compared to vanilla BNNs, our approach can greatly reduce the training time and the number of parameters, which contributes to scale BNNs efficiently.
arXiv Detail & Related papers (2021-12-12T17:13:14Z) - RAP-Net: Region Attention Predictive Network for Precipitation Nowcasting [15.587959542301789]
We propose Recall Attention Mechanism (RAM) to improve the prediction.
The experiments show that the proposed Region Attention Predictive Network (RAP-Net) outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2021-10-03T15:55:18Z) - Spatially Focused Attack against Spatiotemporal Graph Neural Networks [8.665638585791235]
Deep Spatiotemporal graph neural networks (GNNs) have achieved great success in traffic forecasting applications.
If GNNs are vulnerable in real-world prediction applications, a hacker can easily manipulate the results and cause serious traffic congestion and even a city-scale breakdown.
arXiv Detail & Related papers (2021-09-10T01:31:53Z) - Nowcasting-Nets: Deep Neural Network Structures for Precipitation Nowcasting Using IMERG [1.9860735109145415]
We use Recurrent and Convolutional deep neural network structures to address the challenge of precipitation nowcasting.
A total of five models are trained using Global Precipitation Measurement (GPM) Integrated Multi-satellitE Retrievals for GPM (IMERG) precipitation data over the Eastern Contiguous United States (CONUS).
The models were designed to provide forecasts with a lead time of up to 1.5 hours and, by using a feedback loop approach, the ability of the models to extend the forecast time to 4.5 hours was also investigated.
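The feedback-loop extension described in that entry can be sketched generically: each prediction is appended to the input window so the next step treats it as the newest observation (the `model` interface and toy example below are assumptions for illustration, not the Nowcasting-Nets code):

```python
import numpy as np

def feedback_forecast(model, frames, n_steps):
    """Autoregressively extend a short-lead forecaster.

    `model` maps a stack of input frames to the next frame; each output is
    fed back as the newest input, trading accuracy for a longer horizon.
    """
    window = len(frames)
    history = list(frames)
    forecasts = []
    for _ in range(n_steps):
        pred = model(np.stack(history[-window:]))  # predict from latest window
        forecasts.append(pred)
        history.append(pred)                       # feedback loop
    return forecasts

# Toy model standing in for a trained network: mean of the input window.
toy = lambda stack: stack.mean(axis=0)
out = feedback_forecast(toy, [np.zeros((2, 2)), np.ones((2, 2))], n_steps=3)
```

Errors compound at each feedback step, which is why the extension from 1.5 h to 4.5 h lead time had to be investigated rather than assumed.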
arXiv Detail & Related papers (2021-08-16T02:55:32Z) - Accurate and Clear Precipitation Nowcasting with Consecutive Attention and Rain-map Discrimination [11.686939430992966]
We propose a new deep learning model for precipitation nowcasting that includes both the discrimination and attention techniques.
The model is examined on a newly-built benchmark dataset that contains both radar data and actual rain data.
arXiv Detail & Related papers (2021-02-16T14:22:54Z) - Graph Neural Networks for Improved El Niño Forecasting [0.009620910657090186]
We propose an application of Graph Neural Networks (GNN) to forecast the El Niño-Southern Oscillation (ENSO) at long lead times.
Preliminary results are promising and outperform state-of-the-art systems for projections 1 and 3 months ahead.
arXiv Detail & Related papers (2020-12-02T23:40:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.