UNet with Axial Transformer : A Neural Weather Model for Precipitation Nowcasting
- URL: http://arxiv.org/abs/2504.19408v1
- Date: Mon, 28 Apr 2025 01:20:30 GMT
- Title: UNet with Axial Transformer : A Neural Weather Model for Precipitation Nowcasting
- Authors: Maitreya Sonawane, Sumit Mamtani
- Abstract summary: We develop a novel method that employs Transformer-based machine learning models to forecast precipitation. This paper presents initial research on the dataset used in the domain of next-frame prediction.
- Score: 0.06906005491572399
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Making accurate weather predictions can be particularly challenging for localized storms or events that evolve on hourly timescales, such as thunderstorms. Our goal in this project is therefore to model weather nowcasting, producing highly localized and accurate predictions for the immediate future and replacing current numerical weather models and data assimilation systems with deep learning approaches. A significant advantage of machine learning is that inference is computationally cheap given an already-trained model, allowing forecasts that are nearly instantaneous and at the native high resolution of the input data. In this work we develop a novel method that employs Transformer-based machine learning models to forecast precipitation. The approach leverages axial attention mechanisms to learn complex patterns and dynamics from time-series frames. Moreover, it is a generic framework and can be applied to univariate and multivariate time series, as well as time-series embeddings. This paper presents initial research on the dataset used in the domain of next-frame prediction, and we demonstrate state-of-the-art results in terms of the metrics used for this dataset (PSNR = 47.67, SSIM = 0.9943) using UNet with Axial Transformer.
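To make the axial attention idea concrete, below is a minimal PyTorch sketch of a 2D axial self-attention layer of the kind that could sit at a UNet bottleneck, together with the PSNR metric quoted above. This is an illustrative sketch only: the module names, tensor shapes, residual placement, and usage are assumptions, not the authors' implementation.

```python
# Minimal, illustrative sketch: axial self-attention over 2D feature maps and
# the PSNR metric. Names and shapes are assumptions, not the paper's code.
import torch
import torch.nn as nn


class AxialAttention2D(nn.Module):
    """Self-attention applied along the width axis, then along the height axis.

    Factorizing full 2D attention, whose cost grows as O((H*W)^2), into two 1D
    passes with cost O(H*W*(H + W)) is what makes axial attention tractable on
    high-resolution precipitation frames.
    """

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width), e.g. a UNet bottleneck feature map.
        b, c, h, w = x.shape

        # Row pass: every image row is an independent sequence of length W.
        rows = x.permute(0, 2, 3, 1).reshape(b * h, w, c)
        rows, _ = self.row_attn(rows, rows, rows)
        x = rows.reshape(b, h, w, c).permute(0, 3, 1, 2) + x  # residual connection

        # Column pass: every image column is an independent sequence of length H.
        cols = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        cols, _ = self.col_attn(cols, cols, cols)
        x = cols.reshape(b, w, h, c).permute(0, 3, 2, 1) + x  # residual connection
        return x


def psnr(pred: torch.Tensor, target: torch.Tensor, max_val: float = 1.0) -> torch.Tensor:
    """Peak signal-to-noise ratio between a predicted and a ground-truth frame."""
    mse = torch.mean((pred - target) ** 2)
    return 10.0 * torch.log10(max_val ** 2 / mse)


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)      # hypothetical radar-frame features
    out = AxialAttention2D(channels=64)(feats)
    print(out.shape)                         # torch.Size([2, 64, 32, 32])
    print(psnr(torch.rand(1, 1, 32, 32), torch.rand(1, 1, 32, 32)))
```

In a full UNet with Axial Transformer, several such layers would be stacked over encoded spatiotemporal features; the sketch only demonstrates the attention factorization and the evaluation metric, not the full next-frame prediction pipeline.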
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z) - Generalizing Weather Forecast to Fine-grained Temporal Scales via Physics-AI Hybrid Modeling [55.13352174687475]
This paper proposes a physics-AI hybrid model (i.e., WeatherGFT) which generalizes weather forecasts to finer-grained temporal scales beyond the training dataset. Specifically, we employ a carefully designed PDE kernel to simulate physical evolution on a small time scale. We also introduce a lead time-aware training framework to promote the generalization of the model at different lead times.
arXiv Detail & Related papers (2024-05-22T16:21:02Z) - CaFA: Global Weather Forecasting with Factorized Attention on Sphere [7.687215328455751]
We propose a factorized-attention-based model tailored for spherical geometries to mitigate this issue.
The deterministic forecasting accuracy of the proposed model at $1.5^\circ$ resolution and 0-7 days' lead time is on par with state-of-the-art purely data-driven machine learning weather prediction models.
arXiv Detail & Related papers (2024-05-12T23:18:14Z) - FengWu-4DVar: Coupling the Data-driven Weather Forecasting Model with 4D Variational Assimilation [67.20588721130623]
We develop an AI-based cyclic weather forecasting system, FengWu-4DVar.
FengWu-4DVar can incorporate observational data into the data-driven weather forecasting model.
Experiments on the simulated observational dataset demonstrate that FengWu-4DVar is capable of generating reasonable analysis fields.
arXiv Detail & Related papers (2023-12-16T02:07:56Z) - Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z) - ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z) - A case study of spatiotemporal forecasting techniques for weather forecasting [4.347494885647007]
The correlations of real-world processes are spatiotemporal, and the data generated by them exhibits both spatial and temporal evolution.
Time series-based models are a viable alternative to numerical forecasts.
We show that spatiotemporal decomposition prediction models reduced computational costs while improving accuracy.
arXiv Detail & Related papers (2022-09-29T13:47:02Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Deep Transformer Models for Time Series Forecasting: The Influenza Prevalence Case [2.997238772148965]
Time series data are prevalent in many scientific and engineering disciplines.
We present a new approach to time series forecasting using Transformer-based machine learning models.
We show that the forecasting results produced by our approach are favorably comparable to the state-of-the-art.
arXiv Detail & Related papers (2020-01-23T00:22:22Z)