Convolutional conditional neural processes for local climate downscaling
- URL: http://arxiv.org/abs/2101.07950v1
- Date: Wed, 20 Jan 2021 03:45:21 GMT
- Title: Convolutional conditional neural processes for local climate downscaling
- Authors: Anna Vaughan, Will Tebbutt, J.Scott Hosking and Richard E. Turner
- Abstract summary: A new model is presented for multisite statistical downscaling of temperature and precipitation using convolutional conditional neural processes (convCNPs).
The convCNP model is shown to outperform an ensemble of existing downscaling techniques over Europe for both temperature and precipitation.
A substantial improvement is seen in the representation of extreme precipitation events.
- Score: 31.887343372542805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A new model is presented for multisite statistical downscaling of temperature
and precipitation using convolutional conditional neural processes (convCNPs).
ConvCNPs are a recently developed class of models that allow deep learning
techniques to be applied to off-the-grid spatio-temporal data. This model has a
substantial advantage over existing downscaling methods in that the trained
model can be used to generate multisite predictions at an arbitrary set of
locations, regardless of the availability of training data. The convCNP model
is shown to outperform an ensemble of existing downscaling techniques over
Europe for both temperature and precipitation taken from the VALUE
intercomparison project. The model also outperforms an approach that uses
Gaussian processes to interpolate single-site downscaling models at unseen
locations. Importantly, substantial improvement is seen in the representation
of extreme precipitation events. These results indicate that the convCNP is a
robust downscaling model suitable for generating localised projections for use
in climate impact studies, and motivate further research into applications of
deep learning techniques in statistical downscaling.
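To make the approach concrete, here is a minimal, hedged sketch (not the authors' code) of a ConvCNP-style downscaling step in PyTorch: gridded low-resolution predictors are processed by a CNN, and predictions are read off at arbitrary station coordinates with a kernel-weighted "SetConv" interpolation, which is what allows multisite predictions at locations unseen during training. The layer sizes, the RBF readout, and the Gaussian output head are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SetConvReadout(nn.Module):
    """Interpolate gridded CNN features to off-the-grid target locations
    with an RBF kernel (learnable length scale)."""

    def __init__(self, init_lengthscale: float = 0.1):
        super().__init__()
        self.log_ls = nn.Parameter(torch.log(torch.tensor(init_lengthscale)))

    def forward(self, grid_xy, grid_feats, target_xy):
        # grid_xy: (G, 2), grid_feats: (G, C), target_xy: (T, 2)
        d2 = ((target_xy[:, None, :] - grid_xy[None, :, :]) ** 2).sum(-1)   # (T, G)
        w = torch.exp(-0.5 * d2 / torch.exp(2.0 * self.log_ls))
        w = w / (w.sum(-1, keepdim=True) + 1e-8)                            # normalised weights
        return w @ grid_feats                                               # (T, C)


class ConvCNPDownscaler(nn.Module):
    """CNN over gridded predictors, kernel readout at arbitrary station sites."""

    def __init__(self, in_channels: int, hidden: int = 64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(),
        )
        self.readout = SetConvReadout()
        self.head = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 2))    # Gaussian mean and log-scale

    def forward(self, coarse_fields, grid_xy, target_xy):
        # coarse_fields: (B, C, H, W) low-resolution predictors (e.g. reanalysis)
        feats = self.cnn(coarse_fields)                                     # (B, hidden, H, W)
        b, ch, h, w = feats.shape
        feats = feats.permute(0, 2, 3, 1).reshape(b, h * w, ch)             # one feature per grid cell
        outputs = []
        for i in range(b):
            site_feats = self.readout(grid_xy, feats[i], target_xy)         # (T, hidden)
            outputs.append(self.head(site_feats))                           # (T, 2)
        mu, log_sigma = torch.stack(outputs).unbind(-1)
        return mu, torch.exp(log_sigma)                                     # per-site Gaussian parameters
```

Training would minimise the Gaussian negative log-likelihood of station observations; at test time, target_xy can be any set of coordinates, including sites never seen during training, which is the multisite advantage described in the abstract. A precipitation variant would typically replace the Gaussian head with a mixed discrete-continuous likelihood (for example Bernoulli-Gamma) to handle dry days and heavy tails.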
Related papers
- Supervised Score-Based Modeling by Gradient Boosting [49.556736252628745]
We propose a Supervised Score-based Model (SSM), which can be viewed as a gradient boosting algorithm combined with score matching.
We provide a theoretical analysis of learning and sampling for SSM to balance inference time and prediction accuracy.
Our model outperforms existing models in both accuracy and inference time.
arXiv Detail & Related papers (2024-11-02T07:06:53Z) - On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z) - Efficient Localized Adaptation of Neural Weather Forecasting: A Case Study in the MENA Region [62.09891513612252]
We focus on limited-area modeling and train our model specifically for localized region-level downstream tasks.
We consider the MENA region due to its unique climatic challenges, where accurate localized weather forecasting is crucial for managing water resources, agriculture and mitigating the impacts of extreme weather events.
Our study aims to validate the effectiveness of integrating parameter-efficient fine-tuning (PEFT) methodologies, specifically Low-Rank Adaptation (LoRA) and its variants, to improve forecast accuracy as well as training speed, computational resource utilization, and memory efficiency in weather and climate modeling for specific regions (a minimal LoRA sketch appears after this list).
arXiv Detail & Related papers (2024-09-11T19:31:56Z) - Bridging Model-Based Optimization and Generative Modeling via Conservative Fine-Tuning of Diffusion Models [54.132297393662654]
We introduce a hybrid method that fine-tunes cutting-edge diffusion models by optimizing reward models through RL.
We demonstrate the capability of our approach to outperform the best designs in offline data, leveraging the extrapolation capabilities of reward models.
arXiv Detail & Related papers (2024-05-30T03:57:29Z) - Efficient modeling of sub-kilometer surface wind with Gaussian processes and neural networks [0.0]
Wind represents a particularly challenging variable to model due to its high spatial and temporal variability.
This paper presents a novel approach that integrates Gaussian processes (GPs) and neural networks to model surface wind gusts.
We discuss the effect of different modeling choices, as well as different degrees of approximation, and present our results for a case study.
arXiv Detail & Related papers (2024-05-21T09:07:47Z) - Physics-constrained deep learning postprocessing of temperature and humidity [0.0]
We propose a method to achieve physical consistency in deep learning-based postprocessing models.
We find that constraining a neural network to enforce thermodynamic state equations yields physically consistent predictions (a minimal sketch of such a constraint appears after this list).
arXiv Detail & Related papers (2022-12-07T09:31:25Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - TRU-NET: A Deep Learning Approach to High Resolution Prediction of Rainfall [21.399707529966474]
We present TRU-NET, an encoder-decoder model featuring a novel 2D cross attention mechanism between contiguous convolutional-recurrent layers.
We use a conditional-continuous loss function to capture the zero-skewed, extreme-event patterns of rainfall (a minimal sketch of such a loss appears after this list).
Experiments show that our model consistently attains lower RMSE and MAE scores than a DL model prevalent in short-term precipitation prediction.
arXiv Detail & Related papers (2020-08-20T17:27:59Z) - Statistical Downscaling of Temperature Distributions from the Synoptic Scale to the Mesoscale Using Deep Convolutional Neural Networks [0.0]
One of the promising applications is developing a statistical surrogate model that converts the output images of low-resolution dynamic models to high-resolution images.
Our study evaluates a surrogate model that downscales synoptic temperature fields to mesoscale temperature fields every 6 hours.
If the surrogate models are implemented at short time intervals, they will provide high-resolution weather forecast guidance or environment emergency alerts at low cost.
arXiv Detail & Related papers (2020-07-20T06:24:08Z) - VAE-LIME: Deep Generative Model Based Approach for Local Data-Driven Model Interpretability Applied to the Ironmaking Industry [70.10343492784465]
It is necessary to expose to the process engineer not only the model predictions but also their interpretability.
Model-agnostic local interpretability solutions based on LIME have recently emerged to improve the original method.
We present in this paper a novel approach, VAE-LIME, for local interpretability of data-driven models forecasting the temperature of the hot metal produced by a blast furnace.
arXiv Detail & Related papers (2020-07-15T07:07:07Z)
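As referenced in the MENA regional-adaptation entry above, here is a minimal, hedged sketch of the LoRA idea (illustrative, not that paper's code): a frozen pretrained linear layer is augmented with a trainable low-rank update, so regional fine-tuning touches only a small fraction of the parameters. The rank, scaling, and initialization below are common defaults assumed for illustration.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Wrap a frozen pretrained nn.Linear with a trainable low-rank update."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                       # keep pretrained weights frozen
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))   # zero init: no change at start
        self.scaling = alpha / rank

    def forward(self, x):
        # base output plus the low-rank correction x A^T B^T
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)
```

Wrapping, for example, the linear projections of a pretrained forecasting backbone with LoRALinear and training only A and B is what keeps regional adaptation cheap in trainable parameters and memory.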
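As referenced in the physics-constrained postprocessing entry above, here is a hedged sketch of one way to build such a constraint (an assumption for illustration, not necessarily that paper's exact formulation): the network outputs corrected temperature and dew point, and relative humidity is derived analytically from a Magnus-type saturation vapour-pressure relation, so the three quantities are thermodynamically consistent by construction.

```python
import torch
import torch.nn as nn


def saturation_vapour_pressure(t_celsius):
    # Magnus-type approximation (hPa), temperature in degrees Celsius
    return 6.112 * torch.exp(17.62 * t_celsius / (243.12 + t_celsius))


class ConsistentPostprocessor(nn.Module):
    """Predict temperature and dew point; derive relative humidity analytically."""

    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),          # corrected temperature and dew point (deg C)
        )

    def forward(self, x):
        t, td = self.net(x).unbind(-1)
        td = torch.minimum(td, t)          # dew point cannot exceed temperature
        rh = 100.0 * saturation_vapour_pressure(td) / saturation_vapour_pressure(t)
        return t, td, rh                   # rh is consistent with t and td by design
```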
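As referenced in the TRU-NET entry above, here is a hedged sketch of a conditional-continuous style loss for zero-skewed rainfall (an assumed illustration, not TRU-NET's exact formulation): a rain/no-rain probability is penalised with binary cross-entropy, and a regression term on rain amount is applied only where rain was actually observed.

```python
import torch
import torch.nn.functional as F


def conditional_continuous_loss(p_rain_logit, amount_pred, amount_obs,
                                wet_threshold: float = 0.5):
    """p_rain_logit, amount_pred, amount_obs: tensors of identical shape.
    amount_pred is assumed non-negative (e.g. a softplus output), in mm."""
    is_wet = (amount_obs > wet_threshold).float()
    # occurrence term: did it rain at all?
    occurrence = F.binary_cross_entropy_with_logits(p_rain_logit, is_wet)
    # amount term: only at wet pixels, in log space to tame the heavy skew
    wet_err = (torch.log1p(amount_pred) - torch.log1p(amount_obs)) ** 2
    amount = (wet_err * is_wet).sum() / is_wet.sum().clamp(min=1.0)
    return occurrence + amount
```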