ClimAlign: Unsupervised statistical downscaling of climate variables via
normalizing flows
- URL: http://arxiv.org/abs/2008.04679v1
- Date: Tue, 11 Aug 2020 13:01:53 GMT
- Title: ClimAlign: Unsupervised statistical downscaling of climate variables via
normalizing flows
- Authors: Brian Groenke, Luke Madaus, Claire Monteleoni
- Abstract summary: We present ClimAlign, a novel method for unsupervised, generative downscaling using adaptations of recent work in normalizing flows for variational inference.
We show that our method achieves comparable predictive performance to existing supervised downscaling methods while simultaneously allowing for both conditional and unconditional sampling from the joint distribution over high and low resolution spatial fields.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Downscaling is a landmark task in climate science and meteorology in which
the goal is to use coarse scale, spatio-temporal data to infer values at finer
scales. Statistical downscaling aims to approximate this task using statistical
patterns gleaned from an existing dataset of downscaled values, often obtained
from observations or physical models. In this work, we investigate the
application of deep latent variable learning to the task of statistical
downscaling. We present ClimAlign, a novel method for unsupervised, generative
downscaling using adaptations of recent work in normalizing flows for
variational inference. We evaluate the viability of our method using several
different metrics on two datasets consisting of daily temperature and
precipitation values gridded at low (1 degree latitude/longitude) and high (1/4
and 1/8 degree) resolutions. We show that our method achieves comparable
predictive performance to existing supervised statistical downscaling methods
while simultaneously allowing for both conditional and unconditional sampling
from the joint distribution over high and low resolution spatial fields. We
provide publicly accessible implementations of our method, as well as the
baselines used for comparison, on GitHub.
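Normalizing flows, the core ingredient of the method above, get their exact likelihood from the change-of-variables formula: an invertible map sends data to a simple base density, and the log-determinant of its Jacobian corrects the volume change. A minimal one-dimensional sketch of that idea (an affine flow with a standard normal base; this is an illustration only, not the ClimAlign model, and the function names are invented here):

```python
import numpy as np

# Change-of-variables for an invertible affine map z = (x - b) / a:
#   log p(x) = log N(z; 0, 1) + log |dz/dx|,  with log |dz/dx| = -log|a|

def affine_flow_logpdf(x, a, b):
    """Exact log-density of x under a 1-D affine flow with N(0,1) base."""
    z = (x - b) / a                                  # forward pass: data -> latent
    log_base = -0.5 * (z ** 2 + np.log(2 * np.pi))   # standard normal log-density
    log_det = -np.log(np.abs(a))                     # log |dz/dx|
    return log_base + log_det

def affine_flow_sample(n, a, b, rng=None):
    """Unconditional sampling: draw z ~ N(0,1), then invert the flow."""
    rng = np.random.default_rng(rng)
    z = rng.standard_normal(n)
    return a * z + b                                 # inverse pass: latent -> data

# With a = 2, b = 1 this flow's density equals N(x; mean=1, var=4).
x = np.array([1.0, 3.0])
print(affine_flow_logpdf(x, a=2.0, b=1.0))
```

In deep flows the affine map is replaced by a stack of learned invertible layers, but the training objective is still this exact log-likelihood, which is what makes both density evaluation and sampling tractable in one model.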
Related papers
- Causal Representation Learning in Temporal Data via Single-Parent Decoding [66.34294989334728]
Scientific research often seeks to understand the causal structure underlying high-level variables in a system.
Scientists typically collect low-level measurements, such as geographically distributed temperature readings.
We propose a differentiable method, Causal Discovery with Single-parent Decoding, that simultaneously learns the underlying latents and a causal graph over them.
arXiv Detail & Related papers (2024-10-09T15:57:50Z)
- Climate Variable Downscaling with Conditional Normalizing Flows [21.2670980628433]
We apply conditional normalizing flows to the task of climate variable downscaling.
We show that the method allows us to assess the predictive uncertainty in terms of standard deviation from the fitted conditional distribution mean.
arXiv Detail & Related papers (2024-05-31T09:20:33Z)
- Observation-Guided Meteorological Field Downscaling at Station Scale: A Benchmark and a New Method [66.80344502790231]
We extend meteorological downscaling to arbitrary scattered station scales and establish a new benchmark and dataset.
Inspired by data assimilation techniques, we integrate observational data into the downscaling process, providing multi-scale observational priors.
Our proposed method outperforms other specially designed baseline models on multiple surface variables.
arXiv Detail & Related papers (2024-01-22T14:02:56Z)
- Precipitation Downscaling with Spatiotemporal Video Diffusion [19.004369237435437]
This work extends recent video diffusion models to precipitation super-resolution.
We use a deterministic downscaler followed by a temporally-conditioned diffusion model to capture noise characteristics and high-frequency patterns.
Our analysis, covering CRPS, MSE, precipitation distributions, and qualitative evaluation over California and the Himalayas, establishes our method as a new standard for data-driven precipitation downscaling.
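CRPS (continuous ranked probability score) rewards probabilistic forecasts that are both sharp and well-calibrated. As a rough illustration, here is the standard ensemble estimator of CRPS for a single observation (a common textbook form, not necessarily the exact evaluation code used in the paper above):

```python
import numpy as np

# Ensemble CRPS estimator for members x_1..x_m and observation y:
#   CRPS ≈ mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|
# The first term penalizes distance from the observation; the second
# rewards ensemble spread, so a point forecast reduces to absolute error.

def ensemble_crps(ensemble, obs):
    x = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# A sharp, well-centred ensemble scores lower (better) than a biased one.
print(ensemble_crps([0.9, 1.0, 1.1], obs=1.0))
print(ensemble_crps([2.9, 3.0, 3.1], obs=1.0))
```

Lower CRPS is better; for a single-member "ensemble" the score collapses to the mean absolute error, which is why it is a natural generalization of MSE-style metrics to probabilistic downscaling.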
arXiv Detail & Related papers (2023-12-11T02:38:07Z)
- ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z)
- Beyond Ensemble Averages: Leveraging Climate Model Ensembles for Subseasonal Forecasting [10.083361616081874]
This study explores an application of machine learning (ML) models as post-processing tools for subseasonal forecasting.
Lagged numerical ensemble forecasts and observational data, including relative humidity, pressure at sea level, and geopotential height, are incorporated into various ML methods.
For regression, quantile regression, and tercile classification tasks, we consider using linear models, random forests, convolutional neural networks, and stacked models.
arXiv Detail & Related papers (2022-11-29T01:11:04Z)
- Hard-Constrained Deep Learning for Climate Downscaling [30.280862393706542]
High-resolution climate and weather data is important to inform long-term decisions on climate adaptation and mitigation.
Forecasting models are limited by computational costs and, therefore, often generate coarse-resolution predictions.
Statistical downscaling, including super-resolution methods from deep learning, can provide an efficient method of upsampling low-resolution data.
arXiv Detail & Related papers (2022-08-08T16:54:01Z)
- Fake It Till You Make It: Near-Distribution Novelty Detection by Score-Based Generative Models [54.182955830194445]
Existing models either fail or face a dramatic drop under the so-called "near-distribution" setting.
We propose to exploit a score-based generative model to produce synthetic near-distribution anomalous data.
Our method improves the near-distribution novelty detection by 6% and passes the state-of-the-art by 1% to 5% across nine novelty detection benchmarks.
arXiv Detail & Related papers (2022-05-28T02:02:53Z)
- X-model: Improving Data Efficiency in Deep Learning with A Minimax Model [78.55482897452417]
We aim at improving data efficiency for both classification and regression setups in deep learning.
To harness the power of both worlds, we propose a novel X-model.
X-model plays a minimax game between the feature extractor and task-specific heads.
arXiv Detail & Related papers (2021-10-09T13:56:48Z)
- Anomaly Detection at Scale: The Case for Deep Distributional Time Series Models [14.621700495712647]
The main novelty of our approach is that instead of modeling time series of real values (or vectors of real values), we model time series of probability distributions over real values (or vectors).
Our method is amenable to streaming anomaly detection and scales to monitoring for anomalies on millions of time series.
We show that we outperform popular open-source anomaly detection tools by up to 17% average improvement for a real-world data set.
arXiv Detail & Related papers (2020-07-30T15:48:55Z)
- Variable Skipping for Autoregressive Range Density Estimation [84.60428050170687]
We present a technique, variable skipping, for accelerating range density estimation over deep autoregressive models.
We show that variable skipping provides 10-100x efficiency improvements when targeting challenging high-quantile error metrics.
arXiv Detail & Related papers (2020-07-10T19:01:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.