EnScale: Temporally-consistent multivariate generative downscaling via proper scoring rules
- URL: http://arxiv.org/abs/2509.26258v1
- Date: Tue, 30 Sep 2025 13:46:14 GMT
- Title: EnScale: Temporally-consistent multivariate generative downscaling via proper scoring rules
- Authors: Maybritt Schillinger, Maxim Samarin, Xinwei Shen, Reto Knutti, Nicolai Meinshausen,
- Abstract summary: We introduce EnScale, a generative machine learning framework that emulates the full GCM-to-RCM map. Compared to state-of-the-art ML downscaling approaches, our setup reduces computational cost by about one order of magnitude.
- Score: 2.009844446990157
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The practical use of future climate projections from global circulation models (GCMs) is often limited by their coarse spatial resolution, requiring downscaling to generate high-resolution data. Regional climate models (RCMs) provide this refinement, but are computationally expensive. To address this issue, machine learning models can learn the downscaling function, mapping coarse GCM outputs to high-resolution fields. Among these, generative approaches aim to capture the full conditional distribution of RCM data given coarse-scale GCM data, which is characterized by large variability and thus challenging to model accurately. We introduce EnScale, a generative machine learning framework that emulates the full GCM-to-RCM map by training on multiple pairs of GCM and corresponding RCM data. It first adjusts large-scale mismatches between GCM and coarsened RCM data, followed by a super-resolution step to generate high-resolution fields. Both steps employ generative models optimized with the energy score, a proper scoring rule. Compared to state-of-the-art ML downscaling approaches, our setup reduces computational cost by about one order of magnitude. EnScale jointly emulates multiple variables -- temperature, precipitation, solar radiation, and wind -- spatially consistent over an area in Central Europe. In addition, we propose a variant EnScale-t that enables temporally consistent downscaling. We establish a comprehensive evaluation framework across various categories including calibration, spatial structure, extremes, and multivariate dependencies. Comparison with diverse benchmarks demonstrates EnScale's strong performance and computational efficiency. EnScale offers a promising approach for accurate and temporally consistent RCM emulation.
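Both generative steps in EnScale are optimized with the energy score, a proper scoring rule that rewards ensembles that are simultaneously close to the observed field and appropriately spread. As a rough illustration only (this is not the paper's implementation; the function name, the `beta` parameter, and the NumPy formulation are our own assumptions), a standard sample-based estimator of the energy score can be sketched as:

```python
import numpy as np

def energy_score(ensemble, obs, beta=1.0):
    """Sample-based energy score for a single observation.

    ensemble: (m, d) array of m generated samples (e.g. flattened fields)
    obs:      (d,) observed high-resolution field, flattened
    Lower is better; the score is a proper scoring rule for beta in (0, 2).
    """
    ensemble = np.asarray(ensemble, dtype=float)
    obs = np.asarray(obs, dtype=float)
    # Term 1: mean distance between each sample and the observation (accuracy)
    term1 = np.mean(np.linalg.norm(ensemble - obs, axis=1) ** beta)
    # Term 2: mean pairwise distance between samples (rewards ensemble spread)
    diffs = ensemble[:, None, :] - ensemble[None, :, :]
    term2 = np.mean(np.linalg.norm(diffs, axis=-1) ** beta)
    return term1 - 0.5 * term2
```

In a training loop, the `ensemble` would be a batch of generator outputs conditioned on the same coarse input, and the score would be averaged over observations and minimized by gradient descent (with the NumPy norms replaced by their autodiff-framework equivalents).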
Related papers
- FedRAIN-Lite: Federated Reinforcement Algorithms for Improving Idealised Numerical Weather and Climate Models [0.5131152350448099]
Sub-grid parameterisations in climate models are traditionally static and tuned offline. FedRAIN-Lite is a framework that mirrors the spatial decomposition used in general circulation models. Deep Deterministic Policy Gradient consistently outperforms both static and single-agent baselines.
arXiv Detail & Related papers (2025-08-19T23:54:13Z) - Vision Transformers for Multi-Variable Climate Downscaling: Emulating Regional Climate Models with a Shared Encoder and Multi-Decoder Architecture [0.0]
We propose a multi-task, multi-variable Vision Transformer (ViT) architecture with a shared encoder and variable-specific decoders (1EMD). We show that our multi-variable approach achieves positive cross-variable knowledge transfer and consistently outperforms single-variable baselines trained under identical conditions.
arXiv Detail & Related papers (2025-06-12T11:48:41Z) - ACDC: Autoregressive Coherent Multimodal Generation using Diffusion Correction [55.03585818289934]
Autoregressive models (ARMs) and diffusion models (DMs) represent two leading paradigms in generative modeling.
We introduce Autoregressive Coherent multimodal generation with Diffusion Correction (ACDC)
ACDC combines the strengths of both ARMs and DMs at the inference stage without the need for additional fine-tuning.
arXiv Detail & Related papers (2024-10-07T03:22:51Z) - Clusterpath Gaussian Graphical Modeling [0.0]
We introduce the Clusterpath estimator of the Gaussian Graphical Model (CGGM). We group variables together, which in turn results in a block-structured precision matrix. We show that CGGM not only matches, but oftentimes outperforms, other state-of-the-art methods for variable clustering in graphical models.
arXiv Detail & Related papers (2024-06-30T10:11:18Z) - Gaussian Ensemble Belief Propagation for Efficient Inference in High-Dimensional Systems [3.6773638205393198]
Efficient inference in high-dimensional models is a central challenge in machine learning. We introduce the Gaussian Ensemble Belief Propagation (GEnBP) algorithm. We show that GEnBP outperforms existing belief propagation methods in terms of accuracy and computational efficiency.
arXiv Detail & Related papers (2024-02-13T03:31:36Z) - Transferability and explainability of deep learning emulators for regional climate model projections: Perspectives for future applications [0.4821250031784094]
Regional climate models (RCMs) are essential tools for simulating and studying regional climate variability and change.
Deep learning models have been introduced as a cost-effective and promising alternative that requires only short RCM simulations to train the models.
This paper considers the two different emulation approaches proposed in the literature (PP and MOS)
We find that both approaches can emulate certain climatological properties of RCMs for different periods and scenarios (soft transferability), but the consistency of the emulation functions differs between approaches.
arXiv Detail & Related papers (2023-11-01T00:44:39Z) - RGM: A Robust Generalizable Matching Model [49.60975442871967]
We propose a deep model for sparse and dense matching, termed RGM (Robust Generalist Matching)
To narrow the gap between synthetic training samples and real-world scenarios, we build a new, large-scale dataset with sparse correspondence ground truth.
We are able to mix up various dense and sparse matching datasets, significantly improving the training diversity.
arXiv Detail & Related papers (2023-10-18T07:30:08Z) - ECoFLaP: Efficient Coarse-to-Fine Layer-Wise Pruning for Vision-Language Models [70.45441031021291]
Large Vision-Language Models (LVLMs) can understand the world comprehensively by integrating rich information from different modalities.
However, LVLMs are often costly to deploy due to their massive computational and energy demands and the associated carbon footprint.
We propose Efficient Coarse-to-Fine LayerWise Pruning (ECoFLaP), a two-stage coarse-to-fine weight pruning approach for LVLMs.
arXiv Detail & Related papers (2023-10-04T17:34:00Z) - Scaling Structured Inference with Randomization [64.18063627155128]
We propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation, so it can be integrated seamlessly with neural networks.
arXiv Detail & Related papers (2021-12-07T11:26:41Z) - Coded Stochastic ADMM for Decentralized Consensus Optimization with Edge Computing [113.52575069030192]
Big data, including applications with high security requirements, are often collected and stored on multiple heterogeneous devices, such as mobile devices, drones and vehicles.
Due to the limitations of communication costs and security requirements, it is of paramount importance to extract information in a decentralized manner instead of aggregating data to a fusion center.
We consider the problem of learning model parameters in a multi-agent system with data locally processed via distributed edge nodes.
A class of mini-batch alternating direction method of multipliers (ADMM) algorithms is explored to develop the distributed learning model.
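The global-consensus ADMM template behind such methods can be illustrated on a toy problem in which each agent holds a quadratic local objective, so the consensus solution is simply the average of the agents' local data. This is a generic sketch of consensus ADMM under our own assumptions (it is not the coded, mini-batch variant studied in the paper; the function name and parameters are illustrative):

```python
import numpy as np

def consensus_admm_mean(local_data, rho=1.0, iters=200):
    """Global-consensus ADMM where agent i holds f_i(x) = 0.5*||x - a_i||^2.

    With these quadratic objectives the consensus solution is the average
    of the local vectors, which makes the decentralized mechanics easy to
    verify. local_data: (n_agents, d) array. Returns the consensus variable z.
    """
    a = np.asarray(local_data, dtype=float)
    n, d = a.shape
    x = np.zeros((n, d))   # local primal variables, one per agent
    u = np.zeros((n, d))   # scaled dual variables for the constraints x_i = z
    z = np.zeros(d)        # shared consensus variable
    for _ in range(iters):
        # Local x-update: closed form for the quadratic f_i (done in parallel)
        x = (a + rho * (z - u)) / (1.0 + rho)
        # Consensus z-update: average of local estimates plus duals
        z = np.mean(x + u, axis=0)
        # Dual ascent on the consensus constraint x_i = z
        u = u + x - z
    return z
```

Only the z-update requires aggregation across agents; the x- and u-updates are purely local, which is what makes the scheme attractive for edge computing.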
arXiv Detail & Related papers (2020-10-02T10:41:59Z) - Crowd Counting via Hierarchical Scale Recalibration Network [61.09833400167511]
We propose a novel Hierarchical Scale Recalibration Network (HSRNet) to tackle the task of crowd counting.
HSRNet models rich contextual dependencies and recalibrates multi-scale information.
Our approach can ignore various noises selectively and focus on appropriate crowd scales automatically.
arXiv Detail & Related papers (2020-03-07T10:06:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.