Multi-model Ensemble Analysis with Neural Network Gaussian Processes
- URL: http://arxiv.org/abs/2202.04152v4
- Date: Tue, 11 Apr 2023 01:44:21 GMT
- Title: Multi-model Ensemble Analysis with Neural Network Gaussian Processes
- Authors: Trevor Harris, Bo Li, Ryan Sriver
- Abstract summary: Multi-model ensemble analysis integrates information from multiple climate models into a unified projection.
We propose a statistical approach, called NN-GPR, using a deep neural network based covariance function.
Experiments show that NN-GPR can be highly skillful at surface temperature and precipitation forecasting.
- Score: 5.975698284186638
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multi-model ensemble analysis integrates information from multiple climate
models into a unified projection. However, existing integration approaches
based on model averaging can dilute fine-scale spatial information and incur
bias from rescaling low-resolution climate models. We propose a statistical
approach, called NN-GPR, using Gaussian process regression (GPR) with an
infinitely wide deep neural network based covariance function. NN-GPR requires
no assumptions about the relationships between models, no interpolation to a
common grid, no stationarity assumptions, and automatically downscales as part
of its prediction algorithm. Model experiments show that NN-GPR can be highly
skillful at surface temperature and precipitation forecasting by preserving
geospatial signals at multiple scales and capturing inter-annual variability.
Our projections particularly show improved accuracy and uncertainty
quantification skill in regions of high variability, which allows us to cheaply
assess tail behavior at a 0.44$^\circ$/50 km spatial resolution without a
regional climate model (RCM). Evaluations on reanalysis data and SSP245 forced
climate models show that NN-GPR produces similar overall climatologies to the
model ensemble while better capturing fine-scale spatial patterns. Finally, we
compare NN-GPR's regional predictions against two RCMs and show that NN-GPR can
rival the performance of RCMs using only global model data as input.
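The core idea of NN-GPR, Gaussian process regression whose covariance is that of an infinitely wide deep network, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it stands in for the NNGP covariance with the order-1 arc-cosine kernel (the exact covariance of a one-hidden-layer infinite-width ReLU network), and the coordinates, noise level, and function names are all assumptions.

```python
import numpy as np

def arccos_kernel(X, Z):
    # Order-1 arc-cosine kernel: the NNGP covariance of an
    # infinitely wide single-hidden-layer ReLU network.
    nx = np.linalg.norm(X, axis=1)[:, None]
    nz = np.linalg.norm(Z, axis=1)[None, :]
    cos = np.clip(X @ Z.T / (nx * nz), -1.0, 1.0)
    theta = np.arccos(cos)
    return nx * nz / np.pi * (np.sin(theta) + (np.pi - theta) * cos)

def gpr_predict(X_train, y_train, X_test, noise=1e-2):
    # Standard GP regression posterior mean and (diagonal) variance.
    K = arccos_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = arccos_kernel(X_test, X_train)
    Kss = arccos_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                     # coarse-grid locations (synthetic)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)  # synthetic field values
Xq = rng.normal(size=(5, 2))                     # fine-grid query points
mu, var = gpr_predict(X, y, Xq)
```

Because the GP posterior is defined at any query coordinate, predicting on a finer grid than the training models were run on is what yields the automatic downscaling described above.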
Related papers
- On the Convergence of (Stochastic) Gradient Descent for Kolmogorov--Arnold Networks [56.78271181959529]
Kolmogorov--Arnold Networks (KANs) have gained significant attention in the deep learning community.
Empirical investigations demonstrate that KANs optimized via stochastic gradient descent (SGD) are capable of achieving near-zero training loss.
arXiv Detail & Related papers (2024-10-10T15:34:10Z)
- Neural networks for geospatial data [0.0]
NN-GLS is a new neural network estimation algorithm for the non-linear mean in GP models.
We show that NN-GLS admits a representation as a special type of graph neural network (GNN)
Theoretically, we show that NN-GLS will be consistent for irregularly observed spatially correlated data processes.
arXiv Detail & Related papers (2023-04-18T17:52:23Z)
- Satellite Anomaly Detection Using Variance Based Genetic Ensemble of Neural Networks [7.848121055546167]
We use an efficient ensemble of the predictions from multiple Recurrent Neural Networks (RNNs)
For prediction, each RNN is guided by a Genetic Algorithm (GA) which constructs the optimal structure for each RNN model.
This paper uses the Monte Carlo (MC) dropout as an approximation version of BNNs.
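Monte Carlo dropout, used in the paper above as an approximation to a Bayesian neural network, amounts to keeping dropout active at inference and averaging over stochastic forward passes. A minimal sketch with assumed, untrained weights (the paper applies the idea inside GA-tuned RNNs, not this toy feed-forward net):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed weights for a tiny two-layer regression net.
W1 = rng.normal(size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1)); b2 = np.zeros(1)

def forward(x, drop_rate=0.2):
    # One stochastic pass: a fresh dropout mask is drawn each call,
    # which is what makes each pass an approximate posterior sample.
    h = np.maximum(x @ W1 + b1, 0.0)
    mask = rng.random(h.shape) > drop_rate
    h = h * mask / (1.0 - drop_rate)      # inverted-dropout scaling
    return h @ W2 + b2

def mc_dropout_predict(x, T=200):
    # Predictive mean and spread over T stochastic forward passes.
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = rng.normal(size=(4, 3))               # a small batch of inputs
mean, std = mc_dropout_predict(x)
```

The per-input standard deviation is the cheap uncertainty estimate that would otherwise require a full Bayesian neural network.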
arXiv Detail & Related papers (2023-02-10T22:09:00Z)
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine temporal and spatial resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
- A new perspective on probabilistic image modeling [92.89846887298852]
We present a new probabilistic approach for image modeling capable of density estimation, sampling and tractable inference.
DCGMMs can be trained end-to-end by SGD from random initial conditions, much like CNNs.
We show that DCGMMs compare favorably to several recent PC and SPN models in terms of inference, classification and sampling.
arXiv Detail & Related papers (2022-03-21T14:53:57Z)
- Deep Learning Based Cloud Cover Parameterization for ICON [55.49957005291674]
We train NN based cloud cover parameterizations with coarse-grained data based on realistic regional and global ICON simulations.
Globally trained NNs can reproduce sub-grid scale cloud cover of the regional simulation.
We identify an overemphasis on specific humidity and cloud ice as the reason why our column-based NN cannot perfectly generalize from the global to the regional coarse-grained data.
arXiv Detail & Related papers (2021-12-21T16:10:45Z)
- Crime Prediction with Graph Neural Networks and Multivariate Normal Distributions [18.640610803366876]
We tackle the sparsity problem in high resolution by leveraging the flexible structure of graph convolutional networks (GCNs)
We build our model with Graph Convolutional Gated Recurrent Units (Graph-ConvGRU) to learn spatial, temporal, and categorical relations.
We show that our model is not only generative but also precise.
arXiv Detail & Related papers (2021-11-29T17:37:01Z)
- Information Theoretic Structured Generative Modeling [13.117829542251188]
A novel generative model framework called the structured generative model (SGM) is proposed that makes straightforward optimization possible.
The implementation employs a single neural network driven by an orthonormal input and a single white noise source, adapted to learn an infinite Gaussian mixture model.
Preliminary results show that SGM significantly improves MINE estimation in terms of data efficiency and variance, and outperforms conventional and variational Gaussian mixture models as well as adversarial network training.
arXiv Detail & Related papers (2021-10-12T07:44:18Z)
- Post-mortem on a deep learning contest: a Simpson's paradox and the complementary roles of scale metrics versus shape metrics [61.49826776409194]
We analyze a corpus of models made publicly-available for a contest to predict the generalization accuracy of neural network (NN) models.
We identify what amounts to a Simpson's paradox: "scale" metrics perform well overall but poorly on sub-partitions of the data.
We present two novel shape metrics, one data-independent, and the other data-dependent, which can predict trends in the test accuracy of a series of NNs.
arXiv Detail & Related papers (2021-06-01T19:19:49Z)
- Bayesian Graph Neural Networks with Adaptive Connection Sampling [62.51689735630133]
We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs)
The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs.
arXiv Detail & Related papers (2020-06-07T07:06:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.