Efficient Large-scale Nonstationary Spatial Covariance Function
Estimation Using Convolutional Neural Networks
- URL: http://arxiv.org/abs/2306.11487v1
- Date: Tue, 20 Jun 2023 12:17:46 GMT
- Title: Efficient Large-scale Nonstationary Spatial Covariance Function
Estimation Using Convolutional Neural Networks
- Authors: Pratik Nag, Yiping Hong, Sameh Abdulah, Ghulam A. Qadir, Marc G.
Genton, and Ying Sun
- Abstract summary: We use ConvNets to derive subregions from the nonstationary data.
We employ a selection mechanism to identify subregions that exhibit similar behavior to stationary fields.
We assess the performance of the proposed method with synthetic and real datasets at a large scale.
- Score: 3.5455896230714194
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spatial processes observed in various fields, such as climate and
environmental science, often occur on a large scale and demonstrate spatial
nonstationarity. Fitting a Gaussian process with a nonstationary Mat\'ern
covariance is challenging. Previous studies in the literature have tackled this
challenge by employing spatial partitioning techniques to estimate the
parameters that vary spatially in the covariance function. The selection of
partitions is an important consideration, but it is often subjective and lacks
a data-driven approach. To address this issue, in this study, we utilize the
power of Convolutional Neural Networks (ConvNets) to derive subregions from the
nonstationary data. We employ a selection mechanism to identify subregions that
exhibit similar behavior to stationary fields. To distinguish between
stationary and nonstationary random fields, we trained the ConvNet on a
variety of simulated datasets. These simulations are generated from Gaussian
processes with Mat\'ern covariance models under a wide range of parameter
settings, ensuring adequate representation of both stationary and nonstationary
spatial data. We assess the performance of the proposed method with synthetic
and real datasets at a large scale. The results revealed enhanced accuracy in
parameter estimation when relying on ConvNet-based partitioning compared to
traditional user-defined approaches.
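As a rough illustration of the pipeline described in the abstract, the sketch below simulates small Gaussian random fields with a Matérn covariance and trains a tiny ConvNet to label each field as stationary or nonstationary. It is not the authors' implementation: the grid size, the coordinate-deformation trick used to mimic nonstationarity, and the network architecture are illustrative assumptions.
```python
# A minimal sketch (not the authors' implementation): simulate small Gaussian
# random fields with a Matern covariance and train a tiny ConvNet to label each
# field as stationary vs. nonstationary. Grid size, parameter ranges, the
# deformation trick, and the network architecture are illustrative assumptions.
import numpy as np
from scipy.special import gamma, kv
import torch
import torch.nn as nn

def matern_cov(d, sigma2=1.0, rho=0.2, nu=1.0):
    """Matern covariance evaluated on a matrix of pairwise distances d."""
    d_safe = np.where(d == 0, 1e-10, d)               # avoid 0 * inf at zero lag
    scaled = np.sqrt(2 * nu) * d_safe / rho
    c = sigma2 * (2 ** (1 - nu) / gamma(nu)) * scaled ** nu * kv(nu, scaled)
    return np.where(d == 0, sigma2, c)

def simulate_field(n=16, rho=0.2, nonstationary=False, rng=None):
    """One n x n field via Cholesky; nonstationarity is mimicked with a simple
    coordinate deformation (an assumption, not the paper's construction)."""
    rng = rng if rng is not None else np.random.default_rng()
    x = np.linspace(0.0, 1.0, n)
    xx, yy = np.meshgrid(x, x)
    coords = np.column_stack([xx.ravel(), yy.ravel()])
    if nonstationary:
        coords = np.column_stack([coords[:, 0] ** 2, coords[:, 1]])
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    L = np.linalg.cholesky(matern_cov(d, rho=rho) + 1e-6 * np.eye(n * n))
    return (L @ rng.standard_normal(n * n)).reshape(n, n)

class StationarityNet(nn.Module):
    """Tiny ConvNet classifier: 0 = stationary, 1 = nonstationary."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 2))

    def forward(self, x):
        return self.net(x)

# Build a small labeled training set and run a few optimization steps.
rng = np.random.default_rng(0)
fields, labels = [], []
for _ in range(64):
    is_nonstat = bool(rng.random() < 0.5)
    fields.append(simulate_field(nonstationary=is_nonstat, rng=rng))
    labels.append(int(is_nonstat))
X = torch.tensor(np.stack(fields), dtype=torch.float32).unsqueeze(1)
y = torch.tensor(labels)

model = StationarityNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```
In the paper the trained classifier feeds a selection mechanism that keeps subregions behaving like stationary fields before local parameter estimation; here it only labels whole simulated fields.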
Related papers
- Amortized Bayesian Local Interpolation NetworK: Fast covariance parameter estimation for Gaussian Processes [0.04660328753262073]
We propose an Amortized Bayesian Local Interpolation NetworK for fast covariance parameter estimation.
The fast prediction time of these networks allows us to bypass the matrix inversion step, creating large computational speedups.
We show significant increases in computational efficiency over comparable scalable GP methodology.
arXiv Detail & Related papers (2024-11-10T01:26:16Z)
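A rough sketch of the amortized-estimation idea summarized above, under strong simplifying assumptions (1D fields, an exponential covariance, and a plain fully connected regressor rather than the paper's architecture): a network trained offline on simulated fields predicts the covariance length scale in a single forward pass, with no matrix factorization or inversion at inference time.
```python
# Amortized covariance-parameter estimation, heavily simplified: 1D fields with
# an exponential covariance (Matern with smoothness 1/2) and a small fully
# connected regressor. This is an illustration, not the paper's method.
import numpy as np
import torch
import torch.nn as nn

n, m = 64, 512                                   # grid size, number of fields
s = np.linspace(0.0, 1.0, n)
d = np.abs(s[:, None] - s[None, :])
rng = np.random.default_rng(1)

rhos = rng.uniform(0.05, 0.5, size=m)            # true length scales
fields = np.stack([
    np.linalg.cholesky(np.exp(-d / r) + 1e-8 * np.eye(n)) @ rng.standard_normal(n)
    for r in rhos])

X = torch.tensor(fields, dtype=torch.float32)
y = torch.tensor(rhos, dtype=torch.float32).unsqueeze(1)

net = nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(200):                             # amortized (offline) training
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(X), y)
    loss.backward()
    opt.step()

# A new field is handled without any covariance factorization or inversion:
test_field = np.linalg.cholesky(np.exp(-d / 0.3) + 1e-8 * np.eye(n)) @ rng.standard_normal(n)
rho_hat = net(torch.tensor(test_field, dtype=torch.float32)).item()
```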
- Spatially-Aware Diffusion Models with Cross-Attention for Global Field Reconstruction with Sparse Observations [1.371691382573869]
We develop and enhance score-based diffusion models in field reconstruction tasks.
We introduce a condition encoding approach to construct a tractable mapping between observed and unobserved regions.
We demonstrate the ability of the model to capture possible reconstructions and improve the accuracy of fused results.
arXiv Detail & Related papers (2024-08-30T19:46:23Z)
- Efficient Trajectory Inference in Wasserstein Space Using Consecutive Averaging [3.8623569699070353]
Trajectory inference deals with the challenge of reconstructing a continuous process from cross-sectional samples observed at discrete time points.
We propose methods for B-spline approximation of point clouds through consecutive averaging that is intrinsic to the Wasserstein space.
We rigorously evaluate our method by providing convergence guarantees and testing it on simulated cell data.
arXiv Detail & Related papers (2024-05-30T04:19:20Z)
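The consecutive-averaging construction above presumably replaces Euclidean averages with averages intrinsic to the Wasserstein space. The toy sketch below shows only the Euclidean version (de Casteljau evaluation by repeated pairwise averaging); the Wasserstein step and any real point-cloud data are omitted, and the control points are made up.
```python
# Euclidean-only stand-in for consecutive averaging: de Casteljau evaluation of
# a Bezier curve by repeated pairwise weighted averaging of control points.
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate the Bezier curve at parameter t by consecutive averaging."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]   # average consecutive pairs
    return pts[0]

control = [[0.0, 0.0], [1.0, 2.0], [3.0, 3.0], [4.0, 0.0]]
curve = np.array([de_casteljau(control, t) for t in np.linspace(0.0, 1.0, 50)])
```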
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
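For orientation, the sketch below shows the consensus form of the alternating direction method of multipliers on a distributed least-squares problem. It illustrates only the local-update / aggregation / dual-update structure that such distributed schemes share; it is an optimization example, not the paper's MCMC sampling algorithm, and all problem sizes are made up.
```python
# Consensus ADMM on a distributed least-squares problem (illustration only).
import numpy as np

rng = np.random.default_rng(0)
K, n, p, rho = 4, 50, 5, 1.0                  # workers, rows per worker, dim, penalty
x_true = rng.standard_normal(p)
A = [rng.standard_normal((n, p)) for _ in range(K)]
b = [A_k @ x_true + 0.1 * rng.standard_normal(n) for A_k in A]

x = [np.zeros(p) for _ in range(K)]           # local primal variables
u = [np.zeros(p) for _ in range(K)]           # scaled dual variables
z = np.zeros(p)                               # global consensus variable

for _ in range(100):
    for k in range(K):                        # local updates, run in parallel
        lhs = 2.0 * A[k].T @ A[k] + rho * np.eye(p)
        rhs = 2.0 * A[k].T @ b[k] + rho * (z - u[k])
        x[k] = np.linalg.solve(lhs, rhs)
    z = np.mean([x[k] + u[k] for k in range(K)], axis=0)   # aggregation step
    for k in range(K):                        # dual ascent step
        u[k] += x[k] - z

print(np.linalg.norm(z - x_true))             # consensus estimate error
```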
- History Matching for Geological Carbon Storage using Data-Space Inversion with Spatio-Temporal Data Parameterization [0.0]
In data-space inversion (DSI), history-matched quantities of interest are inferred directly, without constructing posterior geomodels.
This is accomplished efficiently using a set of O(1000) prior simulation results, data parameterization, and posterior sampling within a Bayesian setting.
The new parameterization uses an adversarial autoencoder (AAE) for dimension reduction and a convolutional long short-term memory (convLSTM) network to represent the spatial distribution and temporal evolution of the pressure and saturation fields.
arXiv Detail & Related papers (2023-10-05T00:50:06Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Equivariance Discovery by Learned Parameter-Sharing [153.41877129746223]
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes.
Also, we theoretically analyze the method for Gaussian data and provide a bound on the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAIN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
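A small sketch of the sample-sparsely-then-reconstruct idea behind the entry above, with assumptions: a synthetic "feature map" stands in for an expensive per-location network activation, roughly 10% of locations are sampled at random, and scipy.interpolate.griddata fills in the rest. The paper's learned stochastic sampling and reconstruction modules are not reproduced here.
```python
# Sparse sampling plus interpolation of a dense feature map (illustration only).
import numpy as np
from scipy.interpolate import griddata

h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
dense = np.sin(xx / 7.0) * np.cos(yy / 5.0)            # stand-in dense features

rng = np.random.default_rng(0)
mask = rng.random((h, w)) < 0.1                        # ~10% sampled locations
points = np.column_stack([yy[mask], xx[mask]])
values = dense[mask]

recon = griddata(points, values, (yy, xx), method="linear")
nearest = griddata(points, values, (yy, xx), method="nearest")
recon = np.where(np.isnan(recon), nearest, recon)      # fill outside convex hull
mean_abs_err = np.mean(np.abs(recon - dense))
```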
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.