Spatio-temporal DeepKriging for Interpolation and Probabilistic
Forecasting
- URL: http://arxiv.org/abs/2306.11472v1
- Date: Tue, 20 Jun 2023 11:51:44 GMT
- Title: Spatio-temporal DeepKriging for Interpolation and Probabilistic
Forecasting
- Authors: Pratik Nag, Ying Sun, Brian J Reich
- Abstract summary: We propose a deep neural network (DNN) based two-stage model for spatio-temporal interpolation and forecasting.
We adopt a quantile-based loss function in the DNN to provide probabilistic forecasting.
The method is suitable for large-scale prediction of complex spatio-temporal processes.
- Score: 2.494500339152185
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian processes (GP) and Kriging are widely used in traditional
spatio-temporal modelling and prediction. These techniques typically
presuppose that the data are observed from a stationary GP with parametric
covariance structure. However, processes in real-world applications often
exhibit non-Gaussianity and nonstationarity. Moreover, likelihood-based
inference for GPs is computationally expensive and thus prohibitive for large
datasets. In this paper we propose a deep neural network (DNN) based two-stage
model for spatio-temporal interpolation and forecasting. Interpolation is
performed in the first step, which utilizes a dependent DNN with the embedding
layer constructed with spatio-temporal basis functions. For the second stage,
we use Long Short-Term Memory (LSTM) and convolutional LSTM to forecast future
observations at a given location. We adopt the quantile-based loss function in
the DNN to provide probabilistic forecasting. Compared to Kriging, the proposed
method does not require specifying covariance functions or making stationarity
assumption, and is computationally efficient. Therefore, it is suitable for
large-scale prediction of complex spatio-temporal processes. We apply our
method to monthly $PM_{2.5}$ data at more than $200,000$ space-time locations
from January 1999 to December 2022 for fast imputation of missing values and
forecasts with uncertainties.
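The two ingredients the abstract names, an embedding layer built from spatio-temporal basis functions and a quantile-based loss, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian radial basis functions, the knot placement, and the bandwidth parameter are standard choices assumed here for concreteness.

```python
import numpy as np

def st_basis_embedding(coords, knots, bandwidth=1.0):
    """Embed space-time points with Gaussian radial basis functions.

    coords: (n, 3) array of (lon, lat, time) points.
    knots:  (k, 3) array of basis-function centers.
    Returns an (n, k) feature matrix that a downstream DNN consumes
    in place of raw coordinates.
    """
    # Squared distance from each point to each knot, via broadcasting.
    d2 = ((coords[:, None, :] - knots[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss for quantile level tau in (0, 1).

    Minimizing this loss over predictions drives y_pred toward the
    tau-th conditional quantile of y_true; training one output head
    per tau yields predictive quantiles, i.e. probabilistic forecasts.
    """
    err = y_true - y_pred
    return np.mean(np.maximum(tau * err, (tau - 1.0) * err))
```

For example, fitting heads at tau = 0.05, 0.5, and 0.95 gives a median forecast together with a 90% prediction interval, which is how a quantile loss turns a point-forecasting network into a probabilistic one.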
Related papers
- Amortized Bayesian Local Interpolation NetworK: Fast covariance parameter estimation for Gaussian Processes [0.04660328753262073]
We propose an Amortized Bayesian Local Interpolation NetworK for fast covariance parameter estimation.
The fast prediction time of these networks allows us to bypass the matrix inversion step, creating large computational speedups.
We show significant increases in computational efficiency over comparable scalable GP methodology.
arXiv Detail & Related papers (2024-11-10T01:26:16Z)
- Efficient Bayesian Learning Curve Extrapolation using Prior-Data Fitted Networks [44.294078238444996]
We describe the first application of prior-data fitted neural networks (PFNs) in this context.
We demonstrate that LC-PFN can approximate the posterior predictive distribution more accurately than MCMC.
We also show that the same LC-PFN achieves competitive performance extrapolating a total of 20,000 real learning curves.
arXiv Detail & Related papers (2023-10-31T13:30:30Z)
- Bivariate DeepKriging for Large-scale Spatial Interpolation of Wind Fields [2.586710925821896]
High spatial resolution wind data are essential for a wide range of applications in climate, oceanographic and meteorological studies.
Large-scale spatial computation or downscaling of bivariate wind fields having velocity in two dimensions is a challenging task.
In this paper, we propose a method, called bivariate DeepKriging, which is a spatially dependent deep neural network (DNN) with an embedding layer constructed by spatial radial basis functions.
We demonstrate the computational efficiency and scalability of the proposed DNN model, with computations that are, on average, 20 times faster than those of conventional techniques.
arXiv Detail & Related papers (2023-07-16T13:34:44Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields improved estimation of pattern latency compared to the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Combining Pseudo-Point and State Space Approximations for Sum-Separable Gaussian Processes [48.64129867897491]
We show that there is a simple and elegant way to combine pseudo-point methods with the state space GP approximation framework to get the best of both worlds.
We demonstrate that the combined approach is more scalable and applicable to a greater range of spatio-temporal problems than either method on its own.
arXiv Detail & Related papers (2021-06-18T16:30:09Z)
- Learning Functional Priors and Posteriors from Data and Physics [3.537267195871802]
We develop a new framework based on deep neural networks to be able to extrapolate in space-time using historical data.
We employ the physics-informed Generative Adversarial Networks (PI-GAN) to learn a functional prior.
At the second stage, we employ the Hamiltonian Monte Carlo (HMC) method to estimate the posterior in the latent space of PI-GANs.
arXiv Detail & Related papers (2021-06-08T03:03:24Z)
- Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit [47.324627920761685]
We use recent theoretical advances that characterize the function-space prior of an ensemble of infinitely-wide NNs as a Gaussian process.
This gives us a better understanding of the implicit prior NNs place on function space.
We also examine the calibration of previous approaches to classification with the NNGP.
arXiv Detail & Related papers (2020-10-14T18:41:54Z)
- Improving predictions of Bayesian neural nets via local linearization [79.21517734364093]
We argue that the Gauss-Newton approximation should be understood as a local linearization of the underlying Bayesian neural network (BNN).
Because we use this linearized model for posterior inference, we should also predict using this modified model instead of the original one.
We refer to this modified predictive as "GLM predictive" and show that it effectively resolves common underfitting problems of the Laplace approximation.
arXiv Detail & Related papers (2020-08-19T12:35:55Z)
- DeepKriging: Spatially Dependent Deep Neural Networks for Spatial Prediction [2.219504240642369]
In spatial statistics, a common objective is to predict values of a spatial process at unobserved locations by exploiting spatial dependence.
The DeepKriging method has a direct link to Kriging in the Gaussian case, and it has multiple advantages over Kriging for non-Gaussian and non-stationary data.
We apply the method to predicting PM2.5 concentrations across the continental United States.
arXiv Detail & Related papers (2020-07-23T12:38:53Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.