Bayesian Complementary Kernelized Learning for Multidimensional
Spatiotemporal Data
- URL: http://arxiv.org/abs/2208.09978v2
- Date: Tue, 30 May 2023 20:29:54 GMT
- Title: Bayesian Complementary Kernelized Learning for Multidimensional
Spatiotemporal Data
- Authors: Mengying Lei, Aurelie Labbe, Lijun Sun
- Abstract summary: We propose a new statistical framework -- Complementary Complementary Kernelized Learning (BCKL)
BCKL offers superior performance in providing accurate posterior mean and high-quality uncertainty estimates.
- Score: 11.763229353978321
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic modeling of multidimensional spatiotemporal data is critical to
many real-world applications. Because real-world spatiotemporal data often exhibit
complex dependencies that are nonstationary and nonseparable, it is challenging to
develop effective and computationally efficient statistical models that accommodate
such nonstationary/nonseparable processes, which contain both long-range and
short-scale variations, in particular for large-scale datasets with various
corruption/missing structures. In this paper, we propose a new
statistical framework -- Bayesian Complementary Kernelized Learning (BCKL) --
to achieve scalable probabilistic modeling for multidimensional spatiotemporal
data. To effectively characterize complex dependencies, BCKL integrates two
complementary approaches -- kernelized low-rank tensor factorization and
short-range spatiotemporal Gaussian Processes. Specifically, we use a
multi-linear low-rank factorization component to capture the global/long-range
correlations in the data and introduce an additive short-scale GP based on
compactly supported kernel functions to characterize the remaining local
variabilities. We develop an efficient Markov chain Monte Carlo (MCMC)
algorithm for model inference and evaluate the proposed BCKL framework on both
synthetic and real-world spatiotemporal datasets. Our experiment results show
that BCKL offers superior performance in providing accurate posterior mean and
high-quality uncertainty estimates, confirming the importance of both global
and local components in modeling spatiotemporal data.
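For intuition only, here is a minimal generative sketch of the additive structure described in the abstract: a rank-R multi-linear (matrix) factorization for the global component, plus a short-scale GP with a compactly supported (Wendland-type) kernel, plus observation noise. The rank, kernel form, and all names are illustrative assumptions rather than the authors' implementation, and the MCMC inference step is omitted.

```python
import numpy as np

def global_low_rank(U, V):
    """Global/long-range component: a rank-R factorization U V^T."""
    return U @ V.T  # shape (n_space, n_time)

def wendland_c2(d, radius):
    """A compactly supported (Wendland-type) correlation: exactly zero beyond `radius`."""
    r = np.clip(d / radius, 0.0, 1.0)
    return (1.0 - r) ** 4 * (4.0 * r + 1.0)

def local_gp_sample(coords_s, coords_t, radius_s, radius_t, rng):
    """Short-scale component: one draw from a GP whose covariance is a separable
    product of compactly supported kernels over space and time (an assumption;
    the paper's local kernel may differ in form)."""
    Ks = wendland_c2(np.abs(coords_s[:, None] - coords_s[None, :]), radius_s)
    Kt = wendland_c2(np.abs(coords_t[:, None] - coords_t[None, :]), radius_t)
    K = np.kron(Ks, Kt) + 1e-6 * np.eye(len(coords_s) * len(coords_t))
    z = rng.multivariate_normal(np.zeros(K.shape[0]), K)
    return z.reshape(len(coords_s), len(coords_t))

rng = np.random.default_rng(0)
n_s, n_t, R = 30, 50, 3
U = rng.normal(size=(n_s, R))
V = rng.normal(size=(n_t, R))
coords_s = np.linspace(0.0, 1.0, n_s)
coords_t = np.linspace(0.0, 1.0, n_t)

Y = (global_low_rank(U, V)                                   # long-range structure
     + local_gp_sample(coords_s, coords_t, 0.1, 0.05, rng)   # short-scale residual GP
     + 0.1 * rng.normal(size=(n_s, n_t)))                    # observation noise
```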
Related papers
- TS-CausalNN: Learning Temporal Causal Relations from Non-linear Non-stationary Time Series Data [0.42156176975445486]
We propose a Time-Series Causal Neural Network (TS-CausalNN) to discover contemporaneous and lagged causal relations simultaneously.
In addition to the simple parallel design, an advantage of the proposed model is that it naturally handles the non-stationarity and non-linearity of the data.
arXiv Detail & Related papers (2024-04-01T20:33:29Z)
- Latent Semantic Consensus For Deterministic Geometric Model Fitting [109.44565542031384]
We propose an effective method called Latent Semantic Consensus (LSC)
LSC formulates the model fitting problem into two latent semantic spaces based on data points and model hypotheses.
LSC is able to provide consistent and reliable solutions within only a few milliseconds for general multi-structural model fitting.
arXiv Detail & Related papers (2024-03-11T05:35:38Z)
- Sample Complexity Characterization for Linear Contextual MDPs [67.79455646673762]
Contextual Markov decision processes (CMDPs) describe a class of reinforcement learning problems in which the transition kernels and reward functions can change over time, with different MDPs indexed by a context variable.
CMDPs serve as an important framework to model many real-world applications with time-varying environments.
We study CMDPs under two linear function approximation models: Model I with context-varying representations and common linear weights for all contexts; and Model II with common representations for all contexts and context-varying linear weights.
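Schematically, under a standard linear-MDP parameterization the two settings can be read as below; the notation is an illustrative assumption, not taken from the paper.

```latex
% Model I: context-varying representation \phi_c, weights (\mu, \theta) shared across contexts
P_c(s' \mid s, a) = \phi_c(s, a)^\top \mu(s'), \qquad r_c(s, a) = \phi_c(s, a)^\top \theta
% Model II: shared representation \phi, context-varying weights (\mu_c, \theta_c)
P_c(s' \mid s, a) = \phi(s, a)^\top \mu_c(s'), \qquad r_c(s, a) = \phi(s, a)^\top \theta_c
```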
arXiv Detail & Related papers (2024-02-05T03:25:04Z)
- Perceiver-based CDF Modeling for Time Series Forecasting [25.26713741799865]
We propose a new architecture, called perceiver-CDF, for modeling cumulative distribution functions (CDF) of time series data.
Our approach combines the perceiver architecture with a copula-based attention mechanism tailored for multimodal time series prediction.
Experiments on the unimodal and multimodal benchmarks consistently demonstrate a 20% improvement over state-of-the-art methods.
arXiv Detail & Related papers (2023-10-03T01:13:17Z)
- Kernel-based Joint Independence Tests for Multivariate Stationary and Non-stationary Time Series [0.6749750044497732]
We introduce kernel-based statistical tests of joint independence in multivariate time series.
We show how the method robustly uncovers significant higher-order dependencies in synthetic examples.
Our method can aid in uncovering high-order interactions in data.
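For intuition, a plain d-variable HSIC (dHSIC) statistic, a standard kernel measure of joint independence, is sketched below; it assumes i.i.d. observations and a Gaussian kernel, whereas the cited tests additionally account for temporal dependence (not reproduced here). Function names and the bandwidth are illustrative.

```python
import numpy as np

def gaussian_gram(x, bandwidth=1.0):
    """n x n Gaussian (RBF) Gram matrix for x of shape (n, p)."""
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

def dhsic(samples, bandwidth=1.0):
    """V-statistic estimator of d-variable HSIC (a joint-independence measure).
    `samples` is a list of d arrays, each of shape (n, p_k), where row i of
    every array belongs to the same observation/time index."""
    grams = [gaussian_gram(x, bandwidth) for x in samples]
    term1 = np.mean(np.prod(grams, axis=0))        # (1/n^2) sum_ij prod_k K^k_ij
    term2 = np.prod([K.mean() for K in grams])     # prod_k (1/n^2) sum_ij K^k_ij
    term3 = np.mean(np.prod([K.mean(axis=1) for K in grams], axis=0))
    return term1 + term2 - 2.0 * term3

# Illustrative use: three jointly observed series of length 200
rng = np.random.default_rng(0)
x, y, z = (rng.normal(size=(200, 1)) for _ in range(3))
print(dhsic([x, y, z]))
```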
arXiv Detail & Related papers (2023-05-15T10:38:24Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock market involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA)
Our proposed model exhibits reduced computational complexity and performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- PIETS: Parallelised Irregularity Encoders for Forecasting with Heterogeneous Time-Series [5.911865723926626]
Heterogeneity and irregularity of multi-source data sets present a significant challenge to time-series analysis.
In this work, we design a novel architecture, PIETS, to model heterogeneous time-series.
We show that PIETS is able to effectively model heterogeneous temporal data and outperforms other state-of-the-art approaches in the prediction task.
arXiv Detail & Related papers (2021-09-30T20:01:19Z)
- Scalable Spatiotemporally Varying Coefficient Modelling with Bayesian Kernelized Tensor Regression [17.158289775348063]
Bayesian Kernelized Tensor Regression (BKTR) can be considered a new and scalable approach to modelling spatiotemporally varying coefficient processes with a low-rank spatiotemporal structure.
We conduct extensive experiments on both synthetic and real-world data sets, and our results confirm the superior performance and efficiency of BKTR for model estimation and inference.
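As a rough sketch of the spatiotemporally varying coefficient model that BKTR targets (notation is an illustrative assumption, not the paper's):

```latex
% Response with covariates x and coefficients that vary over space s and time t
y(s, t) = \mathbf{x}(s, t)^\top \boldsymbol{\beta}(s, t) + \epsilon(s, t)
% Low-rank (CP-style) coefficient structure with kernelized (GP) factors
\beta_p(s, t) = \sum_{r=1}^{R} u_{p r}\, v_r(s)\, w_r(t),
\qquad v_r \sim \mathcal{GP}(0, k_s), \quad w_r \sim \mathcal{GP}(0, k_t)
```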
arXiv Detail & Related papers (2021-08-31T19:22:23Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the noise in its success is still unclear.
We show that multiplicative noise, as it commonly arises due to variance in local rates of convergence, leads to heavy-tailed behaviour in the optimized parameters.
A detailed analysis of the key factors involved, including step size and the data, shows that similar results hold for state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Nonparametric Estimation in the Dynamic Bradley-Terry Model [69.70604365861121]
We develop a novel estimator that relies on kernel smoothing to pre-process the pairwise comparisons over time.
We derive time-varying oracle bounds for both the estimation error and the excess risk in the model-agnostic setting.
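For reference, the dynamic Bradley-Terry model underlying this estimator can be written as follows (notation illustrative); the cited estimator kernel-smooths the pairwise comparison outcomes over time before fitting the time-varying scores:

```latex
% Probability that item i beats item j at time t, with positive time-varying scores w_i(t)
\Pr\big(i \text{ beats } j \text{ at time } t\big) = \frac{w_i(t)}{w_i(t) + w_j(t)}
```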
arXiv Detail & Related papers (2020-02-28T21:52:49Z)