Wasserstein multivariate auto-regressive models for modeling
distributional time series and its application in graph learning
- URL: http://arxiv.org/abs/2207.05442v2
- Date: Sat, 6 May 2023 19:39:32 GMT
- Title: Wasserstein multivariate auto-regressive models for modeling
distributional time series and its application in graph learning
- Authors: Yiye Jiang
- Abstract summary: We propose a new auto-regressive model for the statistical analysis of multivariate distributional time series.
Results on the existence, uniqueness and stationarity of the solution of such a model are provided.
- In addition to the analysis of simulated data, the proposed model is illustrated on two real data sets: age distributions in different countries and the bike-sharing network in Paris.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new auto-regressive model for the statistical analysis of
multivariate distributional time series. The data of interest consist of a
collection of multiple series of probability measures, supported over a bounded
interval of the real line and indexed by distinct time instants. The
probability measures are modelled as random objects in the Wasserstein space.
We establish the auto-regressive model in the tangent space at the Lebesgue
measure by first centering all the raw measures so that their Fréchet means
coincide with the Lebesgue measure. Using the theory of iterated random
function systems, we provide results on the existence, uniqueness and
stationarity of the solution of such a model. We also propose a consistent
estimator for the model coefficients. In addition to the analysis of simulated
data, the proposed model is illustrated on two real data sets: age
distributions in different countries and the bike-sharing network in Paris.
Finally, due to the positivity and boundedness constraints that we impose on
the model coefficients, the estimator learned under these constraints
naturally has a sparse structure. This sparsity further allows the proposed
model to be applied to learning a graph of temporal dependency from
multivariate distributional time series.
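To make the tangent-space construction concrete: on a bounded interval, the 2-Wasserstein geometry can be handled through quantile functions, and the log map at the Lebesgue measure on [0, 1] sends a measure to its quantile function minus the identity. The sketch below simulates toy data and fits a first-order model with scalar nonnegative coefficients via projected least squares; this is a simplified stand-in for the paper's constrained estimator, and every name and number in it is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N component series of probability measures on [0, 1],
# each represented by its quantile function evaluated on a fixed grid.
N, T, G = 3, 200, 50            # series, time steps, quantile-grid size
u = np.linspace(0.01, 0.99, G)  # quantile levels

# Simulate quantile functions Q[i, t] (monotone in u) -- purely illustrative.
Q = np.sort(rng.beta(2 + rng.random((N, T, 1)), 2, size=(N, T, G)), axis=-1)

# Log map at the Lebesgue measure on [0, 1]: log(mu) = F_mu^{-1} - id.
# Center each series so its empirical Frechet mean is the Lebesgue measure.
V = Q - u                                  # tangent vectors
V = V - V.mean(axis=1, keepdims=True)      # empirical centering step

# Fit a first-order model  V[i, t] ~ sum_j A[i, j] * V[j, t-1]  with a crude
# nonnegative projection -- a stand-in for the paper's constrained estimator.
X = V[:, :-1, :].reshape(N, -1)        # predictors, shape (N, (T-1)*G)
Y = V[:, 1:, :].reshape(N, -1)         # responses
A = Y @ X.T @ np.linalg.inv(X @ X.T)   # unconstrained least squares
A = np.clip(A, 0.0, None)              # project onto the constraint A >= 0

# The sparsity pattern of A induces a graph of temporal dependency.
graph_edges = [(i, j) for i in range(N) for j in range(N) if A[i, j] > 1e-3]
print(A.shape, len(graph_edges))
```

The support (nonzero pattern) of `A` then plays the role of the adjacency matrix of the learned temporal-dependency graph.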
Related papers
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Deep Ensembles Meets Quantile Regression: Uncertainty-aware Imputation for Time Series [49.992908221544624]
Time series data often exhibit numerous missing values, motivating the time series imputation task.
Previous deep learning methods have been shown to be effective for time series imputation.
We propose a non-generative time series imputation method that produces accurate imputations with inherent uncertainty.
arXiv Detail & Related papers (2023-12-03T05:52:30Z)
- Generative modeling for time series via Schrödinger bridge [0.0]
We propose a novel generative model for time series based on the Schrödinger bridge (SB) approach.
It consists of the entropic interpolation, via optimal transport, between a reference probability measure on path space and a target measure consistent with the joint data distribution of the time series.
arXiv Detail & Related papers (2023-04-11T09:45:06Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Parameterization of state duration in Hidden semi-Markov Models: an application in electrocardiography [0.0]
We introduce a parametric model for time series pattern recognition and provide a maximum-likelihood estimation of its parameters.
An application to classification reveals the main strengths and weaknesses of each alternative.
arXiv Detail & Related papers (2022-11-17T11:51:35Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting [4.1573460459258245]
We use diffusion probabilistic models, a class of latent variable models closely connected to score matching and energy-based methods.
Our model learns gradients by optimizing a variational bound on the data likelihood and at inference time converts white noise into a sample of the distribution of interest.
arXiv Detail & Related papers (2021-01-28T15:46:10Z)
- Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that our method improves over the state of the art on standard metrics for many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
- Predicting Multidimensional Data via Tensor Learning [0.0]
We develop a model that retains the intrinsic multidimensional structure of the dataset.
To estimate the model parameters, an Alternating Least Squares algorithm is developed.
The proposed model outperforms benchmark models from the forecasting literature.
arXiv Detail & Related papers (2020-02-11T11:57:07Z)
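The alternating least squares idea in the last entry can be sketched in the simplest multidimensional case, a bilinear (matrix) autoregression Y_t ~ A Y_{t-1} B^T: fixing one factor makes the problem linear in the other, so two least-squares solves are alternated. This is a generic illustration under made-up dimensions and simulated data, not the paper's exact tensor model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a stable bilinear autoregression  Y_t = A @ Y_{t-1} @ B.T + noise.
m, n, T = 4, 3, 300
A_true = rng.standard_normal((m, m)); A_true *= 0.5 / np.linalg.norm(A_true, 2)
B_true = rng.standard_normal((n, n)); B_true *= 0.5 / np.linalg.norm(B_true, 2)
Y = [rng.standard_normal((m, n))]
for _ in range(T - 1):
    Y.append(A_true @ Y[-1] @ B_true.T + 0.01 * rng.standard_normal((m, n)))

# Alternating least squares: each subproblem is ordinary least squares.
A, B = np.eye(m), np.eye(n)
for _ in range(30):
    # With B fixed, Y_t = A (Y_{t-1} B.T) is linear in A: stack and solve.
    X = np.hstack([Y[t - 1] @ B.T for t in range(1, T)])
    Z = np.hstack([Y[t] for t in range(1, T)])
    A = Z @ X.T @ np.linalg.pinv(X @ X.T)
    # With A fixed, Y_t.T = B (A Y_{t-1}).T is linear in B.
    Xb = np.hstack([(A @ Y[t - 1]).T for t in range(1, T)])
    Zb = np.hstack([Y[t].T for t in range(1, T)])
    B = Zb @ Xb.T @ np.linalg.pinv(Xb @ Xb.T)

# Average one-step-ahead residual of the fitted model.
resid = np.mean([np.linalg.norm(Y[t] - A @ Y[t - 1] @ B.T) for t in range(1, T)])
```

Note that (A, B) are only identified up to a scalar rescaling (A, B) -> (cA, B/c), so the quality of the fit is judged by the residual rather than by recovering A_true and B_true entrywise.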
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and accepts no responsibility for any consequences of its use.