Graph Gamma Process Generalized Linear Dynamical Systems
- URL: http://arxiv.org/abs/2007.12852v1
- Date: Sat, 25 Jul 2020 04:16:34 GMT
- Title: Graph Gamma Process Generalized Linear Dynamical Systems
- Authors: Rahi Kalantari and Mingyuan Zhou
- Abstract summary: We introduce graph gamma process (GGP) linear dynamical systems to model real-valued multivariate time series.
For temporal pattern discovery, the latent representation under the model is used to decompose the time series into a parsimonious set of multivariate sub-sequences.
We use the generated random graph, whose number of nonzero-degree nodes is finite, to define both the sparsity pattern and dimension of the latent state transition matrix.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce graph gamma process (GGP) linear dynamical systems to model
real-valued multivariate time series. For temporal pattern discovery, the
latent representation under the model is used to decompose the time series into
a parsimonious set of multivariate sub-sequences. In each sub-sequence,
different data dimensions often share similar temporal patterns but may exhibit
distinct magnitudes, allowing the superposition of all sub-sequences to
exhibit diverse behaviors across data dimensions. We further
generalize the proposed model by replacing the Gaussian observation layer with
the negative binomial distribution to model multivariate count time series.
Generated from the proposed GGP is an infinite dimensional directed sparse
random graph, which is constructed by taking the logical OR operation of
countably infinite binary adjacency matrices that share the same set of
countably infinite nodes. Each of these adjacency matrices is associated with a
weight to indicate its activation strength, and places a finite number of edges
between a finite subset of nodes belonging to the same node community. We use
the generated random graph, whose number of nonzero-degree nodes is finite, to
define both the sparsity pattern and dimension of the latent state transition
matrix of a (generalized) linear dynamical system. The activation strength of
each node community relative to the overall activation strength is used to
extract a multivariate sub-sequence, revealing the data pattern captured by the
corresponding community. On both synthetic and real-world time series, the
proposed nonparametric Bayesian dynamic models, which are initialized at
random, consistently exhibit good predictive performance in comparison to a
variety of baseline models, revealing interpretable latent state transition
patterns and decomposing the time series into distinctly behaved sub-sequences.
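The graph construction described above can be sketched in a truncated finite form. The following is a minimal illustration, not the paper's inference procedure: it assumes a small fixed number of communities K and nodes N (both countably infinite in the actual GGP), draws a weighted binary adjacency matrix per community, combines them with a logical OR, and uses the result as the sparsity pattern of an LDS transition matrix with Gaussian observations. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truncated sketch of the GGP graph construction (hypothetical sizes):
# K community adjacency matrices over N nodes, combined with a logical OR.
N, K = 8, 3
weights = rng.gamma(shape=1.0, scale=1.0, size=K)  # community activation strengths

A = np.zeros((N, N), dtype=bool)
for k in range(K):
    members = rng.choice(N, size=3, replace=False)  # finite subset of nodes
    A_k = np.zeros((N, N), dtype=bool)
    for i in members:                               # edges only within the community
        for j in members:
            A_k[i, j] = rng.random() < 0.6
    A |= A_k                                        # logical OR of adjacency matrices

# The OR-ed graph fixes the sparsity pattern of the state transition matrix.
Phi = A * rng.normal(scale=0.3, size=(N, N))

# Linear dynamical system with a Gaussian observation layer (the negative
# binomial generalization for count data is omitted here).
T, D = 50, 4
C = rng.normal(size=(D, N))                         # loading matrix
z = np.zeros(N)
y = np.zeros((T, D))
for t in range(T):
    z = Phi @ z + rng.normal(scale=0.1, size=N)
    y[t] = C @ z + rng.normal(scale=0.05, size=D)

print(y.shape)
```

In the paper, the relative activation strength of each community (the `weights` above) is what extracts the per-community sub-sequence from the latent states; this sketch only shows the graph-to-transition-matrix step.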
Related papers
- Nonlinear time-series embedding by monotone variational inequality [6.992239210938067]
We introduce a new method to learn low-dimensional representations of nonlinear time series without supervision.
The learned representation can be used for downstream machine-learning tasks such as clustering and classification.
arXiv Detail & Related papers (2024-06-11T02:19:31Z) - ForecastGrapher: Redefining Multivariate Time Series Forecasting with Graph Neural Networks [9.006068771300377]
We present ForecastGrapher, a framework for capturing the intricate temporal dynamics and inter-series correlations.
Our approach is underpinned by three pivotal steps: generating custom node embeddings to reflect the temporal variations within each series; constructing an adaptive adjacency matrix to encode the inter-series correlations; and augmenting the GNNs' expressive power by diversifying the node feature distribution.
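The three steps summarized above can be sketched as follows. This is a generic illustration, not ForecastGrapher's actual architecture: per-series node embeddings, an adaptive adjacency matrix built from embedding similarity with a row softmax, and one graph message-passing layer. All dimensions and the similarity choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sketch of the three steps (not ForecastGrapher's actual model).
n_series, emb_dim, feat_dim = 6, 4, 8
E = rng.normal(size=(n_series, emb_dim))   # step 1: per-series node embeddings

scores = E @ E.T                           # step 2: adaptive adjacency from
A = np.exp(scores)                         #         embedding similarity,
A /= A.sum(axis=1, keepdims=True)          #         row-normalized via softmax

X = rng.normal(size=(n_series, feat_dim))  # node features from each series
W = rng.normal(size=(feat_dim, feat_dim))
H = np.maximum(A @ X @ W, 0.0)             # step 3: one GNN layer with ReLU

print(H.shape)
```

In a trained model, `E` and `W` would be learned end-to-end so the adjacency adapts to the inter-series correlations in the data.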
arXiv Detail & Related papers (2024-05-28T10:40:20Z) - A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z) - Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast multivariate time series with dynamic graph neural ordinary differential equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
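The idea of unifying spatial and temporal message passing in one ODE can be sketched minimally. This is an assumed toy formulation, not MTGODE's actual model: node states evolve under a graph-coupled vector field dz/dt = tanh(A z W), integrated with fixed-step explicit Euler, so the adjacency matrix (spatial) and the ODE solve (temporal) share a single update rule.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal sketch (assumed, not MTGODE's formulation): a graph-coupled ODE.
n, d = 5, 3
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)   # row-normalized adjacency (stand-in for a
                                    # learned/complemented graph topology)
W = 0.1 * rng.normal(size=(d, d))

def odefunc(z):
    # Graph message passing as the ODE vector field: spatial mixing via A,
    # feature mixing via W, bounded by tanh.
    return np.tanh(A @ z @ W)

z = rng.normal(size=(n, d))         # initial node states
dt, steps = 0.1, 20
for _ in range(steps):
    z = z + dt * odefunc(z)         # explicit Euler integration in time

print(z.shape)
```

A real implementation would use an adaptive ODE solver and learn both `A` and `W`; Euler with a fixed step keeps the sketch self-contained.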
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Analyzing Unaligned Multimodal Sequence via Graph Convolution and Graph Pooling Fusion [28.077474663199062]
We propose a novel model, termed Multimodal Graph, to investigate the effectiveness of graph neural networks (GNN) on modeling multimodal sequential data.
Our graph-based model reaches state-of-the-art performance on two benchmark datasets.
arXiv Detail & Related papers (2020-11-27T06:12:14Z) - The multilayer random dot product graph [6.722870980553432]
We present a comprehensive extension of the latent position network model known as the random dot product graph.
We propose a method for jointly embedding submatrices into a suitable latent space.
Empirical improvements in link prediction over single graph embeddings are exhibited in a cyber-security example.
arXiv Detail & Related papers (2020-07-20T20:31:39Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.