Temporal Graph Signal Decomposition
- URL: http://arxiv.org/abs/2106.13517v1
- Date: Fri, 25 Jun 2021 09:19:15 GMT
- Title: Temporal Graph Signal Decomposition
- Authors: Maxwell McNeil and Lin Zhang and Petko Bogdanov
- Abstract summary: We propose a general, dictionary-based framework for temporal graph signal decomposition (TGSD).
The key idea is to learn a low-rank, joint encoding of the data via a combination of graph and time dictionaries.
Our framework achieves a 28% reduction in RMSE compared to baselines for temporal interpolation when as many as 75% of the observations are missing.
- Score: 12.996990722929294
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal graph signals are multivariate time series with individual
components associated with nodes of a fixed graph structure. Data of this kind
arises in many domains including activity of social network users, sensor
network readings over time, and time course gene expression within the
interaction network of a model organism. Traditional matrix decomposition
methods applied to such data fall short of exploiting structural regularities
encoded in the underlying graph and also in the temporal patterns of the
signal. How can we take into account such structure to obtain a succinct and
interpretable representation of temporal graph signals?
We propose a general, dictionary-based framework for temporal graph signal
decomposition (TGSD). The key idea is to learn a low-rank, joint encoding of
the data via a combination of graph and time dictionaries. We propose a highly
scalable decomposition algorithm for both complete and incomplete data, and
demonstrate its advantage for matrix decomposition, imputation of missing
values, temporal interpolation, clustering, period estimation, and rank
estimation in synthetic and real-world data ranging from traffic patterns to
social media activity. Our framework achieves 28% reduction in RMSE compared to
baselines for temporal interpolation when as many as 75% of the observations
are missing. It scales best among the baselines, taking under 20 seconds on 3.5
million data points, and produces the most parsimonious models. To the best of
our knowledge, TGSD is the first framework to jointly model graph signals by
temporal and graph dictionaries.
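The joint encoding described in the abstract can be sketched in a few lines of numpy. This is a hedged illustration, not the authors' algorithm: it assumes a graph Fourier dictionary Phi (Laplacian eigenvectors of a toy path graph), a DCT-II time dictionary Psi, and plain alternating least squares for the low-rank encoding X ≈ Phi Y W Psi; all sizes and data are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
n, t, rank = 6, 40, 2   # nodes, time steps, encoding rank (toy sizes)

# Graph dictionary Phi: eigenbasis of a path-graph Laplacian
# (a graph Fourier basis -- one plausible dictionary choice).
A = np.diag(np.ones(n - 1), 1)
A = A + A.T
Lap = np.diag(A.sum(axis=1)) - A
_, Phi = np.linalg.eigh(Lap)                      # n x n, orthogonal

# Time dictionary Psi: unnormalized DCT-II basis (another plausible choice).
k = np.arange(t)
Psi = np.cos(np.pi * np.outer(k, k + 0.5) / t)    # t x t, full rank

# Toy temporal graph signal (random stand-in for real data).
X = rng.standard_normal((n, t))

# Alternating least squares for the encoding X ~ Phi @ Y @ W @ Psi.
Y = rng.standard_normal((n, rank))
W = rng.standard_normal((rank, t))
for _ in range(30):
    B = W @ Psi                                   # fix W, solve for Y
    Y = Phi.T @ X @ np.linalg.pinv(B)
    C = Phi @ Y                                   # fix Y, solve for W
    W = np.linalg.pinv(C) @ X @ np.linalg.inv(Psi)

X_hat = Phi @ Y @ W @ Psi
err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"rank-{rank} joint-encoding relative error: {err:.3f}")
```

The rank-2 factor pair (Y, W) is the parsimonious representation: it has far fewer entries than X, yet expresses the signal jointly in graph-frequency and time-frequency coordinates.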
Related papers
- Beyond Message Passing: Neural Graph Pattern Machine [50.78679002846741]
We introduce the Neural Graph Pattern Machine (GPM), a novel framework that bypasses message passing by learning directly from graph substructures.
GPM efficiently extracts, encodes, and prioritizes task-relevant graph patterns, offering greater expressivity and improved ability to capture long-range dependencies.
arXiv Detail & Related papers (2025-01-30T20:37:47Z)
- Graph-Dictionary Signal Model for Sparse Representations of Multivariate Data [49.77103348208835]
We define a novel Graph-Dictionary signal model, where a finite set of graphs characterizes relationships in data distribution through a weighted sum of their Laplacians.
We propose a framework to infer the graph dictionary representation from observed data, along with a bilinear generalization of the primal-dual splitting algorithm to solve the learning problem.
We exploit graph-dictionary representations in a motor imagery decoding task on brain activity data, where we classify imagined motion better than standard methods.
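The weighted-sum-of-Laplacians idea above can be sketched as follows; the two toy graphs, the weights, and all names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def laplacian(A):
    """Combinatorial Laplacian L = D - A of an adjacency matrix."""
    return np.diag(A.sum(axis=1)) - A

# A dictionary of two toy graphs on 4 nodes (illustrative data).
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
star = np.array([[0, 1, 1, 1],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0]], dtype=float)

# Relationships in the data modeled as a weighted sum of the dictionary Laplacians.
w = np.array([0.7, 0.3])
L = w[0] * laplacian(path) + w[1] * laplacian(star)

# A nonnegative combination is itself the Laplacian of a weighted graph:
print(np.allclose(L, L.T))                  # symmetric
print(np.allclose(L @ np.ones(4), 0.0))     # constant vector in the null space
```

Because the combination stays a valid Laplacian, learning reduces to inferring the dictionary atoms and their weights, which is what the bilinear primal-dual splitting in the paper addresses.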
arXiv Detail & Related papers (2024-11-08T17:40:43Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Sparsity exploitation via discovering graphical models in multi-variate time-series forecasting [1.2762298148425795]
We propose a decoupled training method, which includes a graph generating module and a GNNs forecasting module.
First, we use Graphical Lasso (or GraphLASSO) to directly exploit the sparsity pattern from data to build graph structures.
Second, we fit these graph structures and the input data into a Graph Convolutional Recurrent Network (GCRN) to train a forecasting model.
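The first step of this two-stage pipeline can be illustrated with a simplified, dependency-free stand-in: where the paper uses Graphical Lasso (an L1-penalized precision-matrix estimate), this sketch merely inverts the sample covariance and thresholds it. The toy series, the 0.5 threshold, and the variable names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy multivariate time series: series 0 and 1 share a latent driver z,
# series 2 is independent (illustrative data, not from the paper).
z = rng.standard_normal(500)
X = np.column_stack([z + 0.5 * rng.standard_normal(500),
                     z + 0.5 * rng.standard_normal(500),
                     rng.standard_normal(500)])

# Step 1 (simplified): estimate the precision matrix and threshold it into a
# conditional-dependence graph. Graphical Lasso would instead solve an
# L1-penalized maximum-likelihood problem, yielding sparsity directly.
precision = np.linalg.inv(np.cov(X, rowvar=False))
adj = (np.abs(precision) > 0.5).astype(int)
np.fill_diagonal(adj, 0)

# Step 2 would feed `adj` and the series into a GCRN forecasting model.
print(adj)   # links series 0 and 1; leaves series 2 isolated
```

The point of decoupling is that the graph is fixed before forecasting begins, so the GCRN trains on a stable, data-driven sparsity pattern rather than learning the structure jointly.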
arXiv Detail & Related papers (2023-06-29T16:48:00Z)
- Networked Time Series Imputation via Position-aware Graph Enhanced Variational Autoencoders [31.953958053709805]
We design a new model named PoGeVon which leverages variational autoencoder (VAE) to predict missing values over both node time series features and graph structures.
Experiment results demonstrate the effectiveness of our model over baselines.
arXiv Detail & Related papers (2023-05-29T21:11:34Z)
- A fast topological approach for predicting anomalies in time-varying graphs [0.0]
A persistence diagram (PD) from topological data analysis (TDA) has become a popular descriptor of shape of data with a well-defined distance between points.
This paper introduces a computationally efficient framework to extract shape information from graph data.
In a real data application, our approach provides up to 22% gain in anomalous price prediction for the cryptocurrency transaction networks.
arXiv Detail & Related papers (2023-05-11T01:54:45Z)
- TranSG: Transformer-Based Skeleton Graph Prototype Contrastive Learning with Structure-Trajectory Prompted Reconstruction for Person Re-Identification [63.903237777588316]
Person re-identification (re-ID) via 3D skeleton data is an emerging topic with prominent advantages.
Existing methods usually design skeleton descriptors with raw body joints or perform skeleton sequence representation learning.
We propose a generic Transformer-based Skeleton Graph prototype contrastive learning (TranSG) approach with structure-trajectory prompted reconstruction.
arXiv Detail & Related papers (2023-03-13T02:27:45Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Learning to Reconstruct Missing Data from Spatiotemporal Graphs with Sparse Observations [11.486068333583216]
This paper tackles the problem of learning effective models to reconstruct missing data points.
We propose a class of attention-based architectures, that given a set of highly sparse observations, learn a representation for points in time and space.
Compared to the state of the art, our model handles sparse data without propagating prediction errors or requiring a bidirectional model to encode forward and backward time dependencies.
arXiv Detail & Related papers (2022-05-26T16:40:48Z)
- Graph-in-Graph (GiG): Learning interpretable latent graphs in non-Euclidean domain for biological and healthcare applications [52.65389473899139]
Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples has a positive regularizing effect on the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
arXiv Detail & Related papers (2022-04-01T10:01:37Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.