Learning Time-Varying Graphs from Online Data
- URL: http://arxiv.org/abs/2110.11017v1
- Date: Thu, 21 Oct 2021 09:46:44 GMT
- Title: Learning Time-Varying Graphs from Online Data
- Authors: Alberto Natali, Elvin Isufi, Mario Coutino, Geert Leus
- Abstract summary: This work proposes an algorithmic framework to learn time-varying graphs from online data.
The generality of the framework renders it model-independent, i.e., it can be theoretically analyzed in its abstract formulation.
We specialize the framework to three well-known graph learning models, namely the Gaussian graphical model (GGM), the structural equation model (SEM), and the smoothness-based model (SBM).
- Score: 39.21234914444073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work proposes an algorithmic framework to learn time-varying graphs from
online data. The generality offered by the framework renders it
model-independent, i.e., it can be theoretically analyzed in its abstract
formulation and then instantiated under a variety of model-dependent graph
learning problems. This is possible by phrasing (time-varying) graph learning
as a composite optimization problem, where different functions regulate
different desiderata, e.g., data fidelity, sparsity, or smoothness. Instrumental
for the findings is recognizing that most (if not all) data-driven graph
learning algorithms depend on the data only through the empirical covariance
matrix, which is a sufficient statistic for the estimation problem. Its
user-defined recursive update enables the framework to work in non-stationary
environments, while iterative algorithms building on novel time-varying
optimization tools explicitly take the temporal dynamics into account, speeding
up convergence and implicitly including a temporal regularization of the
solution. We specialize the framework to three well-known graph learning
models, namely the Gaussian graphical model (GGM), the structural equation
model (SEM), and the smoothness-based model (SBM), where we also introduce
ad-hoc vectorization schemes for structured matrices (symmetric, hollow, etc.)
that are crucial for correct gradient computations and, in addition, allow the
algorithms to operate in low-dimensional vector spaces, easing storage
requirements. After discussing the theoretical guarantees of the proposed
framework, we corroborate it with extensive numerical tests on synthetic and
real data.
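To make the two mechanisms above concrete, here is a minimal Python sketch of the general idea rather than the authors' exact algorithm: an exponentially weighted (user-defined) recursive covariance update feeds a warm-started proximal-gradient step for the GGM (graphical-lasso-type) instance, and a small half-vectorization pair illustrates the kind of structured-matrix (symmetric, hollow) vectorization the abstract refers to. The function names and the hyperparameters `gamma` (forgetting factor), `mu` (step size), and `lam` (sparsity weight) are assumptions of this illustration.

```python
# Illustrative sketch only (not the paper's exact algorithm): the data enters
# solely through a recursively updated empirical covariance, and a warm-started
# proximal-gradient step tracks a time-varying GGM precision matrix.
import numpy as np

def hvech(W):
    """Half-vectorize a symmetric, hollow N x N matrix into R^{N(N-1)/2}."""
    iu = np.triu_indices(W.shape[0], k=1)
    return W[iu]

def unhvech(w, N):
    """Inverse map: rebuild the symmetric, hollow matrix from its half-vector."""
    W = np.zeros((N, N))
    iu = np.triu_indices(N, k=1)
    W[iu] = w
    return W + W.T

def soft_threshold_offdiag(Theta, tau):
    """Prox of tau * ||.||_1 applied to the off-diagonal entries only."""
    S = np.sign(Theta) * np.maximum(np.abs(Theta) - tau, 0.0)
    np.fill_diagonal(S, np.diag(Theta))
    return S

def project_pd(Theta, eps=1e-6):
    """Crude positive-definite safeguard: floor the eigenvalues at eps."""
    vals, vecs = np.linalg.eigh((Theta + Theta.T) / 2)
    return (vecs * np.maximum(vals, eps)) @ vecs.T

def online_tv_ggm(x_stream, N, gamma=0.95, mu=0.1, lam=0.05, inner_iters=1):
    """Track a time-varying precision matrix from a stream of graph signals."""
    Sigma = np.eye(N)          # recursive (EWMA) covariance estimate
    Theta = np.eye(N)          # warm-started precision estimate
    for x in x_stream:
        x = np.asarray(x).reshape(N, 1)
        Sigma = gamma * Sigma + (1 - gamma) * (x @ x.T)    # covariance update
        for _ in range(inner_iters):                        # prox-gradient step(s)
            grad = Sigma - np.linalg.inv(Theta)             # grad of tr(S T) - logdet T
            Theta = soft_threshold_offdiag(Theta - mu * grad, mu * lam)
            Theta = project_pd(Theta)
        yield Theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 5
    stream = (rng.standard_normal(N) for _ in range(200))
    for Theta_t in online_tv_ggm(stream, N):
        pass
    # Round trip through the N(N-1)/2-dimensional vector space used for
    # structured (symmetric, hollow) matrices.
    w = hvech(Theta_t)
    print(w.shape, np.allclose(unhvech(w, N), Theta_t - np.diag(np.diag(Theta_t))))
```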
Related papers
- On Discriminative Probabilistic Modeling for Self-Supervised Representation Learning [85.75164588939185]
We study the discriminative probabilistic modeling problem on a continuous domain for (multimodal) self-supervised representation learning.
We conduct generalization error analysis to reveal the limitation of current InfoNCE-based contrastive loss for self-supervised representation learning.
arXiv Detail & Related papers (2024-10-11T18:02:46Z)
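Since this entry centers on the InfoNCE-based contrastive loss, the snippet below is a generic sketch of that standard loss (not the cited paper's analysis); the temperature `tau` and the convention that matching rows of the two embedding batches are positives are assumptions of the illustration.

```python
# Generic sketch of the standard InfoNCE contrastive loss referenced above.
import numpy as np

def info_nce(z_a, z_b, tau=0.1):
    """z_a, z_b: (batch, dim) embeddings; positives are matching rows."""
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = (z_a @ z_b.T) / tau                   # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # -log p(positive | row)

rng = np.random.default_rng(0)
print(info_nce(rng.standard_normal((8, 16)), rng.standard_normal((8, 16))))
```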
- Sparse Graphical Linear Dynamical Systems [1.6635799895254402]
Time-series datasets are central in machine learning with applications in numerous fields of science and engineering.
This work proposes a novel approach to bridge the gap by introducing a joint graphical modeling framework.
We present DGLASSO, a new inference method within this framework that implements an efficient block alternating majorization-minimization algorithm.
arXiv Detail & Related papers (2023-07-06T14:10:02Z)
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces a mathematical definition of this novel problem setting: learning graph structures that generalize to unseen target graphs.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Interpretable and Scalable Graphical Models for Complex Spatio-temporal Processes [3.469001874498102]
This thesis focuses on data with complex spatio-temporal structure and on probabilistic graphical models that learn this structure in an interpretable and scalable manner.
Practical applications of the methodology are considered using real datasets.
These include brain-connectivity analysis, space-weather forecasting using solar imaging data, longitudinal analysis of public opinion using Twitter data, and mining of mental-health-related issues using TalkLife data.
arXiv Detail & Related papers (2023-01-15T05:39:30Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Learning Sparse and Continuous Graph Structures for Multivariate Time Series Forecasting [5.359968374560132]
Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that combines graph learning and forecasting.
In this paper, we propose a new method, named Smooth Sparse Unit (SSU), to learn a sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performance with few trainable parameters.
arXiv Detail & Related papers (2022-01-24T13:35:37Z)
- Deep Efficient Continuous Manifold Learning for Time Series Modeling [11.876985348588477]
Symmetric positive definite matrices are studied in computer vision, signal processing, and medical image analysis.
In this paper, we propose a framework that exploits a diffeomorphism mapping between a Riemannian manifold and a Cholesky space.
For dynamic modeling of time-series data, we devise a continuous manifold learning method by systematically integrating a manifold ordinary differential equation and a gated recurrent neural network.
arXiv Detail & Related papers (2021-12-03T01:38:38Z)
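As the entry above hinges on a diffeomorphism between a Riemannian (SPD) manifold and a Cholesky space, the snippet below sketches one standard such map of log-Cholesky type as a generic illustration; it is not necessarily the exact construction used in the cited paper.

```python
# Generic sketch of a diffeomorphism between SPD matrices and a Cholesky-based
# space (log-Cholesky style); not necessarily the cited paper's exact map.
import numpy as np

def spd_to_chol_space(P):
    """SPD matrix -> lower-triangular representative with log-diagonal."""
    L = np.linalg.cholesky(P)                    # unique factor, positive diagonal
    return np.tril(L, k=-1) + np.diag(np.log(np.diag(L)))

def chol_space_to_spd(X):
    """Inverse map: rebuild the SPD matrix from the representative."""
    L = np.tril(X, k=-1) + np.diag(np.exp(np.diag(X)))
    return L @ L.T

A = np.array([[2.0, 0.5], [0.5, 1.0]])
assert np.allclose(chol_space_to_spd(spd_to_chol_space(A)), A)
```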
- Efficient Variational Bayesian Structure Learning of Dynamic Graphical Models [19.591265962713837]
Estimating time-varying graphical models is of paramount importance in various social, financial, biological, and engineering systems.
Existing methods require extensive tuning of parameters that control the graph sparsity and temporal smoothness.
We propose a low-complexity tuning-free Bayesian approach, named BADGE.
arXiv Detail & Related papers (2020-09-16T14:19:23Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)