Learned Factor Graphs for Inference from Stationary Time Sequences
- URL: http://arxiv.org/abs/2006.03258v4
- Date: Fri, 24 Dec 2021 10:26:05 GMT
- Title: Learned Factor Graphs for Inference from Stationary Time Sequences
- Authors: Nir Shlezinger, Nariman Farsad, Yonina C. Eldar, and Andrea J.
Goldsmith
- Abstract summary: We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences. Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence. We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
- Score: 107.63351413549992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The design of methods for inference from time sequences has traditionally
relied on statistical models that describe the relation between a latent
desired sequence and the observed one. A broad family of model-based algorithms
has been derived to carry out inference at controllable complexity using
recursive computations over the factor graph representing the underlying
distribution. An alternative model-agnostic approach utilizes machine learning
(ML) methods. Here we propose a framework that combines model-based algorithms
and data-driven ML tools for stationary time sequences. In the proposed
approach, neural networks are developed to separately learn specific components
of a factor graph describing the distribution of the time sequence, rather than
the complete inference task. By exploiting stationary properties of this
distribution, the resulting approach can be applied to sequences of varying
temporal duration. Learned factor graphs can be realized using compact neural
networks that are trainable using small training sets, or alternatively, be
used to improve upon existing deep inference systems. We present an inference
algorithm based on learned stationary factor graphs, which learns to implement
the sum-product scheme from labeled data, and can be applied to sequences of
different lengths. Our experimental results demonstrate the ability of the
proposed learned factor graphs to learn to carry out accurate inference from
small training sets for sleep stage detection using the Sleep-EDF dataset, as
well as for symbol detection in digital communications with unknown channels.
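The sum-product recursion on a stationary factor graph, as described in the abstract, can be sketched for the special case of a chain (Markov) structure with discrete states. This is an illustrative assumption: the function name, the HMM-style factorization into emission and transition factors, and the array shapes are not taken from the paper. The key point the sketch shows is that stationarity lets the same two factor arrays serve every time step, so the routine applies to sequences of any length.

```python
import numpy as np

def sum_product_chain(emission, transition, prior):
    """Forward-backward (sum-product) marginals on a chain factor graph.

    emission:   (T, S) array, emission[t, s] proportional to p(y_t | x_t = s)
    transition: (S, S) array, transition[i, j] = p(x_{t+1} = j | x_t = i)
    prior:      (S,) initial state distribution.

    Stationarity means the transition factor (and, in a learned factor
    graph, the emission factor's parameters) is shared across time, so
    the same arrays handle any sequence length T.
    """
    T, S = emission.shape
    fwd = np.zeros((T, S))
    bwd = np.ones((T, S))

    # Forward messages (normalized at each step for numerical stability)
    fwd[0] = prior * emission[0]
    fwd[0] /= fwd[0].sum()
    for t in range(1, T):
        fwd[t] = (fwd[t - 1] @ transition) * emission[t]
        fwd[t] /= fwd[t].sum()

    # Backward messages
    for t in range(T - 2, -1, -1):
        bwd[t] = transition @ (emission[t + 1] * bwd[t + 1])
        bwd[t] /= bwd[t].sum()

    # Combine into per-step posterior marginals
    post = fwd * bwd
    post /= post.sum(axis=1, keepdims=True)
    return post  # (T, S)
```

In a learned factor graph, the `emission` array would come from a neural network evaluated on the observations rather than from an analytical model; the message-passing recursion itself is unchanged.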
Related papers
- Graph Neural Flows for Unveiling Systemic Interactions Among Irregularly Sampled Time Series [5.460420960898444]
We develop a graph-based model that unveils the systemic interactions of time series observed at irregular time points.
We validate our approach on several tasks, including time series classification and forecasting, to demonstrate its efficacy.
arXiv Detail & Related papers (2024-10-17T21:10:39Z)
- Learning Signal Temporal Logic through Neural Network for Interpretable Classification [13.829082181692872]
We propose an explainable neural-symbolic framework for the classification of time-series behaviors.
We demonstrate the computational efficiency, compactness, and interpretability of the proposed method through driving scenarios and naval surveillance case studies.
arXiv Detail & Related papers (2022-10-04T21:11:54Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Discrete Graph Structure Learning for Forecasting Multiple Time Series [14.459541930646205]
Time series forecasting is an extensively studied subject in statistics, economics, and computer science.
In this work, we propose learning the structure simultaneously with a graph neural network (GNN) if the graph is unknown.
Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning.
arXiv Detail & Related papers (2021-01-18T03:36:33Z)
- Remaining Useful Life Estimation Under Uncertainty with Causal GraphNets [0.0]
A novel approach for the construction and training of time series models is presented.
The proposed method is appropriate for constructing predictive models for non-stationary time series.
arXiv Detail & Related papers (2020-11-23T21:28:03Z)
- Efficient Variational Bayesian Structure Learning of Dynamic Graphical Models [19.591265962713837]
Estimating time-varying graphical models is of paramount importance in various social, financial, biological, and engineering systems.
Existing methods require extensive tuning of parameters that control the graph sparsity and temporal smoothness.
We propose a low-complexity tuning-free Bayesian approach, named BADGE.
arXiv Detail & Related papers (2020-09-16T14:19:23Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- Data-Driven Factor Graphs for Deep Symbol Detection [107.63351413549992]
We propose to implement factor graph methods in a data-driven manner.
In particular, we propose to use machine learning (ML) tools to learn the factor graph.
We demonstrate that the proposed system, referred to as BCJRNet, learns to implement the BCJR algorithm from a small training set.
arXiv Detail & Related papers (2020-01-31T09:23:52Z)
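The idea in the BCJRNet entry above, learning one factor of the graph from labeled data while keeping the model-based recursion intact, can be sketched as follows. As a stand-in for the neural network, the sketch estimates a discrete emission factor by smoothed counting; the function name, signature, and Laplace smoothing are illustrative assumptions, not the paper's method.

```python
import numpy as np

def learn_factor(states, obs, n_states, n_obs, alpha=1.0):
    """Estimate a stationary emission factor p(y | x) from labeled
    pairs (x_t, y_t) using counts with additive (Laplace) smoothing.

    Only this factor is data-driven; once learned, it plugs into the
    same sum-product / BCJR message passing used by the model-based
    algorithm, replacing the analytical channel model.
    """
    counts = np.full((n_states, n_obs), alpha)
    for x, y in zip(states, obs):
        counts[x, y] += 1.0
    # Normalize each row into a conditional distribution p(y | x)
    return counts / counts.sum(axis=1, keepdims=True)
```

Because the factor is stationary, a small labeled set suffices to estimate it, and the learned table (or, in BCJRNet, a compact neural network) then serves sequences of arbitrary length.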
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.