Continuous-Time Bayesian Networks with Clocks
- URL: http://arxiv.org/abs/2007.00347v2
- Date: Thu, 2 Jul 2020 19:16:47 GMT
- Title: Continuous-Time Bayesian Networks with Clocks
- Authors: Nicolai Engelmann, Dominik Linzner, Heinz Koeppl
- Abstract summary: We introduce a set of node-wise clocks to construct a collection of graph-coupled semi-Markov chains.
We provide algorithms for parameter and structure inference, which make use of local dependencies.
- Score: 33.774970857450086
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Structured stochastic processes evolving in continuous time present a widely
adopted framework to model phenomena occurring in nature and engineering.
However, such models are often chosen to satisfy the Markov property to
maintain tractability. One of the more popular such memoryless models is the
Continuous-Time Bayesian Network (CTBN). In this work, we lift its
restriction to exponential survival times and allow arbitrary distributions. Current
extensions achieve this via auxiliary states, which hinder tractability. To
avoid that, we introduce a set of node-wise clocks to construct a collection of
graph-coupled semi-Markov chains. We provide algorithms for parameter and
structure inference that make use of local dependencies, and we conduct
experiments on synthetic data and a dataset generated with a benchmark tool
for gene regulatory networks. In doing so, we point out advantages over
current CTBN extensions.
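To illustrate the clock idea (a minimal sketch, not the paper's actual algorithm), the following toy simulation runs two binary nodes whose holding times are drawn from node-wise Weibull clocks, so survival times are non-exponential; the parent's state modulates the child's clock scale. All parameter choices and the reset rule are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: node 1's clock depends on node 0's state.
# Each node keeps its own clock: a Weibull holding time drawn when the
# node (re-)enters a state, so survival times need not be exponential.
parents = {0: [], 1: [0]}
state = np.array([0, 0])

def draw_holding_time(node, state):
    # Hypothetical parameterization: parent states modulate the scale.
    scale = 1.0 + sum(state[p] for p in parents[node])
    return scale * rng.weibull(2.0)  # shape k=2 -> non-memoryless

# clocks hold the absolute firing time of each node's next transition
clocks = np.array([draw_holding_time(i, state) for i in range(2)])
t, horizon, events = 0.0, 10.0, []
while True:
    i = int(np.argmin(clocks))       # node whose clock expires next
    t = clocks[i]
    if t > horizon:
        break
    state[i] ^= 1                    # flip the binary state
    events.append((t, i, int(state[i])))
    clocks[i] = t + draw_holding_time(i, state)
    # A parent changing state invalidates its children's clocks;
    # here, node 0 firing forces node 1 to redraw.
    if i == 0:
        clocks[1] = t + draw_holding_time(1, state)
print(len(events), "transitions simulated")
```

The event list records (time, node, new state) triples; coupling enters only through the parent-dependent clock scale, which is what keeps dependencies local.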
Related papers
- ChronoGAN: Supervised and Embedded Generative Adversarial Networks for Time Series Generation [0.9374652839580181]
We introduce a robust framework aimed at mitigating these issues effectively.
This framework integrates the benefits of an Autoencoder-generated embedding space with the adversarial training dynamics of GANs.
We introduce an early generation algorithm and an improved neural network architecture to enhance stability and ensure effective generalization across both short and long time series.
arXiv Detail & Related papers (2024-09-21T04:51:35Z) - A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
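The "fully-conjugate" Gibbs sampler rests on Gamma-Poisson conjugacy: a Gamma prior on a Poisson rate yields a closed-form Gamma posterior, so each rate can be resampled exactly within a Gibbs sweep. A minimal sketch with hypothetical hyperparameters (not the PGDS model itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gamma(a, b) prior on a Poisson rate; observing counts x gives a
# Gamma(a + sum(x), b + n) posterior, enabling exact Gibbs updates.
a, b = 2.0, 1.0                      # hypothetical hyperparameters
x = rng.poisson(lam=3.0, size=50)    # synthetic counts, true rate 3.0

a_post = a + x.sum()
b_post = b + len(x)
samples = rng.gamma(shape=a_post, scale=1.0 / b_post, size=5000)
print(samples.mean())  # concentrates near the posterior mean a_post / b_post
```

The posterior mean (a + Σx) / (b + n) shrinks the empirical rate toward the prior, with shrinkage vanishing as n grows.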
arXiv Detail & Related papers (2024-02-26T04:39:01Z) - Learning Time-aware Graph Structures for Spatially Correlated Time Series Forecasting [30.93275270960829]
We propose Time-aware Graph Structure Learning (TagSL), which extracts time-aware correlations among time series.
We also present a Graph Convolution-based Gated Recurrent Unit (GCGRU), that jointly captures spatial and temporal dependencies.
Finally, we introduce a unified framework named Time-aware Graph Convolutional Recurrent Network (TGCRN), combining TagSL, GCGRU in an encoder-decoder architecture for multi-step-temporal forecasting.
arXiv Detail & Related papers (2023-12-27T04:23:43Z) - Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
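The essence of the Koopman-inspired prior is that the latent dynamics are linear even when the observation map is not. A minimal sketch, using a hand-picked rotation matrix and a stand-in nonlinear decoder rather than anything learned:

```python
import numpy as np

# Koopman-style prior: latent dynamics are linear, z_{t+1} = A @ z_t,
# while the decoder mapping latents to observations may be nonlinear.
# A rotation matrix (hypothetical choice) yields stable oscillations.
theta = 0.1
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def decode(z):
    # stand-in for a learned nonlinear decoder
    return np.tanh(z)

z = np.array([1.0, 0.0])
traj = []
for _ in range(100):
    traj.append(decode(z))
    z = A @ z

traj = np.stack(traj)
print(traj.shape)  # (100, 2)
```

Because A is orthogonal, the latent state stays on the unit circle; spectral properties of A (here, eigenvalues on the unit circle) directly control the long-run behavior of the generated series.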
arXiv Detail & Related papers (2023-10-04T07:14:43Z) - Graph-Survival: A Survival Analysis Framework for Machine Learning on Temporal Networks [14.430635608400982]
We propose a framework for designing generative models for continuous time temporal networks.
We propose a fitting method for models within this framework, and an algorithm for simulating new temporal networks having desired properties.
arXiv Detail & Related papers (2022-03-14T16:40:57Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Continuous Latent Process Flows [47.267251969492484]
Partial observations of continuous time-series dynamics at arbitrary time stamps exist in many disciplines. Fitting this type of data with statistical models that have continuous dynamics is not only intuitively appealing but also practically beneficial.
We tackle these challenges with continuous latent process flows (CLPF), a principled architecture decoding continuous latent processes into continuous observable processes using a time-dependent normalizing flow driven by a differential equation.
Our ablation studies demonstrate the effectiveness of our contributions in various inference tasks on irregular time grids.
arXiv Detail & Related papers (2021-06-29T17:16:04Z) - Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z) - Propagation for Dynamic Continuous Time Chain Event Graphs [0.0]
We present a tractable exact inferential scheme analogous to the scheme in Kjaerulff (1992) for discrete Dynamic Bayesian Networks (DBNs).
We show that the CT-DCEG is preferred to DBNs and continuous time BNs under contexts involving significant asymmetry and a natural total ordering of the process evolution.
arXiv Detail & Related papers (2020-06-29T08:24:57Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the phase space's properties.
Our approach is competitive with or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.