A Variational Autoencoder for Neural Temporal Point Processes with
Dynamic Latent Graphs
- URL: http://arxiv.org/abs/2312.16083v2
- Date: Fri, 8 Mar 2024 02:41:37 GMT
- Title: A Variational Autoencoder for Neural Temporal Point Processes with
Dynamic Latent Graphs
- Authors: Sikun Yang, Hongyuan Zha
- Abstract summary: We propose a novel variational auto-encoder to capture such a mixture of temporal dynamics.
The model predicts the future event times, by using the learned dependency graph to remove the noncontributing influences of past events.
- Score: 45.98786737053273
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Continuously-observed event occurrences often exhibit self- and
mutually-exciting effects, which can be well modeled using temporal point
processes. Beyond that, these event dynamics may also change over time, with
certain periodic trends. We propose a novel variational auto-encoder to capture
such a mixture of temporal dynamics. More specifically, the whole time interval
of the input sequence is partitioned into a set of sub-intervals. The event
dynamics are assumed to be stationary within each sub-interval, but could be
changing across those sub-intervals. In particular, we use a sequential latent
variable model to learn a dependency graph between the observed dimensions, for
each sub-interval. The model predicts the future event times, by using the
learned dependency graph to remove the noncontributing influences of past
events. By doing so, the proposed model achieves higher accuracy in
predicting inter-event times and event types on several real-world event
sequences than existing state-of-the-art neural point processes.
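The core prediction idea above can be illustrated with a toy sketch: a binary dependency graph gates which past events are allowed to influence each dimension's intensity. This is a hypothetical, simplified Hawkes-style illustration, not the paper's model; in the paper the graph is inferred per sub-interval by a sequential latent variable model, whereas here it is fixed, and the names `masked_intensity`, `A`, `mu`, `alpha`, and `beta` are all assumptions introduced for this example.

```python
import numpy as np

def masked_intensity(t, events, A, mu, alpha=1.0, beta=2.0):
    """Toy intensity of each event type at time t (Hawkes-style, assumed).

    events: list of (time, type) pairs with time < t
    A: (U, U) binary matrix; A[u, v] = 1 if past events of type v
       are allowed to excite type u (the "dependency graph")
    mu: (U,) baseline rates
    """
    lam = mu.copy()
    for (t_i, v) in events:
        # Exponentially decaying excitation, gated by the dependency graph:
        # rows of A where A[:, v] == 0 ignore this past event entirely.
        lam += A[:, v] * alpha * np.exp(-beta * (t - t_i))
    return lam

mu = np.array([0.1, 0.2])
A = np.array([[1, 0],   # type 0 is excited only by itself
              [1, 1]])  # type 1 is excited by both types
events = [(0.5, 0), (0.8, 1)]
lam = masked_intensity(1.0, events, A, mu)
```

Removing an edge from `A` removes that past-event channel's contribution, which is the "remove the noncontributing influences of past events" mechanism in miniature.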
Related papers
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Motion Code: Robust Time Series Classification and Forecasting via Sparse Variational Multi-Stochastic Processes Learning [3.2857981869020327]
We propose a novel framework that views each time series as a realization of a continuous-time process.
This mathematical approach captures dependencies across timestamps and detects hidden, time-varying signals within the noise.
Experiments on noisy datasets, including real-world Parkinson's disease sensor tracking, demonstrate Motion Code's strong performance against established benchmarks.
arXiv Detail & Related papers (2024-02-21T19:10:08Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Interacting Diffusion Processes for Event Sequence Forecasting [20.380620709345898]
We introduce a novel approach that incorporates a diffusion generative model.
The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences.
We demonstrate that our proposal outperforms state-of-the-art baselines for long-horizon forecasting of TPPs.
arXiv Detail & Related papers (2023-10-26T22:17:25Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Variational Neural Temporal Point Process [22.396329275957996]
A temporal point process is a process that predicts which type of event is likely to happen next and when it will occur.
We introduce inference and generative networks, and train a distribution over latent variables to capture the properties of event sequences with deep neural networks.
We empirically demonstrate that our model can generalize the representations of various event types.
arXiv Detail & Related papers (2022-02-17T13:34:30Z)
- Long-Range Transformers for Dynamic Spatiotemporal Forecasting [16.37467119526305]
Methods based on graph neural networks explicitly model variable relationships.
Long-Range Transformers can learn interactions between time, value, and information jointly along this extended sequence.
arXiv Detail & Related papers (2021-09-24T22:11:46Z)
- Event2Graph: Event-driven Bipartite Graph for Multivariate Time-series Anomaly Detection [25.832983667044708]
We propose a dynamic bipartite graph structure to encode the inter-dependencies between time-series.
Based on this design, relations between time series can be explicitly modelled via dynamic connections to event nodes.
arXiv Detail & Related papers (2021-08-15T17:50:37Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
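Several entries above estimate intensity functions with neural networks. As a minimal, purely illustrative sketch (assumed, not any cited paper's architecture), a tiny feed-forward network can map simple history features, such as the time since each channel's last event, to a strictly positive intensity via a softplus output; the weights `W1`, `b1`, `W2`, `b2` here are random placeholders, not trained parameters.

```python
import numpy as np

# Hypothetical two-layer network: history features -> positive intensity.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 2)); b1 = np.zeros(8)   # hidden layer weights
W2 = rng.normal(size=(1, 8)); b2 = np.zeros(1)   # output layer weights

def intensity(features):
    """Map a feature vector (e.g. times since last events) to lambda > 0."""
    h = np.tanh(W1 @ features + b1)     # hidden representation
    z = W2 @ h + b2                     # unconstrained score
    return np.log1p(np.exp(z))          # softplus keeps the intensity positive

lam = intensity(np.array([0.3, 1.2]))
```

In practice such a network would be trained by maximizing the point-process log-likelihood over observed event times; the softplus output is a common way to enforce the positivity constraint on the intensity.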
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.