Deep Representation Learning for Prediction of Temporal Event Sets in the Continuous Time Domain
- URL: http://arxiv.org/abs/2309.17009v1
- Date: Fri, 29 Sep 2023 06:46:31 GMT
- Title: Deep Representation Learning for Prediction of Temporal Event Sets in the Continuous Time Domain
- Authors: Parag Dutta, Kawin Mayilvaghanan, Pratyaksha Sinha, Ambedkar Dukkipati
- Abstract summary: Temporal Point Processes play an important role in predicting or forecasting events.
We propose a scalable and efficient approach based on TPPs to solve this problem.
- Score: 9.71405768795797
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal Point Processes (TPP) play an important role in predicting or
forecasting events. Although these problems have been studied extensively,
predicting multiple simultaneously occurring events can be challenging. For
instance, more often than not, a patient gets admitted to a hospital with
multiple conditions at a time. Similarly, people buy more than one stock at
once, and multiple news stories break at the same time. Moreover, these events do not occur
at discrete time intervals, and forecasting event sets in the continuous time
domain remains an open problem. Naive approaches for extending the existing TPP
models for solving this problem lead to dealing with an exponentially large
number of events or ignoring set dependencies among events. In this work, we
propose a scalable and efficient approach based on TPPs to solve this problem.
Our proposed approach incorporates contextual event embeddings, temporal
information, and domain features to model the temporal event sets. We
demonstrate the effectiveness of our approach through extensive experiments on
multiple datasets, showing that our model outperforms existing methods in terms
of prediction metrics and computational efficiency. To the best of our
knowledge, this is the first work that solves the problem of predicting event
set intensities in the continuous time domain by using TPPs.
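As a rough illustration of the kind of model the abstract describes (not the authors' exact architecture), the sketch below embeds each event set by pooling the embeddings of its member types, encodes the history with a recurrent network, and emits one positive intensity per event type. All module names, sizes, and the GRU choice are assumptions.

```python
import torch
import torch.nn as nn

class EventSetTPP(nn.Module):
    """Minimal sketch of a TPP over event *sets*: each timestamp carries a
    set of event types, pooled into one embedding, which avoids enumerating
    the exponentially many possible sets."""

    def __init__(self, num_event_types: int, dim: int = 64):
        super().__init__()
        self.type_emb = nn.Embedding(num_event_types, dim)
        self.rnn = nn.GRU(dim + 1, dim, batch_first=True)  # +1 for inter-event time
        self.intensity_head = nn.Linear(dim, num_event_types)

    def forward(self, event_sets, deltas):
        # event_sets: (batch, seq, num_event_types) multi-hot sets per timestamp
        # deltas:     (batch, seq, 1) inter-event times
        set_emb = event_sets @ self.type_emb.weight          # sum-pool the set
        hidden, _ = self.rnn(torch.cat([set_emb, deltas], dim=-1))
        # softplus keeps per-type intensities positive
        return nn.functional.softplus(self.intensity_head(hidden))

model = EventSetTPP(num_event_types=20)
sets = torch.zeros(2, 5, 20); sets[:, :, :3] = 1.0   # toy multi-hot event sets
lam = model(sets, torch.rand(2, 5, 1))               # (2, 5, 20) intensities
```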
Related papers
- XTSFormer: Cross-Temporal-Scale Transformer for Irregular Time Event Prediction [9.240950990926796]
Event prediction aims to forecast the time and type of a future event based on a historical event sequence.
Despite its significance, several challenges exist, including the irregularity of time intervals between consecutive events, the existence of cycles, periodicity, and multi-scale event interactions.
arXiv Detail & Related papers (2024-02-03T20:33:39Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
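A minimal sketch of the CDF-centric idea, assuming a monotone network (not the paper's exact architecture): keeping the weights on the time input positive makes F(t | history) non-decreasing in t, and the intensity can be recovered as f(t) / (1 - F(t)) by differentiating through autograd.

```python
import torch
import torch.nn as nn

class MonotoneCDF(nn.Module):
    """Sketch of a CDF-based TPP head: a network monotone in t models
    F(t | history); normalization so that F(0) = 0 is omitted here."""

    def __init__(self, hist_dim: int = 32, hidden: int = 32):
        super().__init__()
        self.w_t = nn.Parameter(torch.rand(1, hidden))   # kept positive below
        self.w_h = nn.Linear(hist_dim, hidden)
        self.out = nn.Parameter(torch.rand(hidden, 1))   # kept positive below

    def forward(self, t, h):
        # t: (batch, 1) elapsed time; h: (batch, hist_dim) history encoding
        z = torch.tanh(t @ self.w_t.abs() + self.w_h(h))
        return torch.sigmoid(z @ self.out.abs())          # F(t | h) in (0, 1)

cdf = MonotoneCDF()
t = torch.tensor([[0.5]], requires_grad=True)
F = cdf(t, torch.zeros(1, 32))
f = torch.autograd.grad(F.sum(), t, create_graph=True)[0]  # density f = dF/dt
intensity = f / (1.0 - F)                                  # hazard function
```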
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- HyperHawkes: Hypernetwork based Neural Temporal Point Process [5.607676459156789]
The temporal point process serves as an essential tool for modeling time-to-event data in continuous time space.
However, existing models do not generalize to predicting events from unseen sequences in a dynamic environment.
We propose HyperHawkes, a hypernetwork-based temporal point process framework.
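A minimal sketch of the hypernetwork idea, with all details assumed: a small network maps a sequence-level embedding to the parameters of the intensity head, so each (possibly unseen) sequence gets its own adapted head.

```python
import torch
import torch.nn as nn

class HyperIntensity(nn.Module):
    """Sketch of a hypernetwork-conditioned intensity head (details assumed,
    not HyperHawkes' exact design)."""

    def __init__(self, seq_dim: int = 16, hid: int = 32):
        super().__init__()
        self.hid = hid
        # the hypernetwork emits a weight vector and bias for the head
        self.hyper = nn.Linear(seq_dim, hid + 1)

    def forward(self, h_t, seq_emb):
        # h_t: (batch, hid) event-history state; seq_emb: (batch, seq_dim)
        params = self.hyper(seq_emb)
        w, b = params[:, :self.hid], params[:, self.hid:]
        score = (h_t * w).sum(-1, keepdim=True) + b       # per-sequence head
        return nn.functional.softplus(score)              # positive intensity

lam = HyperIntensity()(torch.randn(4, 32), torch.randn(4, 16))
```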
arXiv Detail & Related papers (2022-10-01T07:14:19Z)
- Modeling Continuous Time Sequences with Intermittent Observations using Marked Temporal Point Processes [25.074394338483575]
A large fraction of data generated via human activities can be represented as a sequence of events over continuous time.
Training deep learning models over these continuous-time event sequences is a non-trivial task.
In this work, we provide a novel unsupervised model and inference method for learning an MTPP in the presence of event sequences with missing events.
arXiv Detail & Related papers (2022-06-23T18:23:20Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Variational Neural Temporal Point Process [22.396329275957996]
A temporal point process is a process that predicts which type of event is likely to happen and when it will occur.
We introduce inference and generative networks, and train a distribution over latent variables to capture stochastic properties within the deep neural network.
We empirically demonstrate that our model can generalize the representations of various event types.
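The latent-variable idea can be sketched as follows, with the architecture assumed: an inference network produces a Gaussian over latent variables, a reparameterized sample conditions the intensity, and a KL term regularizes toward a standard-normal prior.

```python
import torch
import torch.nn as nn

class VariationalTPPHead(nn.Module):
    """Sketch of a variational TPP head (details assumed): latent z is
    sampled via the reparameterization trick and decoded to an intensity."""

    def __init__(self, hist_dim: int = 32, z_dim: int = 8):
        super().__init__()
        self.infer = nn.Linear(hist_dim, 2 * z_dim)   # -> (mu, logvar)
        self.decode = nn.Linear(z_dim, 1)

    def forward(self, h):
        mu, logvar = self.infer(h).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        # KL divergence of N(mu, sigma^2) from the N(0, I) prior
        kl = 0.5 * (mu**2 + logvar.exp() - logvar - 1).sum(-1)
        return nn.functional.softplus(self.decode(z)), kl

intensity, kl = VariationalTPPHead()(torch.randn(4, 32))
```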
arXiv Detail & Related papers (2022-02-17T13:34:30Z)
- Learning Neural Models for Continuous-Time Sequences [0.0]
We study the properties of continuous-time event sequences (CTES) and design robust yet scalable neural network-based models to overcome the aforementioned problems.
In this work, we model the underlying generative distribution of events using marked temporal point processes (MTPP) to address a wide range of real-world problems.
arXiv Detail & Related papers (2021-11-13T20:39:15Z)
- Interval-censored Hawkes processes [82.87738318505582]
We propose a model to estimate the parameters of a Hawkes process in interval-censored settings.
We show how a non-homogeneous approximation to the Hawkes process admits a tractable likelihood in the interval-censored setting.
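A hedged sketch of the non-homogeneous Poisson approximation: given only event counts per interval, each interval contributes a Poisson likelihood term whose mean is the integrated intensity over that interval. The toy rate function and grid integration below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

def poisson_interval_loglik(counts, edges, rate_fn, grid: int = 100):
    """counts[i] = events observed in [edges[i], edges[i+1]); rate_fn(t) is
    the approximating intensity, integrated on a fixed grid per interval."""
    ll = 0.0
    for c, lo, hi in zip(counts, edges[:-1], edges[1:]):
        ts = np.linspace(lo, hi, grid)
        rates = rate_fn(ts)
        # trapezoidal rule for the integrated intensity over [lo, hi)
        big_lambda = np.sum((rates[:-1] + rates[1:]) / 2.0 * np.diff(ts))
        ll += c * np.log(big_lambda) - big_lambda  # Poisson term, log c! dropped
    return ll

rate = lambda t: 0.5 + 0.3 * np.exp(-t)            # toy decaying intensity
print(poisson_interval_loglik([3, 1], np.array([0.0, 1.0, 2.0]), rate))
```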
arXiv Detail & Related papers (2021-04-16T07:29:04Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
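The set-level co-occurrence graph can be sketched as follows (the construction details are assumed): elements appearing in the same set are connected by an edge weighted by their joint-occurrence count across the history.

```python
from collections import defaultdict
from itertools import combinations

def build_cooccurrence_graph(set_sequences):
    """Weighted undirected edges between elements that co-occur in a set."""
    weights = defaultdict(int)
    for sequence in set_sequences:        # one user's history of sets
        for event_set in sequence:
            for a, b in combinations(sorted(event_set), 2):
                weights[(a, b)] += 1      # undirected edge weight
    return dict(weights)

history = [[{"milk", "bread"}, {"milk", "eggs", "bread"}]]
print(build_cooccurrence_graph(history))
# {('bread', 'milk'): 2, ('bread', 'eggs'): 1, ('eggs', 'milk'): 1}
```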
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
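A minimal sketch of the winner-takes-all training signal commonly used with MHP-style models (the paper's exact loss and metric are not reproduced here): the model emits K candidate futures and only the candidate closest to the ground truth receives gradient, so heads specialize on distinct modes.

```python
import torch

def winner_takes_all_loss(hypotheses, target):
    # hypotheses: (batch, K, dim) candidate predictions; target: (batch, dim)
    errors = ((hypotheses - target.unsqueeze(1)) ** 2).sum(-1)  # (batch, K)
    best = errors.min(dim=1).values          # gradient flows to best head only
    return best.mean()

loss = winner_takes_all_loss(torch.randn(8, 5, 3), torch.randn(8, 3))
```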
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
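A minimal sketch in the spirit of a self-attention point process like THP, with all sizes and layer choices assumed: event-type and time embeddings are summed, passed through a causally masked Transformer encoder layer, and mapped to positive per-type intensities.

```python
import torch
import torch.nn as nn

class TinyTHP(nn.Module):
    """Sketch of a self-attention point process (not THP's exact design):
    causal attention captures long-term dependencies across events."""

    def __init__(self, num_types: int, dim: int = 32):
        super().__init__()
        self.emb = nn.Embedding(num_types, dim)
        self.time = nn.Linear(1, dim)
        self.attn = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.head = nn.Linear(dim, num_types)

    def forward(self, types, times):
        # types: (batch, seq) event-type ids; times: (batch, seq) timestamps
        x = self.emb(types) + self.time(times.unsqueeze(-1))
        causal = nn.Transformer.generate_square_subsequent_mask(types.size(1))
        h = self.attn(x, src_mask=causal)     # causal long-range attention
        return nn.functional.softplus(self.head(h))

lam = TinyTHP(num_types=5)(torch.randint(0, 5, (2, 7)),
                           torch.rand(2, 7).cumsum(-1))
```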