Modeling Continuous Time Sequences with Intermittent Observations using
Marked Temporal Point Processes
- URL: http://arxiv.org/abs/2206.12414v1
- Date: Thu, 23 Jun 2022 18:23:20 GMT
- Title: Modeling Continuous Time Sequences with Intermittent Observations using
Marked Temporal Point Processes
- Authors: Vinayak Gupta and Srikanta Bedathur and Sourangshu Bhattacharya and
Abir De
- Abstract summary: A large fraction of data generated via human activities can be represented as a sequence of events over continuous time.
Learning deep learning models over these continuous-time event sequences is a non-trivial task.
In this work, we provide a novel unsupervised model and inference method for learning MTPP in the presence of event sequences with missing events.
- Score: 25.074394338483575
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A large fraction of data generated via human activities such as
online purchases, health records, spatial mobility, etc. can be represented
as a sequence of events over continuous time. Learning deep learning models
over these continuous-time event sequences is a non-trivial task, as it
involves modeling the ever-increasing event timestamps, inter-event time
gaps, event types, and the influences between different events within and
across different sequences. In recent years, neural enhancements to marked
temporal point processes (MTPP) have emerged as a powerful framework to
model the underlying generative mechanism of asynchronous events localized
in continuous time. However, most existing models and inference methods in
the MTPP framework consider only the complete-observation scenario, i.e.,
the event sequence being modeled is completely observed with no missing
events -- an ideal setting that is rarely applicable in real-world
applications. A recent line of work that considers missing events while
training MTPP relies on supervised learning techniques that require an
additional missing-or-observed label for each event in a sequence, which
further restricts its practicability, since in several scenarios the details
of missing events are not known a priori. In this work, we provide a novel
unsupervised model and inference method for learning MTPP in the presence of
event sequences with missing events. Specifically, we first model the
generative processes of observed events and missing events using two MTPPs,
where the missing events are represented as latent random variables. Then,
we devise an unsupervised training method that jointly learns both MTPPs by
means of variational inference. Such a formulation can effectively impute
the missing data among the observed events and can identify the optimal
positions of missing events in a sequence.
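To make the formulation concrete, here is a brief sketch (in our own
notation, not drawn from the paper itself) of how such a variational
objective is typically written. Let the observed sequence be S = {(t_i, x_i)}
and treat the missing events M = {(tau_j, z_j)} as latent, with p_\theta the
generative MTPP over observed and missing events and q_\phi a variational
MTPP posterior over the missing events. An evidence lower bound of the usual
form is

  \log p_\theta(\mathcal{S}) \ge
    \mathbb{E}_{q_\phi(\mathcal{M} \mid \mathcal{S})}\left[ \log p_\theta(\mathcal{S}, \mathcal{M}) \right]
    - \mathbb{E}_{q_\phi(\mathcal{M} \mid \mathcal{S})}\left[ \log q_\phi(\mathcal{M} \mid \mathcal{S}) \right],

where each MTPP log-likelihood decomposes through its conditional intensity
\lambda^{*}(t, x) as

  \log p\left( \{(t_i, x_i)\}_{i=1}^{N} \right) =
    \sum_{i=1}^{N} \log \lambda^{*}(t_i, x_i) - \int_{0}^{T} \lambda^{*}(t)\, dt,
  \quad \text{with } \lambda^{*}(t) = \sum_{x} \lambda^{*}(t, x).

Maximizing this bound jointly over the generative and variational parameters
trains both point processes without requiring per-event missing/observed
labels; how the intensities and the variational posterior are parameterized
is specified in the paper itself.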
Related papers
- Improving Event Definition Following For Zero-Shot Event Detection [66.27883872707523]
Existing approaches to zero-shot event detection usually train models on datasets annotated with known event types.
We aim to improve zero-shot event detection by training models to better follow event definitions.
arXiv Detail & Related papers (2024-03-05T01:46:50Z)
- XTSFormer: Cross-Temporal-Scale Transformer for Irregular Time Event Prediction [9.240950990926796]
Event prediction aims to forecast the time and type of a future event based on a historical event sequence.
Despite its significance, several challenges exist, including the irregularity of time intervals between consecutive events, the existence of cycles, periodicity, and multi-scale event interactions.
arXiv Detail & Related papers (2024-02-03T20:33:39Z)
- Enhancing Asynchronous Time Series Forecasting with Contrastive Relational Inference [21.51753838306655]
Temporal point processes (TPPs) are the standard method for modeling such asynchronous time series.
Existing TPP models have focused on the conditional distribution of future events instead of explicitly modeling event interactions, which poses challenges for event prediction.
We propose a novel approach that leverages Neural Relational Inference (NRI) to learn a graph that infers interactions while simultaneously learning dynamics patterns from observational data.
arXiv Detail & Related papers (2023-09-06T09:47:03Z)
- Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, next-event prediction models are trained on sequential data collected at one point in time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z)
- HyperHawkes: Hypernetwork based Neural Temporal Point Process [5.607676459156789]
Temporal point processes serve as an essential tool for modeling time-to-event data in continuous time.
However, they do not generalize to predicting events from unseen sequences in a dynamic environment.
We propose HyperHawkes, a hypernetwork-based temporal point process framework.
arXiv Detail & Related papers (2022-10-01T07:14:19Z)
- Unifying Event Detection and Captioning as Sequence Generation via Pre-Training [53.613265415703815]
We propose a unified pre-training and fine-tuning framework to enhance the inter-task association between event detection and captioning.
Our model outperforms the state-of-the-art methods, and can be further boosted when pre-trained on extra large-scale video-text data.
arXiv Detail & Related papers (2022-07-18T14:18:13Z)
- CEP3: Community Event Prediction with Neural Point Process on Graph [59.434777403325604]
We propose a novel model combining Graph Neural Networks and Marked Temporal Point Processes (MTPP).
Our experiments demonstrate the superior performance of our model in terms of both model accuracy and training efficiency.
arXiv Detail & Related papers (2022-05-21T15:30:25Z)
- PILED: An Identify-and-Localize Framework for Few-Shot Event Detection [79.66042333016478]
In our study, we employ cloze prompts to elicit event-related knowledge from pretrained language models.
We minimize the number of type-specific parameters, enabling our model to quickly adapt to event detection tasks for new types.
arXiv Detail & Related papers (2022-02-15T18:01:39Z)
- Learning Neural Models for Continuous-Time Sequences [0.0]
We study the properties of continuous-time event sequences (CTES) and design robust yet scalable neural network-based models to overcome the aforementioned problems.
In this work, we model the underlying generative distribution of events using marked temporal point processes (MTPP) to address a wide range of real-world problems.
arXiv Detail & Related papers (2021-11-13T20:39:15Z)
- Conditional Generation of Temporally-ordered Event Sequences [29.44608199294757]
We present a conditional generation model capable of capturing event cooccurrence as well as temporality of event sequences.
This single model can address both temporal ordering, sorting a given sequence of events into the order they occurred, and event infilling, predicting new events which fit into a temporally-ordered sequence of existing ones.
arXiv Detail & Related papers (2020-12-31T18:10:18Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)