Summary Markov Models for Event Sequences
- URL: http://arxiv.org/abs/2205.03375v1
- Date: Fri, 6 May 2022 17:16:24 GMT
- Title: Summary Markov Models for Event Sequences
- Authors: Debarun Bhattacharjya, Saurabh Sihag, Oktie Hassanzadeh, Liza Bialik
- Abstract summary: We propose a family of models for sequences of different types of events without meaningful time stamps.
The probability of observing an event type depends only on a summary of historical occurrences of its influencing set of event types.
We show that a unique minimal influencing set exists for any set of event types of interest and choice of summary function.
- Score: 23.777457032885813
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Datasets involving sequences of different types of events without meaningful
time stamps are prevalent in many applications, for instance when extracted
from textual corpora. We propose a family of models for such event sequences --
summary Markov models -- where the probability of observing an event type
depends only on a summary of historical occurrences of its influencing set of
event types. This Markov model family is motivated by Granger causal models for
time series, with the important distinction that only one event can occur in a
position in an event sequence. We show that a unique minimal influencing set
exists for any set of event types of interest and choice of summary function,
formulate two novel models from the general family that represent specific
sequence dynamics, and propose a greedy search algorithm for learning them from
event sequence data. We conduct an experimental investigation comparing the
proposed models with relevant baselines, and illustrate their knowledge
acquisition and discovery capabilities through case studies involving sequences
from text.
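To make the model family more concrete, here is a minimal, illustrative Python sketch of one possible member of it. It assumes a binary summary function (whether each influencing event type appears among the last k positions), add-alpha smoothing of the conditional probabilities, and a BIC-style score driving a greedy forward search over influencing sets. The function names, the smoothing, and the scoring choice are assumptions for illustration, not the authors' exact models or learning algorithm.

```python
# Sketch of a binary-summary Markov model for event sequences without time
# stamps. Assumed details (not from the abstract): the summary is the set of
# influencing event types seen in the last k positions, each target event is
# modeled as a Bernoulli given that summary, and a greedy forward search with
# a BIC-style penalty selects the influencing set.
from collections import Counter
import math


def summarize(history, influencing, k):
    """Binary summary: which influencing event types occur in the last k events."""
    window = history[-k:]
    return tuple(sorted(e for e in influencing if e in window))


def log_likelihood(sequence, target, influencing, k, alpha=1.0):
    """Log-likelihood of observing / not observing `target` at each position,
    conditioned on the binary summary of its influencing set."""
    counts = Counter()   # (summary, is_target) -> count
    totals = Counter()   # summary -> count
    for i, event in enumerate(sequence):
        s = summarize(sequence[:i], influencing, k)
        counts[(s, event == target)] += 1
        totals[s] += 1
    ll = 0.0
    for (s, _is_target), c in counts.items():
        p = (c + alpha) / (totals[s] + 2 * alpha)   # add-alpha smoothing
        ll += c * math.log(p)
    return ll


def greedy_influencing_set(sequence, target, candidates, k=5):
    """Greedy forward search for an influencing set of `target` (a sketch of
    the greedy learning idea, not the authors' exact algorithm)."""
    n = len(sequence)

    def bic(infl):
        n_params = 2 ** len(infl)   # one Bernoulli per binary summary state
        return log_likelihood(sequence, target, infl, k) - 0.5 * n_params * math.log(n)

    selected = set()
    best = bic(selected)
    improved = True
    while improved:
        improved = False
        for c in candidates - selected:
            score = bic(selected | {c})
            if score > best:
                best, add = score, c
                improved = True
        if improved:
            selected.add(add)
    return selected, best


if __name__ == "__main__":
    # Toy event sequence without time stamps: 'b' tends to follow 'a'.
    seq = list("acaxbcaxbacbxacbaxcb")
    infl, score = greedy_influencing_set(seq, target="b", candidates={"a", "c", "x"})
    print("learned influencing set for 'b':", infl, "score:", round(score, 2))
```

The greedy step here only adds event types and stops when the score no longer improves; the two concrete models proposed in the paper and their summary functions are specified in the full text.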
Related papers
- Improving Event Definition Following For Zero-Shot Event Detection [66.27883872707523]
Existing approaches on zero-shot event detection usually train models on datasets annotated with known event types.
We aim to improve zero-shot event detection by training models to better follow event definitions.
arXiv Detail & Related papers (2024-03-05T01:46:50Z)
- Distilling Event Sequence Knowledge From Large Language Models [17.105913216452738]
Event sequence models have been found to be highly effective in the analysis and prediction of events.
We use Large Language Models to generate event sequences that can effectively be used for probabilistic event model construction.
We show that our approach can generate high-quality event sequences, filling a knowledge gap in the input KG.
arXiv Detail & Related papers (2024-01-14T09:34:42Z)
- Probabilistic Modeling for Sequences of Sets in Continuous-Time [14.423456635520084]
We develop a general framework for modeling set-valued data in continuous-time.
We also develop inference methods that can use such models to answer probabilistic queries.
arXiv Detail & Related papers (2023-12-22T20:16:10Z)
- Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, the next-event prediction models are trained with sequential data collected at one time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z)
- Zero-Shot On-the-Fly Event Schema Induction [61.91468909200566]
We present a new approach in which large language models are utilized to generate source documents that allow predicting, given a high-level event definition, the specific events, arguments, and relations between them.
Using our model, complete schemas on any topic can be generated on-the-fly without any manual data collection, i.e., in a zero-shot manner.
arXiv Detail & Related papers (2022-10-12T14:37:00Z)
- Modeling Continuous Time Sequences with Intermittent Observations using Marked Temporal Point Processes [25.074394338483575]
A large fraction of data generated by human activities can be represented as a sequence of events over continuous time.
Building deep learning models over these continuous-time event sequences is a non-trivial task.
In this work, we provide a novel unsupervised model and inference method for learning marked temporal point processes (MTPPs) in the presence of event sequences with missing events.
arXiv Detail & Related papers (2022-06-23T18:23:20Z)
- PILED: An Identify-and-Localize Framework for Few-Shot Event Detection [79.66042333016478]
In our study, we employ cloze prompts to elicit event-related knowledge from pretrained language models.
We minimize the number of type-specific parameters, enabling our model to quickly adapt to event detection tasks for new types.
arXiv Detail & Related papers (2022-02-15T18:01:39Z)
- COHORTNEY: Deep Clustering for Heterogeneous Event Sequences [9.811178291117496]
Clustering of event sequences is widely applicable in domains such as healthcare, marketing, and finance.
We propose COHORTNEY as a novel deep learning method for clustering heterogeneous event sequences.
Our results show that COHORTNEY vastly outperforms the state-of-the-art algorithm for clustering event sequences in both speed and cluster quality.
arXiv Detail & Related papers (2021-04-03T16:12:21Z)
- Conditional Generation of Temporally-ordered Event Sequences [29.44608199294757]
We present a conditional generation model capable of capturing event co-occurrence as well as the temporality of event sequences.
This single model can address both temporal ordering, sorting a given sequence of events into the order they occurred, and event infilling, predicting new events which fit into a temporally-ordered sequence of existing ones.
arXiv Detail & Related papers (2020-12-31T18:10:18Z)
- Multi-Scale One-Class Recurrent Neural Networks for Discrete Event Sequence Anomaly Detection [63.825781848587376]
We propose OC4Seq, a one-class recurrent neural network for detecting anomalies in discrete event sequences.
Specifically, OC4Seq embeds the discrete event sequences into latent spaces, where anomalies can be easily detected.
arXiv Detail & Related papers (2020-08-31T04:48:22Z)
- Team RUC_AIM3 Technical Report at Activitynet 2020 Task 2: Exploring Sequential Events Detection for Dense Video Captioning [63.91369308085091]
We propose a novel and simple model for event sequence generation and explore temporal relationships of the event sequence in the video.
The proposed model omits inefficient two-stage proposal generation and directly generates event boundaries conditioned on bi-directional temporal dependency in one pass.
The overall system achieves state-of-the-art performance on the dense-captioning events in video task with 9.894 METEOR score on the challenge testing set.
arXiv Detail & Related papers (2020-06-14T13:21:37Z)