Conditional Generation of Temporally-ordered Event Sequences
- URL: http://arxiv.org/abs/2012.15786v1
- Date: Thu, 31 Dec 2020 18:10:18 GMT
- Title: Conditional Generation of Temporally-ordered Event Sequences
- Authors: Shih-Ting Lin, Nathanael Chambers, Greg Durrett
- Abstract summary: We present a conditional generation model capable of capturing event co-occurrence as well as temporality of event sequences.
This single model can address both temporal ordering, sorting a given sequence of events into the order they occurred, and event infilling, predicting new events which fit into a temporally-ordered sequence of existing ones.
- Score: 29.44608199294757
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Models encapsulating narrative schema knowledge have proven to be useful for
a range of event-related tasks, but these models typically do not engage with
temporal relationships between events. We present a BART-based conditional
generation model capable of capturing event co-occurrence as well as temporality
of event sequences. This single model can address both temporal ordering,
sorting a given sequence of events into the order they occurred, and event
infilling, predicting new events which fit into a temporally-ordered sequence
of existing ones. Our model is trained as a denoising autoencoder: we take
temporally-ordered event sequences, shuffle them, delete some events, and then
attempt to recover the original event sequence. In this fashion, the model
learns to make inferences given incomplete knowledge about the events in an
underlying scenario. On the temporal ordering task, we show that our model is
able to unscramble event sequences from existing datasets without access to
explicitly labeled temporal training data, outperforming both a BERT-based
pairwise model and a BERT-based pointer network. On event infilling, human
evaluation shows that our model is able to generate events that fit better
temporally into the input events when compared to GPT-2 story completion
models.
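The denoising-autoencoder training scheme described above (shuffle a temporally-ordered event sequence, delete some events, and train the model to recover the original) can be sketched as follows. This is an illustrative reconstruction only: the deletion rate, the corruption order, and the plain-string event representation are assumptions, not the paper's exact setup.

```python
import random

def corrupt(events, delete_prob=0.15, seed=None):
    """Corrupt a temporally-ordered event sequence for denoising training:
    randomly delete events, then shuffle the survivors.
    (Hypothetical parameters; the paper's exact scheme may differ.)"""
    rng = random.Random(seed)
    kept = [e for e in events if rng.random() > delete_prob]
    if not kept:  # always keep at least one event as input
        kept = [rng.choice(events)]
    rng.shuffle(kept)
    return kept

# A seq2seq denoiser is trained on (corrupted source, original target) pairs:
ordered = ["wake up", "shower", "eat breakfast", "commute", "start work"]
source = corrupt(ordered, seed=0)   # shuffled, possibly missing events
target = ordered                    # model must restore order and infill deletions
```

Because the target is always the clean ordered sequence, the same trained model can be queried for ordering (input is shuffled but complete) or infilling (input is ordered but has gaps).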
Related papers
- EventFlow: Forecasting Continuous-Time Event Data with Flow Matching [12.976042923229466]
We propose EventFlow, a non-autoregressive generative model for temporal point processes.
Our model builds on the flow matching framework in order to directly learn joint distributions over event times, side-stepping the autoregressive process.
arXiv Detail & Related papers (2024-10-09T20:57:00Z) - Improving Event Definition Following For Zero-Shot Event Detection [66.27883872707523]
Existing approaches on zero-shot event detection usually train models on datasets annotated with known event types.
We aim to improve zero-shot event detection by training models to better follow event definitions.
arXiv Detail & Related papers (2024-03-05T01:46:50Z) - Distilling Event Sequence Knowledge From Large Language Models [17.105913216452738]
Event sequence models have been found to be highly effective in the analysis and prediction of events.
We use Large Language Models to generate event sequences that can effectively be used for probabilistic event model construction.
We show that our approach can generate high-quality event sequences, filling a knowledge gap in the input KG.
arXiv Detail & Related papers (2024-01-14T09:34:42Z) - HyperHawkes: Hypernetwork based Neural Temporal Point Process [5.607676459156789]
Temporal point processes serve as an essential tool for modeling time-to-event data in continuous time.
However, such models do not generalize to predicting events from unseen sequences in dynamic environments.
We propose HyperHawkes, a hypernetwork-based temporal point process framework.
arXiv Detail & Related papers (2022-10-01T07:14:19Z) - Unifying Event Detection and Captioning as Sequence Generation via Pre-Training [53.613265415703815]
We propose a unified pre-training and fine-tuning framework to enhance the inter-task association between event detection and captioning.
Our model outperforms the state-of-the-art methods, and can be further boosted when pre-trained on extra large-scale video-text data.
arXiv Detail & Related papers (2022-07-18T14:18:13Z) - Modeling Continuous Time Sequences with Intermittent Observations using Marked Temporal Point Processes [25.074394338483575]
A large fraction of data generated by human activities can be represented as a sequence of events in continuous time.
Learning deep models over these continuous-time event sequences is a non-trivial task.
In this work, we provide a novel unsupervised model and inference method for learning MTPP in presence of event sequences with missing events.
arXiv Detail & Related papers (2022-06-23T18:23:20Z) - Summary Markov Models for Event Sequences [23.777457032885813]
We propose a family of models for sequences of different types of events without meaningful time stamps.
The probability of observing an event type depends only on a summary of historical occurrences of its influencing set of event types.
We show that a unique minimal influencing set exists for any set of event types of interest and choice of summary function.
arXiv Detail & Related papers (2022-05-06T17:16:24Z) - Learning Temporal Rules from Noisy Timeseries Data [72.93572292157593]
We focus on uncovering the underlying atomic events and their relations that lead to the composite events within a noisy temporal data setting.
We propose a Neural Temporal Logic Programming (Neural TLP) which first learns implicit temporal relations between atomic events and then lifts logic rules for supervision.
arXiv Detail & Related papers (2022-02-11T01:29:02Z) - Event Data Association via Robust Model Fitting for Event-based Object Tracking [66.05728523166755]
We propose a novel Event Data Association (called EDA) approach to explicitly address the event association and fusion problem.
The proposed EDA seeks the event trajectories that best fit the event data in order to perform unified data association and information fusion.
The experimental results show the effectiveness of EDA under challenging scenarios, such as high speed, motion blur, and high dynamic range conditions.
arXiv Detail & Related papers (2021-10-25T13:56:00Z) - Future is not One-dimensional: Graph Modeling based Complex Event Schema Induction for Event Prediction [90.75260063651763]
We introduce the concept of Temporal Complex Event: a graph-based schema representation that encompasses events, arguments, temporal connections and argument relations.
We release a new schema learning corpus containing 6,399 documents accompanied with event graphs, and manually constructed gold schemas.
arXiv Detail & Related papers (2021-04-13T16:41:05Z) - Team RUC_AIM3 Technical Report at Activitynet 2020 Task 2: Exploring Sequential Events Detection for Dense Video Captioning [63.91369308085091]
We propose a novel and simple model for event sequence generation and explore temporal relationships of the event sequence in the video.
The proposed model omits inefficient two-stage proposal generation and directly generates event boundaries conditioned on bi-directional temporal dependency in one pass.
The overall system achieves state-of-the-art performance on the dense-captioning events in video task with 9.894 METEOR score on the challenge testing set.
arXiv Detail & Related papers (2020-06-14T13:21:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.