HyperHawkes: Hypernetwork based Neural Temporal Point Process
- URL: http://arxiv.org/abs/2210.00213v1
- Date: Sat, 1 Oct 2022 07:14:19 GMT
- Title: HyperHawkes: Hypernetwork based Neural Temporal Point Process
- Authors: Manisha Dubey, P.K. Srijith, Maunendra Sankar Desarkar
- Abstract summary: Temporal point processes serve as an essential tool for modeling time-to-event data in continuous time.
They do not generalize to predicting events from unseen sequences in dynamic environments.
We propose \textit{HyperHawkes}, a hypernetwork-based temporal point process framework.
- Score: 5.607676459156789
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal point processes serve as an essential tool for modeling time-to-event
data in continuous time. Despite massive amounts of event sequence data from
domains like social media and healthcare, real-world application of temporal
point processes faces two major challenges: 1) they do not generalize to
predicting events from unseen sequences in dynamic environments, and 2) they are
not capable of thriving in continually evolving environments with minimal
supervision while retaining previously learnt knowledge. To tackle
these issues, we propose \textit{HyperHawkes}, a hypernetwork based temporal
point process framework which is capable of modeling time of occurrence of
events for unseen sequences. Thereby, we solve the problem of zero-shot
learning for time-to-event modeling. We also develop a hypernetwork-based
continual-learning temporal point process for continuous modeling of
time-to-event sequences with minimal forgetting. In this way,
\textit{HyperHawkes} augments the temporal point process with zero-shot
modeling and continual learning capabilities. We demonstrate the application of
the proposed framework through our experiments on two real-world datasets. Our
results show the efficacy of the proposed approach in terms of predicting
future events under zero-shot regime for unseen event sequences. We also show
that the proposed model is able to predict sequences continually while
retaining information from previous event sequences, hence mitigating
catastrophic forgetting for time-to-event data.
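The core idea of the abstract, a hypernetwork that emits the parameters of a sequence-specific point process, can be illustrated with a minimal sketch. This is not the authors' architecture: the linear hypernetwork, the exponential-kernel Hawkes intensity, and all names (`hypernetwork`, `hawkes_intensity`, the embedding size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hypernetwork(seq_embedding, W, b):
    """Illustrative hypernetwork: map a sequence-level embedding to
    Hawkes parameters (mu, alpha, beta) via one linear layer, with a
    softplus to keep all intensity parameters positive."""
    raw = W @ seq_embedding + b
    return np.log1p(np.exp(raw))  # softplus -> (mu, alpha, beta) > 0

def hawkes_intensity(t, events, mu, alpha, beta):
    """Exponential-kernel Hawkes intensity at time t given past events:
    lambda(t) = mu + alpha * sum_i exp(-beta * (t - t_i))."""
    past = events[events < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

# Toy zero-shot usage: an unseen sequence is summarized by an embedding,
# and the hypernetwork emits the parameters of that sequence's own
# point process without retraining the main model.
emb = rng.normal(size=4)                      # hypothetical sequence embedding
W, b = rng.normal(size=(3, 4)), np.zeros(3)   # hypernetwork weights
mu, alpha, beta = hypernetwork(emb, W, b)
events = np.array([0.5, 1.2, 2.0])
lam = hawkes_intensity(2.5, events, mu, alpha, beta)
print(mu, alpha, beta, lam)
```

Because the parameters are produced per sequence rather than learned per sequence, replacing the embedding is enough to obtain a new intensity function, which is the sense in which such a design can support zero-shot modeling.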
Related papers
- EventFlow: Forecasting Continuous-Time Event Data with Flow Matching [12.976042923229466]
We propose EventFlow, a non-autoregressive generative model for temporal point processes.
Our model builds on the flow matching framework in order to directly learn joint distributions over event times, side-stepping the autoregressive process.
arXiv Detail & Related papers (2024-10-09T20:57:00Z)
- XTSFormer: Cross-Temporal-Scale Transformer for Irregular Time Event Prediction [9.240950990926796]
Event prediction aims to forecast the time and type of a future event based on a historical event sequence.
Despite its significance, several challenges exist, including the irregularity of time intervals between consecutive events, the existence of cycles, periodicity, and multi-scale event interactions.
arXiv Detail & Related papers (2024-02-03T20:33:39Z)
- Non-Autoregressive Diffusion-based Temporal Point Processes for Continuous-Time Long-Term Event Prediction [8.88485011274486]
We propose a diffusion-based non-autoregressive temporal point process model for long-term event prediction in continuous time.
In order to perform diffusion processes on event sequences, we develop a bidirectional map between target event sequences and the Euclidean vector space.
Experiments demonstrate the superiority of our proposed model over state-of-the-art methods on long-term event prediction in continuous time.
arXiv Detail & Related papers (2023-11-02T06:52:44Z)
- Continuous-time convolutions model of event sequences [46.3471121117337]
Event sequences are non-uniform and sparse, making traditional models unsuitable.
We propose COTIC, a method based on an efficient convolutional neural network designed to handle the non-uniform occurrence of events over time.
COTIC outperforms existing models in predicting the next event time and type, achieving an average rank of 1.5 compared to 3.714 for the nearest competitor.
arXiv Detail & Related papers (2023-02-13T10:34:51Z)
- Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, next-event prediction models are trained with sequential data collected at one time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z)
- Modeling Continuous Time Sequences with Intermittent Observations using Marked Temporal Point Processes [25.074394338483575]
A large fraction of data generated via human activities can be represented as a sequence of events over continuous time.
Building deep learning models over these continuous-time event sequences is a non-trivial task.
In this work, we provide a novel unsupervised model and inference method for learning MTPP in the presence of event sequences with missing events.
arXiv Detail & Related papers (2022-06-23T18:23:20Z)
- CEP3: Community Event Prediction with Neural Point Process on Graph [59.434777403325604]
We propose a novel model combining Graph Neural Networks and Marked Temporal Point Processes (MTPP).
Our experiments demonstrate the superior performance of our model in terms of both model accuracy and training efficiency.
arXiv Detail & Related papers (2022-05-21T15:30:25Z)
- Learning Neural Models for Continuous-Time Sequences [0.0]
We study the properties of continuous-time event sequences (CTES) and design robust yet scalable neural network-based models to overcome the aforementioned problems.
In this work, we model the underlying generative distribution of events using marked temporal point processes (MTPP) to address a wide range of real-world problems.
arXiv Detail & Related papers (2021-11-13T20:39:15Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependencies of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
- A Spatial-Temporal Attentive Network with Spatial Continuity for Trajectory Prediction [74.00750936752418]
We propose a novel model named spatial-temporal attentive network with spatial continuity (STAN-SC).
First, a spatial-temporal attention mechanism is presented to extract the most useful and important information.
Second, we build a joint feature sequence from the sequence and instant state information so that the generated trajectories preserve spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.