HyperHawkes: Hypernetwork based Neural Temporal Point Process
- URL: http://arxiv.org/abs/2210.00213v1
- Date: Sat, 1 Oct 2022 07:14:19 GMT
- Title: HyperHawkes: Hypernetwork based Neural Temporal Point Process
- Authors: Manisha Dubey, P.K. Srijith, Maunendra Sankar Desarkar
- Abstract summary: Temporal point processes serve as an essential tool for modeling time-to-event data in continuous time.
Existing models do not generalize to predicting events from unseen sequences in dynamic environments.
We propose HyperHawkes, a hypernetwork-based temporal point process framework.
- Score: 5.607676459156789
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal point processes serve as an essential tool for modeling
time-to-event data in continuous time. Despite the availability of massive
amounts of event sequence data from domains such as social media and
healthcare, real-world applications of temporal point processes face two major
challenges: 1) they do not generalize to predicting events from unseen
sequences in dynamic environments, and 2) they cannot adapt in a continually
evolving environment with minimal supervision while retaining previously
learnt knowledge. To tackle these issues, we propose \textit{HyperHawkes}, a
hypernetwork-based temporal point process framework capable of modeling the
time of occurrence of events for unseen sequences, thereby addressing
zero-shot learning for time-to-event modeling. We also develop a
hypernetwork-based continual-learning temporal point process for continuous
modeling of time-to-event sequences with minimal forgetting. In this way,
\textit{HyperHawkes} augments the temporal point process with zero-shot
modeling and continual learning capabilities. We demonstrate the proposed
framework through experiments on two real-world datasets. Our results show
the efficacy of the proposed approach in predicting future events for unseen
event sequences under the zero-shot regime. We also show that the proposed
model can predict sequences continually while retaining information from
previous event sequences, hence mitigating catastrophic forgetting for
time-to-event data.
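The abstract's central idea, a hypernetwork that generates the parameters of a sequence-specific intensity function so that unseen sequences can be scored without retraining, can be illustrated with a small sketch. The PyTorch snippet below is a minimal illustration under assumed names and sizes (the class HyperTPPSketch, a GRU history encoder, a one-hidden-layer intensity head, and the 32/16-dimensional embeddings are all assumptions); it is not the authors' architecture or code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HyperTPPSketch(nn.Module):
    """Hypernetwork-conditioned temporal point process (illustrative only).

    A hypernetwork maps a sequence-level descriptor to the weights of a
    small intensity head, so a new (unseen) sequence can be scored by
    supplying its descriptor instead of retraining per-sequence parameters.
    Layer sizes, the GRU history encoder, and all names are assumptions.
    """

    def __init__(self, seq_emb_dim=32, hist_dim=16, hidden_dim=16):
        super().__init__()
        self.hist_dim, self.hidden_dim = hist_dim, hidden_dim
        # Encodes past inter-event times into a fixed-size history state.
        self.encoder = nn.GRU(input_size=1, hidden_size=hist_dim, batch_first=True)
        # Hypernetwork: emits the flattened weights and biases of a
        # one-hidden-layer intensity head, conditioned on the descriptor.
        in_dim = hist_dim + 1                      # history state + elapsed time
        n_params = hidden_dim * in_dim + hidden_dim + hidden_dim + 1
        self.hyper = nn.Sequential(
            nn.Linear(seq_emb_dim, 64), nn.ReLU(), nn.Linear(64, n_params)
        )

    def intensity(self, inter_times, elapsed, seq_emb):
        """Conditional intensity lambda(t | history) for one sequence.

        inter_times: (1, L, 1) past inter-event times
        elapsed:     (1, 1)    time since the most recent event
        seq_emb:     (1, seq_emb_dim) sequence descriptor
        """
        _, h = self.encoder(inter_times)           # (1, 1, hist_dim)
        h = h.squeeze(0)                           # (1, hist_dim)
        p = self.hyper(seq_emb).squeeze(0)         # flat generated parameters

        k, d = self.hidden_dim, self.hist_dim + 1
        w1 = p[: k * d].view(k, d)                 # hidden-layer weights
        b1 = p[k * d : k * d + k]
        w2 = p[k * d + k : k * d + 2 * k].view(1, k)
        b2 = p[-1:]

        x = torch.cat([h, elapsed], dim=-1)        # condition on history + gap
        z = F.relu(F.linear(x, w1, b1))
        return F.softplus(F.linear(z, w2, b2))     # positive intensity


model = HyperTPPSketch()
past = torch.rand(1, 5, 1)               # five past inter-event times
gap = torch.tensor([[0.3]])              # time since the last event
desc = torch.randn(1, 32)                # descriptor of an unseen sequence
print(model.intensity(past, gap, desc))  # lambda(t | history), shape (1, 1)
```

Generating the intensity head's weights from a sequence descriptor, rather than learning per-sequence weights, is what would allow such a model to score new sequences zero-shot; the continual-learning component described in the abstract would additionally constrain how the hypernetwork itself is updated across sequences to limit forgetting.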