Intensity-free Convolutional Temporal Point Process: Incorporating Local
and Global Event Contexts
- URL: http://arxiv.org/abs/2306.14072v1
- Date: Sat, 24 Jun 2023 22:57:40 GMT
- Title: Intensity-free Convolutional Temporal Point Process: Incorporating Local
and Global Event Contexts
- Authors: Wang-Tao Zhou, Zhao Kang, Ling Tian, Yi Su
- Abstract summary: We propose a novel TPP modelling approach that combines local and global contexts by integrating a continuous-time convolutional event encoder with an RNN.
The presented framework is flexible and scalable to handle large datasets with long sequences and complex latent patterns.
To the best of our knowledge, this is the first work that applies convolutional neural networks to TPP modelling.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event prediction in the continuous-time domain is a crucial but rather
difficult task. Temporal point process (TPP) learning models have shown great
advantages in this area. Existing models mainly focus on encoding the global
context of events using techniques such as recurrent neural networks (RNNs) or
self-attention mechanisms. However, local event contexts also play an important
role in the occurrence of events, a factor that has been largely ignored. Popular
convolutional neural networks, which are designed to capture local context,
have never been applied to TPP modelling because they cannot operate
in continuous time. In this work, we propose a novel TPP modelling
approach that combines local and global contexts by integrating a
continuous-time convolutional event encoder with an RNN. The presented
framework is flexible and scalable, handling large datasets with long sequences
and complex latent patterns. Experimental results show that the proposed
model improves both probabilistic sequence modelling performance and
event-prediction accuracy. To the best of our knowledge, this is the first work
that applies convolutional neural networks to TPP modelling.
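The architecture described in the abstract (a continuous-time convolutional encoder for local context whose outputs feed an RNN for global context) can be sketched in outline. The exponential-basis kernel parameterization, the window width, and all weights below are illustrative assumptions for the sketch, not the authors' actual design:

```python
import numpy as np

rng = np.random.default_rng(0)

D, H = 4, 8          # event-embedding and hidden sizes
WINDOW = 2.0         # local receptive field, in continuous time
N_BASIS = 3          # number of kernel basis functions

# Hypothetical parameters (random stand-ins for learned weights).
decay = np.array([0.5, 1.0, 2.0])            # basis decay rates
W_basis = rng.normal(size=(N_BASIS, D, D))   # per-basis mixing matrices
W_in = rng.normal(size=(H, D)) * 0.1         # RNN input weights
W_rec = rng.normal(size=(H, H)) * 0.1        # RNN recurrent weights

def ct_kernel(dt):
    """Continuous-time convolution kernel: a weighted sum of decaying
    exponential bases, evaluable at any real-valued lag dt >= 0."""
    basis = np.exp(-decay * dt)                    # (N_BASIS,)
    return np.tensordot(basis, W_basis, axes=1)    # (D, D)

def encode(times, embs):
    """Local context via causal continuous-time convolution,
    then global context via a plain Elman RNN."""
    n = len(times)
    local = np.zeros((n, D))
    for i in range(n):
        for j in range(i + 1):                     # causal: only past events
            dt = times[i] - times[j]
            if dt <= WINDOW:                       # inside the local window
                local[i] += ct_kernel(dt) @ embs[j]
    h = np.zeros(H)
    hidden = []
    for i in range(n):                             # RNN over conv features
        h = np.tanh(W_in @ local[i] + W_rec @ h)
        hidden.append(h)
    return np.stack(hidden)                        # (n, H)

times = np.array([0.0, 0.4, 0.9, 2.6, 3.0])        # irregular event times
embs = rng.normal(size=(len(times), D))            # event-type embeddings
H_out = encode(times, embs)
print(H_out.shape)  # (5, 8)
```

Because the kernel is an explicit function of the lag, it can be evaluated at arbitrary real offsets, which is how a convolution avoids the fixed-grid limitation the abstract attributes to standard CNNs.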
Related papers
- On Non-asymptotic Theory of Recurrent Neural Networks in Temporal Point Processes
A temporal point process (TPP) is an important tool for modeling and predicting irregularly timed events across various domains.
Recent recurrent neural network (RNN)-based TPPs have shown practical advantages over traditional parametric TPP models.
In this paper, we establish excess risk bounds for RNN-TPPs under many well-known TPP settings.
arXiv Detail & Related papers (2024-06-02T06:19:25Z)
- Cumulative Distribution Function based General Temporal Point Processes
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Spatio-Temporal Graph Neural Point Process for Traffic Congestion Event Prediction
We propose a spatio-temporal graph neural point process framework, named STNPP, for traffic congestion event prediction.
Our method achieves superior performance compared with existing state-of-the-art approaches.
arXiv Detail & Related papers (2023-11-15T01:22:47Z)
- Enhancing Asynchronous Time Series Forecasting with Contrastive Relational Inference
Temporal point processes (TPPs) are the standard method for modeling such asynchronous event sequences.
Existing TPP models have focused on the conditional distribution of future events instead of explicitly modeling event interactions, which poses challenges for event prediction.
We propose a novel approach that leverages Neural Relational Inference (NRI) to learn a graph that infers interactions while simultaneously learning dynamic patterns from observational data.
arXiv Detail & Related papers (2023-09-06T09:47:03Z)
- How neural networks learn to classify chaotic time series
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Exploring Generative Neural Temporal Point Process
Generative models such as denoising diffusion and score matching models have achieved great progress in image generation tasks.
We try to fill the gap by designing a unified generative framework for neural temporal point processes.
arXiv Detail & Related papers (2022-08-03T06:56:28Z)
- CEP3: Community Event Prediction with Neural Point Process on Graph
We propose a novel model combining Graph Neural Networks and Marked Temporal Point Processes (MTPPs).
Our experiments demonstrate the superior performance of our model in terms of both accuracy and training efficiency.
arXiv Detail & Related papers (2022-05-21T15:30:25Z) - Mitigating Performance Saturation in Neural Marked Point Processes:
Architectures and Loss Functions [50.674773358075015]
We propose a simple graph-based network structure called GCHP, which utilizes only graph convolutional layers.
We show that GCHP can significantly reduce training time, and that a likelihood-ratio loss with interarrival-time probability assumptions can greatly improve model performance.
arXiv Detail & Related papers (2021-07-07T16:59:14Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting
We propose the Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions via Monte Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence
Event datasets are sequences of events of various types occurring irregularly along the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
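Several of the entries above (including the intensity-function estimation in the last one) take the classical intensity-based view that the main paper's "intensity-free" approach contrasts with. As background, a conditional intensity can be sampled exactly with Ogata's thinning algorithm; the Hawkes parameterization below is a standard textbook choice, not taken from any of the listed papers:

```python
import numpy as np

def sample_hawkes_thinning(mu, alpha, beta, t_end, seed=0):
    """Sample event times from a univariate Hawkes process with intensity
    lam(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    using Ogata's thinning algorithm."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < t_end:
        # The current intensity upper-bounds all future intensities until
        # the next accepted event, since each excitation term only decays.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)        # candidate from the bound
        if t >= t_end:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:       # accept w.p. lam/lam_bar
            events.append(t)                       # accepted event excites lam
    return np.array(events)

ts = sample_hawkes_thinning(mu=0.5, alpha=0.8, beta=1.0, t_end=50.0)
print(len(ts), ts[:3])
```

Intensity-free models such as the one in the main paper sidestep this machinery by parameterizing the inter-event time distribution directly, so sampling and likelihood evaluation need no thinning or numerical integration.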
This list is automatically generated from the titles and abstracts of the papers on this site.