Neural Temporal Point Processes: A Review
- URL: http://arxiv.org/abs/2104.03528v1
- Date: Thu, 8 Apr 2021 06:10:50 GMT
- Title: Neural Temporal Point Processes: A Review
- Authors: Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Stephan
Günnemann
- Abstract summary: Temporal point processes (TPP) are probabilistic generative models for continuous-time event sequences.
Neural TPPs combine fundamental ideas from the point process literature with deep learning approaches.
- Score: 25.969319777457606
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal point processes (TPP) are probabilistic generative models for
continuous-time event sequences. Neural TPPs combine the fundamental ideas from
the point process literature with deep learning approaches, thus enabling the
construction of flexible and efficient models. The topic of neural TPPs has
attracted significant attention in recent years, leading to the development
of numerous new architectures and applications for this class of models. In
this review paper, we aim to consolidate the existing body of knowledge on
neural TPPs. Specifically, we focus on important design choices and general
principles for defining neural TPP models. Next, we provide an overview of
application areas commonly considered in the literature. We conclude this
survey with a list of open challenges and important directions for future
work in the field of neural TPPs.
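To make the abstract's framing concrete, below is a minimal sketch (in PyTorch, not taken from the paper) of one common neural TPP design choice the review surveys: a recurrent encoder summarizes the event history, and a simple parametric distribution over the next inter-event time is conditioned on that summary. All class and variable names are illustrative, and the likelihood is a simplified autoregressive form.

```python
# Minimal neural TPP sketch: RNN history encoder + conditional exponential
# distribution over the next inter-event time. Illustrative only; for
# simplicity it omits the survival term for the interval after the last event.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralTPP(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        # Encode the sequence of inter-event times into a history embedding.
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        # Map the history embedding to the rate of an exponential distribution
        # over the next inter-event time (one simple design choice).
        self.to_rate = nn.Linear(hidden_size, 1)

    def log_likelihood(self, inter_times: torch.Tensor) -> torch.Tensor:
        # inter_times: (batch, seq_len) positive inter-event times.
        x = inter_times.unsqueeze(-1)                      # (batch, seq_len, 1)
        h, _ = self.rnn(x)                                 # (batch, seq_len, hidden)
        # Shift so the i-th prediction only conditions on events before event i.
        h = torch.cat([torch.zeros_like(h[:, :1]), h[:, :-1]], dim=1)
        rate = F.softplus(self.to_rate(h)).squeeze(-1) + 1e-8
        # Exponential log-density: log p(tau) = log(rate) - rate * tau.
        log_p = torch.log(rate) - rate * inter_times
        return log_p.sum(dim=-1)                           # (batch,)

# Usage: fit by maximizing the log-likelihood of observed inter-event times.
model = NeuralTPP()
taus = torch.rand(4, 10) + 0.1                             # dummy event data
loss = -model.log_likelihood(taus).mean()
loss.backward()
```

Swapping the exponential head for a mixture or flow-based density, or the GRU for a transformer encoder, yields other members of the model family the review categorizes.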
Related papers
- TPP-Gaze: Modelling Gaze Dynamics in Space and Time with Neural Temporal Point Processes [63.95928298690001]
We present TPP-Gaze, a novel and principled approach to modeling scanpath dynamics based on Neural Temporal Point Processes (TPP).
Our results show the overall superior performance of the proposed model compared to state-of-the-art approaches.
arXiv Detail & Related papers (2024-10-30T19:22:38Z) - Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - On Non-asymptotic Theory of Recurrent Neural Networks in Temporal Point Processes [10.4442505961159]
The temporal point process (TPP) is an important tool for modeling and predicting irregularly timed events across various domains.
Recent recurrent neural network (RNN)-based TPPs have shown practical advantages over traditional parametric TPP models.
In this paper, we establish the excess risk bounds of RNN-TPPs under many well-known TPP settings.
arXiv Detail & Related papers (2024-06-02T06:19:25Z) - Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
arXiv Detail & Related papers (2024-02-01T07:21:30Z) - On the Predictive Accuracy of Neural Temporal Point Process Models for
Continuous-time Event Data [3.13468877208035]
Temporal Point Processes (TPPs) serve as the standard mathematical framework for modeling asynchronous event sequences in continuous time.
Researchers have proposed Neural TPPs, which leverage neural network parametrizations to offer more flexible and efficient modeling.
This study systematically evaluates the predictive accuracy of state-of-the-art neural TPP models.
arXiv Detail & Related papers (2023-06-29T16:14:43Z) - Deep networks for system identification: a Survey [56.34005280792013]
System identification learns mathematical descriptions of dynamic systems from input-output data.
The main aim of the identified model is to predict new data from previous observations.
We discuss architectures commonly adopted in the literature, like feedforward, convolutional, and recurrent networks.
arXiv Detail & Related papers (2023-01-30T12:38:31Z) - Meta Temporal Point Processes [13.525125302111844]
A temporal point process (TPP) is a process whose realization is a sequence of discrete events in time.
Recent work on TPPs models the process using a neural network in a supervised learning framework.
We propose to train TPPs in a meta learning framework, where each sequence is treated as a different task.
arXiv Detail & Related papers (2023-01-27T23:21:07Z) - Modeling Time-Series and Spatial Data for Recommendations and Other
Applications [1.713291434132985]
We address the problems that may arise due to the poor quality of continuous-time event sequence (CTES) data being fed into a recommender system.
To improve the quality of the CTES data, we address a fundamental problem of overcoming missing events in temporal sequences.
We extend their abilities to design solutions for large-scale CTES retrieval and human activity prediction.
arXiv Detail & Related papers (2022-12-25T09:34:15Z) - DriPP: Driven Point Processes to Model Stimuli Induced Patterns in M/EEG
Signals [62.997667081978825]
We develop a novel statistical point process model, called driven temporal point processes (DriPP).
We derive a fast and principled expectation-maximization (EM) algorithm to estimate the parameters of this model.
Results on standard MEG datasets demonstrate that our methodology reveals event-related neural responses.
arXiv Detail & Related papers (2021-12-08T13:07:21Z) - Rethinking Generalization of Neural Models: A Named Entity Recognition
Case Study [81.11161697133095]
We take the NER task as a testbed to analyze the generalization behavior of existing models from different perspectives.
Experiments with in-depth analyses diagnose the bottleneck of existing neural NER models.
As a by-product of this paper, we have open-sourced a project that involves a comprehensive summary of recent NER papers.
arXiv Detail & Related papers (2020-01-12T04:33:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.