Advances in Temporal Point Processes: Bayesian, Deep, and LLM Approaches
- URL: http://arxiv.org/abs/2501.14291v1
- Date: Fri, 24 Jan 2025 07:13:26 GMT
- Title: Advances in Temporal Point Processes: Bayesian, Deep, and LLM Approaches
- Authors: Feng Zhou, Quyu Kong, Yixuan Zhang,
- Abstract summary: Temporal point processes (TPPs) are stochastic process models used to characterize event sequences occurring in continuous time.
In recent years, advances in deep learning have spurred the development of neural TPPs, enabling greater flexibility and expressiveness.
The emergence of large language models (LLMs) has further sparked excitement, offering new possibilities for modeling and analyzing event sequences.
- Score: 22.746572585891233
- License:
- Abstract: Temporal point processes (TPPs) are stochastic process models used to characterize event sequences occurring in continuous time. Traditional statistical TPPs have a long-standing history, with numerous models proposed and successfully applied across diverse domains. In recent years, advances in deep learning have spurred the development of neural TPPs, enabling greater flexibility and expressiveness in capturing complex temporal dynamics. The emergence of large language models (LLMs) has further sparked excitement, offering new possibilities for modeling and analyzing event sequences by leveraging their rich contextual understanding. This survey presents a comprehensive review of recent research on TPPs from three perspectives: Bayesian, deep learning, and LLM approaches. We begin with a review of the fundamental concepts of TPPs, followed by an in-depth discussion of model design and parameter estimation techniques in these three frameworks. We also revisit classic application areas of TPPs to highlight their practical relevance. Finally, we outline challenges and promising directions for future research.
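To make the fundamental concepts reviewed in the survey concrete, below is a minimal, self-contained Python sketch of a classical self-exciting TPP, the Hawkes process, simulated with Ogata's thinning algorithm. It is not taken from the paper; the parameter values (mu, alpha, beta) and function names are illustrative assumptions.

```python
import math
import random

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.0):
    """Conditional intensity of a Hawkes process with an exponential kernel:
    lambda(t) = mu + sum_{t_i < t} alpha * beta * exp(-beta * (t - t_i))."""
    return mu + sum(alpha * beta * math.exp(-beta * (t - ti))
                    for ti in events if ti < t)

def simulate_hawkes(horizon, mu=0.5, alpha=0.8, beta=1.0, seed=0):
    """Draw event times on [0, horizon] with Ogata's thinning algorithm."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # Between events the intensity only decays, so its value just after
        # the current time (the epsilon shift includes the most recent event)
        # upper-bounds it until the next candidate.
        lam_bar = hawkes_intensity(t + 1e-12, events, mu, alpha, beta)
        t += rng.expovariate(lam_bar)          # candidate inter-event gap
        if t >= horizon:
            return events
        # accept the candidate with probability lambda(t) / lam_bar
        if rng.random() * lam_bar <= hawkes_intensity(t, events, mu, alpha, beta):
            events.append(t)

if __name__ == "__main__":
    seq = simulate_hawkes(horizon=50.0)
    print(f"simulated {len(seq)} events; first five: {[round(s, 2) for s in seq[:5]]}")
```

Keeping alpha below 1 keeps the process stable; the same thinning scheme applies to any TPP whose intensity can be upper-bounded between events.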
Related papers
- Neural Spatiotemporal Point Processes: Trends and Challenges [4.770461921490678]
Spatiotemporal point processes (STPPs) are probabilistic models for events occurring in continuous space and time.
In this review, we categorize existing approaches, unify key choices, and explain the challenges of working with this data modality.
arXiv Detail & Related papers (2025-02-13T14:01:15Z)
- TPP-Gaze: Modelling Gaze Dynamics in Space and Time with Neural Temporal Point Processes [63.95928298690001]
We present TPP-Gaze, a novel and principled approach to modelling scanpath dynamics based on neural temporal point processes (TPPs).
Our results show the overall superior performance of the proposed model compared to state-of-the-art approaches.
arXiv Detail & Related papers (2024-10-30T19:22:38Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Continual Learning with Pre-Trained Models: A Survey [61.97613090666247]
Continual learning (CL) aims to overcome the catastrophic forgetting of former knowledge when acquiring new knowledge.
This paper presents a comprehensive survey of the latest advancements in pre-trained model (PTM)-based CL.
arXiv Detail & Related papers (2024-01-29T18:27:52Z)
- On the Predictive Accuracy of Neural Temporal Point Process Models for Continuous-time Event Data [3.13468877208035]
Temporal Point Processes (TPPs) serve as the standard mathematical framework for modeling asynchronous event sequences in continuous time.
Researchers have proposed Neural TPPs, which leverage neural network parametrizations to offer more flexible and efficient modeling.
This study systematically evaluates the predictive accuracy of state-of-the-art neural TPP models.
arXiv Detail & Related papers (2023-06-29T16:14:43Z)
- A Survey on Deep Learning based Time Series Analysis with Frequency Transformation [74.3919960186696]
Frequency transformation (FT) has been increasingly incorporated into deep learning models to enhance state-of-the-art accuracy and efficiency in time series analysis.
Despite the growing attention and the proliferation of research in this emerging field, there is currently a lack of a systematic review and in-depth analysis of deep learning-based time series models with FT.
We present a comprehensive review that systematically investigates and summarizes the recent research advancements in deep learning-based time series analysis with FT.
arXiv Detail & Related papers (2023-02-04T14:33:07Z)
- Modeling Time-Series and Spatial Data for Recommendations and Other Applications [1.713291434132985]
We address the problems that may arise from poor-quality continuous-time event sequence (CTES) data being fed into a recommender system.
To improve the quality of the CTES data, we address the fundamental problem of overcoming missing events in temporal sequences.
We then extend these techniques to design solutions for large-scale CTES retrieval and human activity prediction.
arXiv Detail & Related papers (2022-12-25T09:34:15Z)
- Pre-Trained Models: Past, Present and Future [126.21572378910746]
Large-scale pre-trained models (PTMs) have recently achieved great success and become a milestone in the field of artificial intelligence (AI).
By storing knowledge in huge numbers of parameters and fine-tuning on specific tasks, PTMs let the rich knowledge implicitly encoded in those parameters benefit a variety of downstream tasks.
It is now the consensus of the AI community to adopt PTMs as the backbone for downstream tasks rather than learning models from scratch.
arXiv Detail & Related papers (2021-06-14T02:40:32Z)
- Neural Temporal Point Processes: A Review [25.969319777457606]
Temporal point processes (TPPs) are probabilistic generative models for continuous-time event sequences.
Neural TPPs combine the fundamental ideas from the point process literature with deep learning approaches.
arXiv Detail & Related papers (2021-04-08T06:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
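The "Neural Temporal Point Processes: A Review" entry above notes that neural TPPs combine point-process fundamentals with deep learning. The sketch below is a hedged illustration of that combination only, assuming PyTorch and an RMTPP-style exponential intensity whose compensator (the integral of the intensity) has a closed form; the class name TinyNeuralTPP, the hyperparameters, and the toy data are hypothetical and not drawn from any paper listed here.

```python
import torch
import torch.nn as nn

class TinyNeuralTPP(nn.Module):
    """Minimal RMTPP-style neural TPP (illustrative only).
    A GRU summarizes past inter-event gaps into h_i; after event i the
    conditional intensity is lambda(t) = exp(v.h_i + w*(t - t_i) + b)."""

    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.v = nn.Linear(hidden, 1)
        self.w = nn.Parameter(torch.tensor(-0.1))  # slope in elapsed time (kept nonzero)
        self.b = nn.Parameter(torch.tensor(0.0))

    def neg_log_likelihood(self, gaps):
        """gaps: (batch, seq_len) tensor of positive inter-event times."""
        h, _ = self.rnn(gaps.unsqueeze(-1))       # (B, L, H); h_i encodes gaps up to i
        h = h[:, :-1, :]                          # history available before gap i+1
        tau = gaps[:, 1:]                         # the gap being scored, (B, L-1)
        c = self.v(h).squeeze(-1) + self.b        # log-intensity at the last event
        log_lam = c + self.w * tau                # log intensity at the next event time
        # Compensator: integral of exp(c + w*s) over s in [0, tau], in closed form.
        compensator = (torch.exp(log_lam) - torch.exp(c)) / self.w
        return -(log_lam - compensator).sum(dim=-1).mean()

# toy usage: one gradient evaluation on random positive gaps
model = TinyNeuralTPP()
gaps = torch.rand(4, 20) + 0.05
loss = model.neg_log_likelihood(gaps)
loss.backward()
print(float(loss))
```

Maximizing this exact likelihood (minimizing the returned loss) is the standard parameter-estimation route for neural TPPs; richer models swap in different history encoders or intensity parametrizations.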