Cumulative Distribution Function based General Temporal Point Processes
- URL: http://arxiv.org/abs/2402.00388v1
- Date: Thu, 1 Feb 2024 07:21:30 GMT
- Title: Cumulative Distribution Function based General Temporal Point Processes
- Authors: Maolin Wang, Yu Pan, Zenglin Xu, Ruocheng Guo, Xiangyu Zhao, Wanyu
Wang, Yiqi Wang, Zitao Liu, Langming Liu
- Abstract summary: The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
- Score: 49.758080415846884
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal Point Processes (TPPs) hold a pivotal role in modeling event
sequences across diverse domains, including social networking and e-commerce,
and have significantly contributed to the advancement of recommendation systems
and information retrieval strategies. Through the analysis of events such as
user interactions and transactions, TPPs offer valuable insights into
behavioral patterns, facilitating the prediction of future trends. However,
accurately forecasting future events remains a formidable challenge due to the
intricate nature of these patterns. The integration of Neural Networks with
TPPs has ushered in the development of advanced deep TPP models. While these
models excel at processing complex and nonlinear temporal data, they encounter
limitations in modeling intensity functions, grapple with computational
complexities in integral computations, and struggle to capture long-range
temporal dependencies effectively. In this study, we introduce the CuFun model,
representing a novel approach to TPPs that revolves around the Cumulative
Distribution Function (CDF). CuFun stands out by uniquely employing a monotonic
neural network for CDF representation, utilizing past events as a scaling
factor. This innovation significantly bolsters the model's adaptability and
precision across a wide range of data scenarios. Our approach addresses several
critical issues inherent in traditional TPP modeling: it simplifies
log-likelihood calculations, extends applicability beyond predefined density
function forms, and adeptly captures long-range temporal patterns. Our
contributions encompass the introduction of a pioneering CDF-based TPP model,
the development of a methodology for incorporating past event information into
future event prediction, and empirical validation of CuFun's effectiveness
through extensive experimentation on synthetic and real-world datasets.
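The simplification claimed above follows from standard TPP identities: if F*(τ) denotes the conditional CDF of the next inter-event time given the history and f*(τ) = dF*/dτ its density, the log-likelihood contribution of an observed gap is simply log f*(τ), whereas an intensity-based model must evaluate log λ*(τ) minus the integral of λ* over [0, τ], which typically requires numerical approximation. Below is a minimal, hypothetical PyTorch sketch of this idea, not the authors' CuFun implementation: the names MonotonicCDF, hist_proj, and neg_log_likelihood are invented for illustration, and CuFun-specific details such as the past-event scaling factor and the boundary normalization of the CDF are omitted.

```python
# Hypothetical sketch (not the authors' CuFun code): a monotonic network
# parameterizes the conditional CDF F(tau | history); the density needed for
# the log-likelihood is obtained by differentiating F with respect to tau.
import torch
import torch.nn as nn
import torch.nn.functional as Fn


class MonotonicCDF(nn.Module):
    """Maps an inter-event time tau and a history embedding h to F(tau | h) in (0, 1).

    Monotonicity in tau is enforced by constraining the weights on the tau path
    to be non-negative (softplus reparameterization), so dF/dtau >= 0.
    Normalization details (F(0) = 0, F(inf) = 1) are omitted in this sketch.
    """

    def __init__(self, hist_dim: int, hidden: int = 64):
        super().__init__()
        self.w1 = nn.Parameter(torch.randn(hidden, 1) * 0.1)  # tau path, constrained >= 0
        self.w2 = nn.Parameter(torch.randn(1, hidden) * 0.1)  # output path, constrained >= 0
        self.b1 = nn.Parameter(torch.zeros(hidden))
        self.b2 = nn.Parameter(torch.zeros(1))
        # History enters additively, shifting the CDF without breaking monotonicity in tau.
        self.hist_proj = nn.Linear(hist_dim, hidden)

    def forward(self, tau: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # tau: (batch, 1) inter-event times, h: (batch, hist_dim) history embeddings
        z = torch.tanh(tau @ Fn.softplus(self.w1).t() + self.hist_proj(h) + self.b1)
        return torch.sigmoid(z @ Fn.softplus(self.w2).t() + self.b2)


def neg_log_likelihood(cdf: MonotonicCDF, tau: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
    """-sum log f(tau | h), where the density f = dF/dtau is computed by autograd.

    No intensity integral is needed; if desired, the intensity is recoverable
    as lambda(tau) = f(tau) / (1 - F(tau)).
    """
    tau = tau.detach().clone().requires_grad_(True)
    F = cdf(tau, h)
    f = torch.autograd.grad(F.sum(), tau, create_graph=True)[0]
    return -f.clamp_min(1e-8).log().sum()


# Usage with placeholder inputs; any sequence encoder could supply the history embeddings.
model = MonotonicCDF(hist_dim=32)
taus = torch.rand(8, 1)      # observed inter-event times
hist = torch.randn(8, 32)    # stand-in history embeddings
loss = neg_log_likelihood(model, taus, hist)
loss.backward()
```

Any sequence encoder (an RNN or Transformer over past events) could supply the history embedding h; because the network is only constrained to be monotone in τ rather than fixed to a parametric density family, the resulting model retains the flexibility beyond predefined density function forms that the abstract emphasizes.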
Related papers
- MITA: Bridging the Gap between Model and Data for Test-time Adaptation [68.62509948690698]
Test-Time Adaptation (TTA) has emerged as a promising paradigm for enhancing the generalizability of models.
We propose MITA (Meet-In-The-Middle based Test-Time Adaptation), which introduces energy-based optimization to encourage mutual adaptation of the model and data from opposing directions.
arXiv Detail & Related papers (2024-10-12T07:02:33Z) - TPP-LLM: Modeling Temporal Point Processes by Efficiently Fine-Tuning Large Language Models [0.0]
Temporal point processes (TPPs) are widely used to model the timing and occurrence of events in domains such as social networks, transportation systems, and e-commerce.
We introduce TPP-LLM, a novel framework that integrates large language models (LLMs) with TPPs to capture both the semantic and temporal aspects of event sequences.
arXiv Detail & Related papers (2024-10-02T22:17:24Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - Enhancing Asynchronous Time Series Forecasting with Contrastive
Relational Inference [21.51753838306655]
Temporal point processes (TPPs) are the standard method for modeling such event sequences.
Existing TPP models have focused on the conditional distribution of future events rather than explicitly modeling event interactions, which poses challenges for event prediction.
We propose a novel approach that leverages Neural Relational Inference (NRI) to learn a graph that infers interactions while simultaneously learning dynamics patterns from observational data.
arXiv Detail & Related papers (2023-09-06T09:47:03Z) - On the Predictive Accuracy of Neural Temporal Point Process Models for
Continuous-time Event Data [3.13468877208035]
Temporal Point Processes (TPPs) serve as the standard mathematical framework for modeling asynchronous event sequences in continuous time.
Researchers have proposed Neural TPPs, which leverage neural network parametrizations to offer more flexible and efficient modeling.
This study systematically evaluates the predictive accuracy of state-of-the-art neural TPP models.
arXiv Detail & Related papers (2023-06-29T16:14:43Z) - Intensity-free Convolutional Temporal Point Process: Incorporating Local
and Global Event Contexts [30.534921874640585]
We propose a novel TPP modelling approach that combines local and global contexts by integrating a continuous-time convolutional event encoder with an RNN.
The presented framework is flexible and scalable to handle large datasets with long sequences and complex latent patterns.
To the best of our knowledge, this is the first work that applies convolutional neural networks to TPP modelling.
arXiv Detail & Related papers (2023-06-24T22:57:40Z) - Exploring Generative Neural Temporal Point Process [37.1875644118684]
Generative models such as denoising diffusion and score matching models have achieved great progress in image generation tasks.
We try to fill the gap by designing a unified generative framework for neural temporal point processes.
arXiv Detail & Related papers (2022-08-03T06:56:28Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster than their differential equation-based counterparts.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)