A Nonparametric Discrete Hawkes Model with a Collapsed Gaussian-Process Prior
- URL: http://arxiv.org/abs/2509.21996v1
- Date: Fri, 26 Sep 2025 07:23:57 GMT
- Title: A Nonparametric Discrete Hawkes Model with a Collapsed Gaussian-Process Prior
- Authors: Trinnhallen Brisley, Gordon Ross, Daniel Paulin
- Abstract summary: We propose a nonparametric framework that places Gaussian process priors on both the baseline and the excitation. This yields smooth, data-adaptive structure without prespecifying trends, periodicities, or decay shapes. In simulations, GP-DHP recovers diverse excitation shapes and evolving baselines.
- Score: 0.5352699766206809
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hawkes process models are used in settings where past events increase the likelihood of future events occurring. Many applications record events as counts on a regular grid, yet discrete-time Hawkes models remain comparatively underused and are often constrained by fixed-form baselines and excitation kernels. In particular, there is a lack of flexible, nonparametric treatments of both the baseline and the excitation in discrete time. To this end, we propose the Gaussian Process Discrete Hawkes Process (GP-DHP), a nonparametric framework that places Gaussian process priors on both the baseline and the excitation and performs inference through a collapsed latent representation. This yields smooth, data-adaptive structure without prespecifying trends, periodicities, or decay shapes, and enables maximum a posteriori (MAP) estimation with near-linear-time \(O(T\log T)\) complexity. A closed-form projection recovers interpretable baseline and excitation functions from the optimized latent trajectory. In simulations, GP-DHP recovers diverse excitation shapes and evolving baselines. In case studies on U.S. terrorism incidents and weekly Cryptosporidiosis counts, it improves test predictive log-likelihood over standard parametric discrete Hawkes baselines while capturing bursts, delays, and seasonal background variation. The results indicate that flexible discrete-time self-excitation can be achieved without sacrificing scalability or interpretability.
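The abstract describes a discrete-time Hawkes model in which the baseline and the excitation kernel are learned nonparametrically. The sketch below shows only the basic discrete-time Hawkes likelihood that such a model builds on, not the paper's collapsed Gaussian-process inference; the function names, the fixed baseline `mu`, and the decaying kernel `phi` are illustrative assumptions standing in for the GP-distributed quantities.

```python
import numpy as np

def discrete_hawkes_intensity(mu, phi, counts):
    """Intensity lambda_t = mu_t + sum_{s < t} phi[t - s - 1] * counts[s].

    mu     : (T,) baseline per time bin (in GP-DHP this would be GP-distributed)
    phi    : (K,) excitation weights on lags 1..K (likewise GP-distributed)
    counts : (T,) observed event counts per bin
    """
    T, K = len(counts), len(phi)
    lam = np.asarray(mu, dtype=float).copy()
    for t in range(T):
        lags = counts[max(0, t - K):t][::-1]       # counts at lags 1..min(t, K)
        lam[t] += phi[:len(lags)] @ lags
    return lam

def poisson_loglik(counts, lam):
    """Poisson log-likelihood, dropping the constant -log(y_t!) term."""
    return float(np.sum(counts * np.log(lam) - lam))

# Forward simulation under illustrative (non-GP) choices of mu and phi.
rng = np.random.default_rng(0)
T = 200
mu = np.full(T, 0.5)                               # constant baseline (assumption)
phi = 0.4 * np.exp(-0.5 * np.arange(1, 11))        # decaying kernel (assumption)
counts = np.zeros(T, dtype=int)
for t in range(T):
    lags = counts[max(0, t - len(phi)):t][::-1]
    counts[t] = rng.poisson(mu[t] + phi[:len(lags)] @ lags)

lam = discrete_hawkes_intensity(mu, phi, counts)
```

The naive double loop here costs O(TK); the paper's reported \(O(T\log T)\) complexity would instead come from structured (e.g. FFT-based) handling of the convolution and the collapsed GP representation.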
Related papers
- Exact Recovery of Non-Random Missing Multidimensional Time Series via Temporal Isometric Delay-Embedding Transform [6.015902220215394]
Non-random missing data is a ubiquitous yet undertreated flaw in multidimensional time series. We propose a temporal isometric delay-embedding transform, which constructs a Hankel tensor whose low-rankness is naturally induced by the smoothness and periodicity of the underlying time series. Our proposed model achieves exact recovery, as confirmed by simulation experiments under various non-random missing patterns.
arXiv Detail & Related papers (2025-12-11T01:04:27Z) - Simulation-based inference via telescoping ratio estimation for trawl processes [0.0]
We propose a fast, accurate, sample-efficient, simulation-based inference framework for intractable processes. We use Chebyshev approximations to efficiently generate independent posterior samples, enabling accurate inference even when Markov chain Monte Carlo methods mix poorly. We demonstrate the method's effectiveness on trawl processes, a class of flexible infinitely divisible models, applied to energy demand data.
arXiv Detail & Related papers (2025-10-05T05:26:46Z) - A Simple Approximate Bayesian Inference Neural Surrogate for Stochastic Petri Net Models [0.0]
We introduce a neural-network-based framework that approximates the posterior distribution. Our model employs a lightweight 1D Convolutional Residual Network trained end-to-end on Gillespie-simulated SPN realizations. On synthetic SPNs with 20% missing events, our surrogate recovers rate-function coefficients with an RMSE of 0.108 and runs substantially faster than traditional Bayesian approaches.
arXiv Detail & Related papers (2025-07-14T18:31:19Z) - Point processes with event time uncertainty [16.64005584511643]
We introduce a framework to model time-uncertain point processes, possibly on a network.
We experimentally show that the proposed approach outperforms previous General Linear model (GLM) baselines on simulated and real data.
arXiv Detail & Related papers (2024-11-05T00:46:09Z) - ProGen: Revisiting Probabilistic Spatial-Temporal Time Series Forecasting from a Continuous Generative Perspective Using Stochastic Differential Equations [18.64802090861607]
ProGen Pro provides a robust solution that effectively captures dependencies while managing uncertainty.
Our experiments on four benchmark traffic datasets demonstrate that ProGen Pro outperforms state-of-the-art deterministic and probabilistic models.
arXiv Detail & Related papers (2024-11-02T14:37:30Z) - A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z) - Spatiotemporal Besov Priors for Bayesian Inverse Problems [10.521038958248846]
Many inverse problems in data science require solutions derived from a sequence of computerized time-dependent objects.
The Besov process (BP), defined by wavelet expansions with random coefficients, has emerged as a more suitable prior for such problems.
arXiv Detail & Related papers (2023-06-28T17:14:49Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields a more accurate estimation of pattern latency than the state of the art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Interval-censored Hawkes processes [82.87738318505582]
We propose a model to estimate the parameters of a Hawkes process in interval-censored settings.
We show how a non-homogeneous approximation to the Hawkes process admits a tractable likelihood in the interval-censored setting.
arXiv Detail & Related papers (2021-04-16T07:29:04Z) - Probabilistic Numeric Convolutional Neural Networks [80.42120128330411]
Continuous input signals like images and time series that are irregularly sampled or have missing values are challenging for existing deep learning methods.
We propose Probabilistic Numeric Convolutional Neural Networks, which represent features as Gaussian processes (GPs).
We then define a convolutional layer as the evolution of a PDE defined on this GP, followed by a nonlinearity.
In experiments we show that our approach yields a $3\times$ reduction of error from the previous state of the art on the SuperPixel-MNIST dataset and competitive performance on the PhysioNet 2012 medical time series dataset.
arXiv Detail & Related papers (2020-10-21T10:08:21Z) - Modeling of Spatio-Temporal Hawkes Processes with Randomized Kernels [15.556686221927501]
Inferring the dynamics of event processes has many practical applications, including crime prediction and traffic forecasting.
We focus on spatio-temporal Hawkes processes, which are commonly used due to their capability to capture excitations between event occurrences.
We replace the spatial kernel calculations with randomized transformations and use gradient descent to learn the process.
arXiv Detail & Related papers (2020-03-07T22:21:06Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.