FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels
- URL: http://arxiv.org/abs/2210.04635v3
- Date: Wed, 2 Aug 2023 11:42:11 GMT
- Title: FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels
- Authors: Guillaume Staerman, Cédric Allain, Alexandre Gramfort and Thomas
Moreau
- Abstract summary: This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields a better estimation of pattern latency than the state-of-the-art.
- Score: 82.53569355337586
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal point processes (TPP) are a natural tool for modeling event-based
data. Among all TPP models, Hawkes processes have proven to be the most widely
used, mainly due to their adequate modeling for various applications,
particularly when considering exponential or non-parametric kernels. Although
non-parametric kernels are an option, such models require large datasets. While
exponential kernels are more data efficient and relevant for specific
applications where events immediately trigger more events, they are ill-suited
for applications where latencies need to be estimated, such as in neuroscience.
This work aims to offer an efficient solution to TPP inference using general
parametric kernels with finite support. The developed solution consists of a
fast $\ell_2$ gradient-based solver leveraging a discretized version of the
events. After theoretically supporting the use of discretization, the
statistical and computational efficiency of the novel approach is demonstrated
through various numerical experiments. Finally, the method's effectiveness is
evaluated by modeling the occurrence of stimuli-induced patterns from brain
signals recorded with magnetoencephalography (MEG). Given the use of general
parametric kernels, results show that the proposed approach yields a better
estimation of pattern latency than the state-of-the-art.
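The core idea described in the abstract, binning event timestamps onto a regular grid and minimizing a discretized least-squares objective over a finite-support parametric kernel, can be sketched as follows. This is an illustrative toy, not the authors' FaDIn solver: the truncated-Gaussian kernel choice and all parameter names are assumptions made for the example, and FaDIn's precomputation tricks are omitted.

```python
import numpy as np

def truncated_gaussian_kernel(grid, m, sigma):
    """Finite-support parametric kernel: a Gaussian bump evaluated on the
    grid [0, W), normalized to integrate to one over that support."""
    vals = np.exp(-0.5 * ((grid - m) / sigma) ** 2)
    return vals / (vals.sum() * (grid[1] - grid[0]))

def discretized_l2_loss(event_times, baseline, alpha, m, sigma,
                        T=10.0, delta=0.01, W=1.0):
    """Discretized least-squares (ell_2) objective for a univariate Hawkes
    process with a finite-support kernel:
        delta * sum_s lambda[s]**2 - 2 * sum_s z[s] * lambda[s],
    where z[s] are per-bin event counts approximating the event measure."""
    n_bins = int(round(T / delta))
    # Project timestamps onto the grid; each loses at most delta precision.
    bins = np.minimum((np.asarray(event_times) / delta).astype(int), n_bins - 1)
    counts = np.bincount(bins, minlength=n_bins)
    # Kernel values on its own sub-grid of length W (finite support).
    support = np.arange(0.0, W, delta)
    phi = alpha * truncated_gaussian_kernel(support, m, sigma)
    # Intensity on the grid: baseline plus the discrete convolution of
    # past event counts with the kernel values.
    lam = baseline + np.convolve(counts, phi, mode="full")[:n_bins]
    return delta * np.sum(lam ** 2) - 2.0 * np.sum(counts * lam)
```

Because the objective is a smooth function of `(baseline, alpha, m, sigma)`, it can be minimized with any gradient-based solver, which is what makes general parametric kernels (and hence latency parameters like `m`) tractable in this framework.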
Related papers
- Compactly-supported nonstationary kernels for computing exact Gaussian processes on big data [2.8377382540923004]
We derive an alternative kernel that can discover and encode both sparsity and nonstationarity.
We demonstrate the favorable performance of our novel kernel relative to existing exact and approximate GP methods.
We also conduct space-time prediction based on more than one million measurements of daily maximum temperature.
arXiv Detail & Related papers (2024-11-07T20:07:21Z)
- Point processes with event time uncertainty [16.64005584511643]
We introduce a framework to model time-uncertain point processes possibly on a network.
We experimentally show that the proposed approach outperforms previous General Linear model (GLM) baselines on simulated and real data.
arXiv Detail & Related papers (2024-11-05T00:46:09Z)
- Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS).
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noise.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
arXiv Detail & Related papers (2023-10-09T03:55:09Z)
- Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs via Double Normalizing Flows [28.62579476863723]
We introduce normalizing flows to reparameterize the ODE vector field, resulting in a data-driven prior distribution.
We also apply normalizing flows to the posterior inference of GP ODEs to resolve the issue of strong mean-field assumptions.
We validate the effectiveness of our approach on simulated dynamical systems and real-world human motion data.
arXiv Detail & Related papers (2023-09-17T09:28:47Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Neural Spectral Marked Point Processes [18.507050473968985]
We introduce a novel and general neural network-based non-stationary influence kernel for handling complex discrete events.
We demonstrate the superior performance of our proposed method compared with the state-of-the-art on synthetic and real data.
arXiv Detail & Related papers (2021-06-20T23:00:37Z)
- MuyGPs: Scalable Gaussian Process Hyperparameter Estimation Using Local Cross-Validation [1.2233362977312945]
We present MuyGPs, a novel efficient GP hyperparameter estimation method.
MuyGPs builds upon prior methods that take advantage of the nearest neighbors structure of the data.
We show that our method outperforms all known competitors both in terms of time-to-solution and the root mean squared error of the predictions.
arXiv Detail & Related papers (2021-04-29T18:10:21Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.