Scalable and adaptive variational Bayes methods for Hawkes processes
- URL: http://arxiv.org/abs/2212.00293v2
- Date: Fri, 1 Sep 2023 00:14:55 GMT
- Title: Scalable and adaptive variational Bayes methods for Hawkes processes
- Authors: Deborah Sulem, Vincent Rivoirard and Judith Rousseau
- Abstract summary: We propose a novel sparsity-inducing procedure, and derive an adaptive mean-field variational algorithm for the popular sigmoid Hawkes processes.
Our algorithm is parallelisable and therefore computationally efficient in high-dimensional settings.
- Score: 4.580983642743026
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hawkes processes are often applied to model dependence and interaction
phenomena in multivariate event data sets, such as neuronal spike trains,
social interactions, and financial transactions. In the nonparametric setting,
learning the temporal dependence structure of Hawkes processes is generally a
computationally expensive task, all the more with Bayesian estimation methods.
In particular, for generalised nonlinear Hawkes processes, Markov chain Monte
Carlo methods applied to sample the doubly intractable posterior distribution
are not scalable to high-dimensional processes in practice. Recently, efficient
algorithms targeting a mean-field variational approximation of the posterior
distribution have been proposed. In this work, we first unify existing
variational Bayes approaches under a general nonparametric inference framework,
and analyse the asymptotic properties of these methods under easily verifiable
conditions on the prior, the variational class, and the nonlinear model.
Secondly, we propose a novel sparsity-inducing procedure, and derive an
adaptive mean-field variational algorithm for the popular sigmoid Hawkes
processes. Our algorithm is parallelisable and therefore computationally
efficient in high-dimensional settings. Through an extensive set of numerical
simulations, we also demonstrate that our procedure is able to adapt to the
dimensionality of the parameter of the Hawkes process, and is partially robust
to some types of model mis-specification.
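As a concrete illustration of the model class, the following is a minimal sketch (not the authors' algorithm) of a bivariate sigmoid Hawkes process with exponential kernels, simulated by Ogata's thinning; all parameter names and values are illustrative assumptions.

```python
import numpy as np

# Sketch of a 2-dimensional sigmoid Hawkes process with intensity
#   lambda_k(t) = theta * sigmoid(nu_k + sum_l sum_{t_i^l < t} h_{lk}(t - t_i^l)),
# using exponential kernels h_{lk}(u) = alpha[l, k] * exp(-beta * u).
# All names (nu, alpha, beta, theta) are illustrative, not the paper's notation.

rng = np.random.default_rng(0)
D = 2                                  # number of dimensions
theta = 2.0                            # scale of the sigmoid link (intensity bound)
nu = np.array([-0.5, 0.0])             # background parameters
alpha = np.array([[0.8, 0.3],          # interaction weights alpha[l, k]
                  [0.0, 0.6]])
beta = 1.5                             # kernel decay rate

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def intensity(t, events):
    """Conditional intensity of each dimension at time t, given past events."""
    lam = np.empty(D)
    for k in range(D):
        s = nu[k]
        for l in range(D):
            past = events[l][events[l] < t]
            s += np.sum(alpha[l, k] * np.exp(-beta * (t - past)))
        lam[k] = theta * sigmoid(s)
    return lam

def simulate(T):
    """Ogata's thinning algorithm; theta * D upper-bounds the total intensity."""
    events = [np.array([]) for _ in range(D)]
    t, lam_bar = 0.0, theta * D
    while True:
        t += rng.exponential(1.0 / lam_bar)
        if t > T:
            return events
        lam = intensity(t, events)
        u = rng.uniform(0.0, lam_bar)
        if u < lam.sum():                          # accept the candidate point
            k = int(np.searchsorted(np.cumsum(lam), u))
            events[k] = np.append(events[k], t)

events = simulate(T=50.0)
```

Because the sigmoid is bounded by one, each coordinate intensity is bounded by theta, which is what makes the constant-rate thinning bound valid here.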
Related papers
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
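The fixed-point view can be illustrated in a few lines; the map below is a toy contraction with made-up weights, not the paper's learned regularizer.

```python
import numpy as np

# Toy illustration of the Deep Equilibrium idea: instead of unrolling an
# iterative solver for a fixed number of layers, solve z* = f(z*, x) directly.
# f here is a small contractive map with random weights, purely illustrative.

def f(z, x, W, b):
    return np.tanh(W @ z + x + b)      # contractive when ||W|| < 1

rng = np.random.default_rng(3)
d = 4
W = 0.3 * rng.normal(size=(d, d)) / np.sqrt(d)   # keep the spectral norm small
b = rng.normal(size=d)
x = rng.normal(size=d)

z = np.zeros(d)
for _ in range(200):                   # plain fixed-point iteration
    z_next = f(z, x, W, b)
    if np.linalg.norm(z_next - z) < 1e-12:
        break
    z = z_next
# z now satisfies z = f(z, x) up to tolerance: the "infinite-depth" output.
```

In the Deep Equilibrium framework the same equilibrium is typically found with faster root-finding methods, and gradients flow through the fixed point implicitly rather than through the unrolled iterations.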
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- Variational Gaussian Process Diffusion Processes [17.716059928867345]
Diffusion processes are a class of stochastic differential equations (SDEs) providing a rich family of expressive models.
Probabilistic inference and learning under generative models with latent processes endowed with a non-linear diffusion process prior are intractable problems.
We build upon work within variational inference, approximating the posterior process as a linear diffusion process, and point out pathologies in the approach.
arXiv Detail & Related papers (2023-06-03T09:43:59Z)
- Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees [57.67528738886731]
We study the numerical stability of scalable sparse approximations based on inducing points.
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points that satisfy the derived stability conditions.
arXiv Detail & Related papers (2022-10-14T15:20:17Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared with the state-of-the-art.
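The discretization idea can be sketched as follows: bin the events on a regular grid and evaluate the intensity as a discrete convolution of event counts with a finite-support kernel. Everything below (grid size, kernel, event times) is illustrative, not FaDIn's actual implementation.

```python
import numpy as np

# Illustrative discretized intensity for a 1-dimensional Hawkes process:
# lambda(t) ~ mu + convolution of binned event counts with a truncated kernel.
dt, T = 0.01, 10.0
grid = np.arange(0.0, T, dt)                      # left edges of the bins
events = np.array([1.2, 1.25, 3.0, 3.4, 7.8])     # made-up event times
counts = np.histogram(events, bins=np.append(grid, T))[0]

mu = 0.5                                          # baseline rate
L = 100                                           # kernel support = L * dt = 1.0
support = dt * np.arange(1, L + 1)
kernel = 2.0 * np.exp(-3.0 * support)             # truncated exponential kernel

# Discrete convolution of binned counts with the finite-support kernel.
lam = mu + np.convolve(counts, kernel)[: len(grid)]
```

Because the kernel has finite support, evaluating the intensity costs a number of operations linear in the grid size times the support length, which is what makes discretized inference fast.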
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
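A generic version of such an alternating scheme can be sketched for a linear inverse problem y = Ax + noise with per-coefficient variances and a gamma-type hyperprior; the update rules and constants below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Alternating scheme: (i) solve for x given per-coefficient variances theta
# (a weighted ridge regression), (ii) update theta in closed form under a
# gamma-type hyperprior. Constants (a, b, sigma2) are illustrative only.

rng = np.random.default_rng(2)
n, m = 30, 20
A = rng.normal(size=(n, m))
x_true = np.zeros(m)
x_true[[2, 7]] = [3.0, -2.0]                  # sparse ground truth
y = A @ x_true + 0.1 * rng.normal(size=n)

sigma2, a, b = 0.01, 1.5, 1e-3
theta = np.ones(m)
for _ in range(50):
    # x-step: MAP estimate of x given the current variances theta
    x = np.linalg.solve(A.T @ A / sigma2 + np.diag(1.0 / theta),
                        A.T @ y / sigma2)
    # theta-step: closed-form variance update; small theta_j shrinks x_j
    theta = (b + 0.5 * x**2) / (a + 0.5)
```

Alternating these two simple sub-steps promotes sparsity: coefficients with small magnitude get small variances and are shrunk further on the next x-step.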
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Compositional Modeling of Nonlinear Dynamical Systems with ODE-based Random Features [0.0]
We present a novel, domain-agnostic approach to modeling nonlinear dynamical systems.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
arXiv Detail & Related papers (2021-06-10T17:55:13Z)
- Nonlinear Hawkes Processes in Time-Varying System [37.80255010291703]
Hawkes processes are a class of point processes that can model self- and mutually exciting phenomena.
This work proposes a flexible, nonlinear and nonhomogeneous variant where a state process is incorporated to interact with the point processes.
For inference, we utilize the latent variable augmentation technique to design two efficient Bayesian inference algorithms.
arXiv Detail & Related papers (2021-06-09T07:06:05Z)
- Nonlinear Hawkes Process with Gaussian Process Self Effects [3.441953136999684]
Hawkes processes are used to model time-continuous point processes with history dependence.
Here we propose an extended model where the self-effects are of both excitatory and inhibitory type.
We continue the line of work on Bayesian inference for Hawkes processes; our approach dispenses with the need to estimate a branching structure for the posterior.
arXiv Detail & Related papers (2021-05-20T09:20:35Z)
- Statistical optimality and stability of tangent transform algorithms in logit models [6.9827388859232045]
We provide conditions on the data-generating process to derive non-asymptotic upper bounds on the risk incurred by the variational optima of the logistic model.
In particular, we establish local stability of the algorithm without any assumptions on the data-generating process.
We explore a special case involving a semi-orthogonal design under which global convergence is obtained.
arXiv Detail & Related papers (2020-10-25T05:15:13Z)
- Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common stochastic optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
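The Markov-chain perspective can be illustrated with plain iterate averaging on a noisy one-dimensional objective; this toy sketch only conveys the flavour of treating the optimizer's trajectory as a chain, not the paper's actual framework.

```python
import numpy as np

# Toy sketch: SGD with a constant step size does not converge to a point but
# wanders around the optimum like a stationary Markov chain. Averaging the
# tail of the trajectory is then far more accurate than the last iterate.

rng = np.random.default_rng(1)

def noisy_grad(x):
    # gradient of (x - 3)^2 plus heavy synthetic noise; minimum at x = 3
    return 2.0 * (x - 3.0) + rng.normal(scale=4.0)

x, step, iters, tail = 0.0, 0.05, 2000, []
for i in range(iters):
    x -= step * noisy_grad(x)
    if i >= iters // 2:            # discard warm-up, keep the stationary tail
        tail.append(x)

last_iterate = x
averaged = float(np.mean(tail))    # chain average: a much better estimate
```

Viewing the iterates as a chain also suggests convergence diagnostics borrowed from MCMC (e.g. comparing warm-up and tail segments) to decide when averaging is safe.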
arXiv Detail & Related papers (2020-09-01T19:12:11Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the mechanisms behind its success remain unclear.
We show that heavy-tailed fluctuations commonly arise in the parameters due to multiplicative noise induced by gradient variance.
A detailed analysis of key factors, including step size and data, shows that state-of-the-art neural network models exhibit similar behaviour.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.