Spatiotemporal Clustering with Neyman-Scott Processes via Connections to
Bayesian Nonparametric Mixture Models
- URL: http://arxiv.org/abs/2201.05044v3
- Date: Mon, 11 Sep 2023 18:30:00 GMT
- Authors: Yixin Wang, Anthony Degleris, Alex H. Williams, and Scott W. Linderman
- Abstract summary: Neyman-Scott processes (NSPs) are point process models that generate clusters of points in time or space.
We demonstrate the potential of NSPs on a variety of applications including sequence detection in neural spike trains and event detection in document streams.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neyman-Scott processes (NSPs) are point process models that generate clusters
of points in time or space. They are natural models for a wide range of
phenomena, ranging from neural spike trains to document streams. The clustering
property is achieved via a doubly stochastic formulation: first, a set of
latent events is drawn from a Poisson process; then, each latent event
generates a set of observed data points according to another Poisson process.
This construction is similar to Bayesian nonparametric mixture models like the
Dirichlet process mixture model (DPMM) in that the number of latent events
(i.e. clusters) is a random variable, but the point process formulation makes
the NSP especially well suited to modeling spatiotemporal data. While many
specialized algorithms have been developed for DPMMs, comparatively fewer works
have focused on inference in NSPs. Here, we present novel connections between
NSPs and DPMMs, with the key link being a third class of Bayesian mixture
models called mixture of finite mixture models (MFMMs). Leveraging this
connection, we adapt the standard collapsed Gibbs sampling algorithm for DPMMs
to enable scalable Bayesian inference on NSP models. We demonstrate the
potential of Neyman-Scott processes on a variety of applications including
sequence detection in neural spike trains and event detection in document
streams.
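The doubly stochastic construction described in the abstract can be sketched as a short simulation: latent events are drawn from a homogeneous Poisson process on an observation window, and each latent event then emits a Poisson-distributed number of observed points scattered around it. This is a minimal illustrative sketch for a temporal NSP; the parameter values and the Gaussian impulse response are assumptions for illustration, not values or choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) parameters:
T = 100.0           # observation window [0, T]
lam_latent = 0.05   # intensity of the latent-event Poisson process
alpha = 20.0        # expected number of observed points per latent event
sigma = 1.0         # temporal spread of each cluster

# Step 1: draw latent events from a homogeneous Poisson process on [0, T].
num_latent = rng.poisson(lam_latent * T)
latent_times = rng.uniform(0.0, T, size=num_latent)

# Step 2: each latent event generates observed points via another
# Poisson process; here points are scattered with a Gaussian kernel.
observed, labels = [], []
for k, center in enumerate(latent_times):
    n_k = rng.poisson(alpha)
    observed.append(rng.normal(center, sigma, size=n_k))
    labels.extend([k] * n_k)
observed = np.concatenate(observed) if observed else np.array([])
```

Because the number of latent events (clusters) is itself random, marginalizing over them yields the mixture-model view that the paper connects to DPMMs and MFMMs.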
Related papers
- A Bayesian Mixture Model of Temporal Point Processes with Determinantal Point Process Prior [21.23523473330637]
Asynchronous event sequence clustering aims to group similar event sequences in an unsupervised manner.
Our work provides a flexible learning framework for event sequence clustering, enabling automatic identification of the potential number of clusters.
It is applicable to a wide range of parametric temporal point processes, including neural network-based models.
arXiv Detail & Related papers (2024-11-07T03:21:30Z)
- On Feynman--Kac training of partial Bayesian neural networks [1.6474447977095783]
Partial Bayesian neural networks (pBNNs) were shown to perform competitively with full Bayesian neural networks.
We propose an efficient sampling-based training strategy, wherein the training of a pBNN is formulated as simulating a Feynman--Kac model.
We show that our proposed training scheme outperforms the state of the art in terms of predictive performance.
arXiv Detail & Related papers (2023-10-30T15:03:15Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose in this paper to approximate the joint posterior over the structure of a Bayesian Network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Generative modeling for time series via Schrödinger bridge [0.0]
We propose a novel generative model for time series based on the Schrödinger bridge (SB) approach.
This consists of an entropic interpolation via optimal transport between a reference probability measure on path space and a target measure consistent with the joint data distribution of the time series.
arXiv Detail & Related papers (2023-04-11T09:45:06Z)
- Overlap-guided Gaussian Mixture Models for Point Cloud Registration [61.250516170418784]
Probabilistic 3D point cloud registration methods have shown competitive performance in overcoming noise, outliers, and density variations.
This paper proposes a novel overlap-guided probabilistic registration approach that computes the optimal transformation from matched Gaussian Mixture Model (GMM) parameters.
arXiv Detail & Related papers (2022-10-17T08:02:33Z)
- A new perspective on probabilistic image modeling [92.89846887298852]
We present a new probabilistic approach for image modeling capable of density estimation, sampling and tractable inference.
The proposed deep convolutional Gaussian mixture models (DCGMMs) can be trained end-to-end by SGD from random initial conditions, much like CNNs.
We show that DCGMMs compare favorably to several recent PC and SPN models in terms of inference, classification and sampling.
arXiv Detail & Related papers (2022-03-21T14:53:57Z)
- Bayesian Inference in High-Dimensional Time-Series with the Orthogonal Stochastic Linear Mixing Model [2.7909426811685893]
Many modern time-series datasets contain large numbers of output response variables sampled for prolonged periods of time.
In this paper, we propose a new Markov chain Monte Carlo framework for the analysis of diverse, large-scale time-series datasets.
arXiv Detail & Related papers (2021-06-25T01:12:54Z)
- DeepGMR: Learning Latent Gaussian Mixture Models for Registration [113.74060941036664]
Point cloud registration is a fundamental problem in 3D computer vision, graphics and robotics.
In this paper, we introduce Deep Gaussian Mixture Registration (DeepGMR), the first learning-based registration method.
Our proposed method shows favorable performance when compared with state-of-the-art geometry-based and learning-based registration methods.
arXiv Detail & Related papers (2020-08-20T17:25:16Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
- Scalable Hybrid HMM with Gaussian Process Emission for Sequential Time-series Data Clustering [13.845932997326571]
Hidden Markov Model (HMM) combined with Gaussian Process (GP) emission can be effectively used to estimate the hidden state.
This paper proposes a scalable learning method for HMM-GPSM.
arXiv Detail & Related papers (2020-01-07T07:28:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.