Sensing Cox Processes via Posterior Sampling and Positive Bases
- URL: http://arxiv.org/abs/2110.11181v1
- Date: Thu, 21 Oct 2021 14:47:06 GMT
- Title: Sensing Cox Processes via Posterior Sampling and Positive Bases
- Authors: Mojmír Mutný, Andreas Krause
- Abstract summary: We study adaptive sensing of point processes, a widely used model from spatial statistics.
We model the intensity function as a sample from a truncated Gaussian process, represented in a specially constructed positive basis.
Our adaptive sensing algorithms use Langevin dynamics and are based on posterior sampling (Cox-Thompson) and top-two posterior sampling (Top2) principles.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study adaptive sensing of Cox point processes, a widely used model from
spatial statistics. We introduce three tasks: maximization of captured events,
search for the maximum of the intensity function and learning level sets of the
intensity function. We model the intensity function as a sample from a
truncated Gaussian process, represented in a specially constructed positive
basis. In this basis, the positivity constraint on the intensity function has a
simple form. We show how a minimal-description positive basis can be adapted
to the covariance kernel and to non-stationarity, and we make connections to
common positive bases from prior works. Our adaptive sensing algorithms use
Langevin dynamics and are based on posterior sampling (\textsc{Cox-Thompson})
and top-two posterior sampling (\textsc{Top2}) principles. With the latter, the
difference between samples serves as a surrogate for the uncertainty. We
demonstrate the approach using examples from environmental monitoring and crime
rate modeling, and compare it to the classical Bayesian experimental design
approach.
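The sensing loop described above can be sketched as follows. This is a minimal illustration, not the paper's construction: the hat-function basis, the projected Langevin step, and all names (`phi`, `sample_posterior_langevin`, `next_region`) are assumptions chosen for brevity. Hat functions are one simple positive basis in which any nonnegative weight vector yields a nonnegative intensity, which is the property the abstract highlights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Positive basis: hat functions on a grid -- any nonnegative w gives a
# nonnegative intensity lambda(x) = sum_j w_j * phi_j(x).
grid = np.linspace(0.0, 1.0, 8)

def phi(x):
    """Evaluate all hat basis functions at points x; shape (len(x), len(grid))."""
    d = np.abs(x[:, None] - grid[None, :]) / (grid[1] - grid[0])
    return np.maximum(1.0 - d, 0.0)

def log_posterior_grad(w, events, xq, prior_prec=1.0):
    """Gradient in w of the Poisson-process log posterior:
    log L = sum_i log lambda(x_i) - integral lambda, integral by quadrature at xq."""
    lam_ev = phi(events) @ w
    grad_lik = phi(events).T @ (1.0 / lam_ev) - phi(xq).T.sum(axis=1) * (xq[1] - xq[0])
    return grad_lik - prior_prec * w  # Gaussian prior on the weights

def sample_posterior_langevin(events, n_steps=500, step=1e-3):
    """Projected Langevin dynamics: gradient step + noise, then clip to w >= 0."""
    xq = np.linspace(0.0, 1.0, 200)          # quadrature nodes for the integral
    w = np.ones(len(grid))
    for _ in range(n_steps):
        g = log_posterior_grad(w, events, xq)
        w = w + step * g + np.sqrt(2 * step) * rng.standard_normal(len(w))
        w = np.maximum(w, 1e-6)              # projection onto the positive cone
    return w

# Thompson-sampling step: draw one intensity from the posterior and sense
# where that sampled intensity is largest.
events = rng.uniform(0.4, 0.6, size=30)      # synthetic observed events
w_sample = sample_posterior_langevin(events)
xs = np.linspace(0.0, 1.0, 100)
next_region = xs[np.argmax(phi(xs) @ w_sample)]
```

Each sensing round repeats this draw-then-maximize step; the Top2 variant would draw two posterior samples and use their difference as an uncertainty surrogate.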
Related papers
- A Kernel-Based Conditional Two-Sample Test Using Nearest Neighbors (with Applications to Calibration, Regression Curves, and Simulation-Based Inference) [3.622435665395788]
We introduce a kernel-based measure for detecting differences between two conditional distributions.
When the two conditional distributions are the same, the estimate has a Gaussian limit and its variance has a simple form that can be easily estimated from the data.
We also provide a resampling based test using our estimate that applies to the conditional goodness-of-fit problem.
arXiv Detail & Related papers (2024-07-23T15:04:38Z)
- Exact Bayesian Gaussian Cox Processes Using Random Integral [0.0]
Posterior inference of an intensity function involves an intractable integral in the likelihood resulting in doubly intractable posterior distribution.
We propose a nonparametric Bayesian approach for estimating the intensity function of an inhomogeneous Poisson process without reliance on large data augmentation or approximations of the likelihood function.
We demonstrate the utility of our method in three real-world scenarios including temporal and spatial event data, as well as aggregated time count data collected at multiple resolutions.
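The intractable term in the likelihood above is the integral of the intensity over the domain. A minimal sketch of the inhomogeneous Poisson log-likelihood, with that integral estimated by plain Monte Carlo rather than the paper's random-integral construction (function names are illustrative):

```python
import numpy as np

def poisson_process_loglik(intensity, events, domain=(0.0, 1.0), n_mc=10_000, seed=0):
    """log L(lambda) = sum_i log lambda(x_i) - integral_domain lambda(x) dx.
    The integral is the intractable term; here it is estimated by Monte Carlo."""
    rng = np.random.default_rng(seed)
    a, b = domain
    xs = rng.uniform(a, b, size=n_mc)
    integral = (b - a) * np.mean(intensity(xs))   # MC estimate of the integral
    return np.sum(np.log(intensity(events))) - integral

# Example: lambda(x) = 2 + sin(2*pi*x) on [0, 1]; the exact integral is 2,
# so the result equals sum_i log lambda(x_i) - 2 up to Monte Carlo error.
events = np.array([0.1, 0.2, 0.25])
ll = poisson_process_loglik(lambda x: 2.0 + np.sin(2 * np.pi * x), events)
```

Because the integral only enters through an estimate, the posterior over the intensity is doubly intractable, which is the difficulty the paper's exact approach removes.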
arXiv Detail & Related papers (2024-06-28T08:11:33Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
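The pathwise view conditions prior sample paths directly via Matheron's rule, (f | y)(x*) = f(x*) + K(x*, X)(K(X, X) + s^2 I)^{-1}(y - f(X) - eps), instead of sampling from posterior marginals. A minimal sketch under a squared-exponential kernel (function names and hyperparameters are illustrative, not from the paper):

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def pathwise_posterior_sample(x_train, y_train, x_test, noise=1e-2, seed=0):
    """Matheron's rule: draw a joint prior path, then correct it toward the data."""
    rng = np.random.default_rng(seed)
    x_all = np.concatenate([x_train, x_test])
    K = rbf(x_all, x_all) + 1e-6 * np.eye(len(x_all))   # jitter for stability
    f = np.linalg.cholesky(K) @ rng.standard_normal(len(x_all))  # joint prior draw
    f_tr, f_te = f[: len(x_train)], f[len(x_train):]
    eps = np.sqrt(noise) * rng.standard_normal(len(x_train))     # simulated obs noise
    Kxx = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    update = rbf(x_test, x_train) @ np.linalg.solve(Kxx, y_train - f_tr - eps)
    return f_te + update

x_train = np.array([0.0, 0.5, 1.0])
y_train = np.sin(x_train)
x_test = np.linspace(0.02, 0.98, 50)
sample = pathwise_posterior_sample(x_train, y_train, x_test)
```

The update term is a deterministic correction of the prior path, which is what lets the decoupled approximations in the paper avoid the cubic cost of marginal sampling at dense test locations.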
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
- Learning to Importance Sample in Primary Sample Space [22.98252856114423]
We propose a novel importance sampling technique that uses a neural network to learn how to sample from a desired density represented by a set of samples.
We show that our approach leads to effective variance reduction in several practical scenarios.
arXiv Detail & Related papers (2018-08-23T16:55:53Z)