A novel sampler for Gauss-Hermite determinantal point processes with
application to Monte Carlo integration
- URL: http://arxiv.org/abs/2203.08061v1
- Date: Tue, 15 Mar 2022 16:54:03 GMT
- Title: A novel sampler for Gauss-Hermite determinantal point processes with
application to Monte Carlo integration
- Authors: Nicholas P Baskerville
- Abstract summary: We show how a new determinantal point process on $\mathbb{R}^d$ can be practically sampled and used to improve Monte Carlo integration.
Samples from this new process are shown to be useful in Monte Carlo integration against the Gaussian measure.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Determinantal point processes are a promising but relatively under-developed
tool in machine learning and statistical modelling, being the canonical
statistical example of distributions with repulsion. While their mathematical
formulation is elegant and appealing, their practical use, such as simply
sampling from them, is far from straightforward. Recent work has shown how a
particular type of determinantal point process defined on the compact
multidimensional space $[-1, 1]^d$ can be practically sampled, and further shown
how such samples can be used to improve Monte Carlo integration. This work
extends those results to a new determinantal point process on $\mathbb{R}^d$ by
constructing a novel sampling scheme. Samples from this new process are shown
to be useful in Monte Carlo integration against the Gaussian measure, which is
particularly relevant in machine learning applications.
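For context, the baseline the paper improves on is plain i.i.d. Monte Carlo integration against the standard Gaussian measure: estimate $E[f(Z)]$, $Z \sim N(0, I_d)$, by averaging $f$ over Gaussian draws. A minimal NumPy sketch of that baseline follows; the paper's DPP sampler, which replaces the i.i.d. draws with repulsive samples to reduce variance, is not reproduced here, and the integrand below is an illustrative choice, not from the paper.

```python
import numpy as np

def mc_gaussian(f, d, n, rng):
    """Plain Monte Carlo estimate of E[f(Z)] for Z ~ N(0, I_d)."""
    z = rng.standard_normal((n, d))  # n i.i.d. standard Gaussian points in R^d
    return f(z).mean()

rng = np.random.default_rng(42)
# Example integrand: f(z) = ||z||^2, whose Gaussian expectation is exactly d
est = mc_gaussian(lambda z: (z ** 2).sum(axis=1), d=3, n=200_000, rng=rng)
```

With $n = 2 \times 10^5$ points the estimate is close to the exact value $d = 3$; the error of this i.i.d. estimator shrinks at the usual $O(n^{-1/2})$ rate, which is the rate a repulsive (DPP) point set aims to beat.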
Related papers
- Numerical Generalized Randomized Hamiltonian Monte Carlo for piecewise smooth target densities [0.0]
Generalized Hamiltonian Monte Carlo processes for sampling continuous densities with discontinuous gradient and piecewise smooth targets are proposed.
It is argued that the techniques lead to GRHMC processes that admit the desired target distribution as the invariant distribution in both scenarios.
arXiv Detail & Related papers (2025-04-25T09:41:57Z) - An Efficient Quasi-Random Sampling for Copulas [3.400056739248712]
This paper proposes the use of generative models, such as Generative Adversarial Networks (GANs), to generate quasi-random samples for any copula.
GANs are a type of implicit generative models used to learn the distribution of complex data, thus facilitating easy sampling.
arXiv Detail & Related papers (2024-03-08T13:01:09Z) - Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z) - Stochastic Localization via Iterative Posterior Sampling [2.1383136715042417]
We consider a general localization framework and introduce an explicit class of observation processes, associated with flexible denoising schedules.
We provide a complete methodology, Stochastic Localization via Iterative Posterior Sampling (SLIPS), to obtain approximate samples of this dynamics and, as a byproduct, samples from the target distribution.
We illustrate the benefits and applicability of SLIPS on several benchmarks of multi-modal distributions, including mixtures in increasing dimensions, logistic regression, and a high-dimensional field system from statistical mechanics.
arXiv Detail & Related papers (2024-02-16T15:28:41Z) - Approximation of group explainers with coalition structure using Monte Carlo sampling on the product space of coalitions and features [0.11184789007828977]
We focus on a wide class of linear game values, as well as coalitional values, for the marginal game based on a given ML model and predictor vector.
We design a novel Monte Carlo sampling algorithm that estimates them at a reduced complexity that depends linearly on the size of the background dataset.
arXiv Detail & Related papers (2023-03-17T19:17:06Z) - Low-variance estimation in the Plackett-Luce model via quasi-Monte Carlo
sampling [58.14878401145309]
We develop a novel approach to producing more sample-efficient estimators of expectations in the PL model.
We illustrate our findings both theoretically and empirically using real-world recommendation data from Amazon Music and the Yahoo learning-to-rank challenge.
arXiv Detail & Related papers (2022-05-12T11:15:47Z) - Gaussian Processes and Statistical Decision-making in Non-Euclidean
Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
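The pathwise view rests on Matheron's update rule: a posterior sample equals a prior sample plus a data-dependent correction, $(f \mid y)(\cdot) = f(\cdot) + k(\cdot, X)\,K(X,X)^{-1}(y - f(X))$. The NumPy sketch below illustrates that general idea only; the kernel, data, and jitter are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = np.array([-1.0, 0.0, 1.0])   # observed inputs
y = np.array([0.5, -0.2, 0.3])   # noise-free observations
Xs = np.linspace(-2.0, 2.0, 20)  # test inputs

# Draw one joint prior sample over [X, Xs]
Z = np.concatenate([X, Xs])
Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))  # jitter for numerical stability
f = np.linalg.cholesky(Kzz) @ rng.standard_normal(len(Z))
fX, fXs = f[:len(X)], f[len(X):]

# Matheron update: correct the prior sample toward the data
Kxx = rbf(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(Kxx, y - fX)
post = fXs + rbf(Xs, X) @ alpha           # posterior sample at test inputs
postX = fX + rbf(X, X) @ alpha            # same update at observed inputs
```

Because the correction interpolates the residual $y - f(X)$, the updated sample passes (essentially) through the observations, and the cost of the update is governed by the number of observations rather than the size of the desired random vector.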
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings.
arXiv Detail & Related papers (2020-10-29T13:08:07Z) - Quantum self-learning Monte Carlo with quantum Fourier transform sampler [1.961783412203541]
This paper provides a new self-learning Monte Carlo method that utilizes a quantum computer to output a proposal distribution.
The performance of this "quantum inspired" algorithm is demonstrated by some numerical simulations.
arXiv Detail & Related papers (2020-05-28T15:16:00Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.