Model Evidence with Fast Tree Based Quadrature
- URL: http://arxiv.org/abs/2005.11300v1
- Date: Fri, 22 May 2020 17:48:06 GMT
- Title: Model Evidence with Fast Tree Based Quadrature
- Authors: Thomas Foster, Chon Lok Lei, Martin Robinson, David Gavaghan, Ben
Lambert
- Abstract summary: We present a new algorithm called Tree Quadrature (TQ).
TQ places no qualifications on how the samples provided to it are obtained, allowing it to use state-of-the-art sampling algorithms.
On a set of benchmark problems, we show that TQ provides accurate approximations to integrals in up to 15 dimensions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: High dimensional integration is essential to many areas of science, ranging
from particle physics to Bayesian inference. Approximating these integrals is
hard, due in part to the difficulty of locating and sampling from regions of
the integration domain that make significant contributions to the overall
integral. Here, we present a new algorithm called Tree Quadrature (TQ) that
separates this sampling problem from the problem of using those samples to
produce an approximation of the integral. TQ places no qualifications on how
the samples provided to it are obtained, allowing it to use state-of-the-art
sampling algorithms that are largely ignored by existing integration
algorithms. Given a set of samples, TQ constructs a surrogate model of the
integrand in the form of a regression tree, with a structure optimised to
maximise integral precision. The tree divides the integration domain into
smaller containers, which are individually integrated and aggregated to
estimate the overall integral. Any method can be used to integrate each
individual container, so existing integration methods, like Bayesian Monte
Carlo, can be combined with TQ to boost their performance. On a set of
benchmark problems, we show that TQ provides accurate approximations to
integrals in up to 15 dimensions; and in dimensions 4 and above, it outperforms
simple Monte Carlo and the popular Vegas method.
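To make the container idea concrete, here is a minimal sketch of a TQ-style integrator, assuming a unit-hypercube domain, uniform sampling, a greedy median/variance split rule, and plain Monte Carlo inside each leaf. All of these choices, and every name below, are illustrative stand-ins, not the authors' implementation; TQ itself accepts samples from any sampler.

```python
import numpy as np

def tree_quadrature(f, dim, n_samples=20_000, min_leaf=64, seed=0):
    """Minimal Tree Quadrature (TQ) sketch on the unit hypercube [0, 1]^dim.

    One batch of samples is partitioned by a greedy, variance-reducing tree;
    each leaf container is integrated by plain Monte Carlo (volume * mean of
    the integrand over the samples it contains) and the results are summed.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(size=(n_samples, dim))  # TQ accepts any sampler's output
    y = f(x)

    def integrate(idx, lo, hi):
        if len(idx) <= min_leaf:
            return np.prod(hi - lo) * y[idx].mean()  # leaf container estimate
        # Greedy split: try a median cut in each dimension and keep the one
        # that most reduces the within-container variance of the integrand.
        best = None
        for d in range(dim):
            cut = np.median(x[idx, d])
            left, right = idx[x[idx, d] <= cut], idx[x[idx, d] > cut]
            if len(left) == 0 or len(right) == 0:
                continue
            score = len(left) * y[left].var() + len(right) * y[right].var()
            if best is None or score < best[0]:
                best = (score, d, cut, left, right)
        if best is None:  # no usable split: fall back to a leaf estimate
            return np.prod(hi - lo) * y[idx].mean()
        _, d, cut, left, right = best
        hi_left, lo_right = hi.copy(), lo.copy()
        hi_left[d] = lo_right[d] = cut
        return integrate(left, lo, hi_left) + integrate(right, lo_right, hi)

    return integrate(np.arange(n_samples), np.zeros(dim), np.ones(dim))

# Example: a 4-dimensional Gaussian bump centred in the cube
# (true value ~ (0.1 * sqrt(2*pi))^4 ~ 3.95e-3).
gauss = lambda x: np.exp(-np.sum((x - 0.5) ** 2, axis=1) / 0.02)
print(tree_quadrature(gauss, dim=4))
```

Replacing the per-leaf mean with a more accurate rule is where existing integrators such as Bayesian Monte Carlo would plug in.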
Related papers
- Harmonic Path Integral Diffusion [0.4527270266697462]
We present a novel approach for sampling from a continuous multivariate probability distribution.
Our method constructs a time-dependent bridge from a delta function centered at the origin of the state space.
arXiv Detail & Related papers (2024-09-23T16:20:21Z)
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variates functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-Spheres algorithm, as illustrated by the sketch below.
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
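The anti-derivative construction amounts to a control variate whose integral is known in closed form. Below is a hedged 1-D illustration of that principle, with a fixed polynomial surrogate standing in for the learned network; the function names and the Taylor surrogate are our own assumptions, not the paper's method.

```python
import numpy as np

def cv_integrate(f, g, G, a, b, n=10_000, seed=0):
    """Control-variate Monte Carlo on [a, b]:
    integral(f) = [G(b) - G(a)] + integral(f - g), where G' = g exactly.
    The residual f - g is small if g approximates f, so its Monte Carlo
    estimate has much lower variance than estimating f directly.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, size=n)
    exact_part = G(b) - G(a)                   # control variate, integrated exactly
    residual = (b - a) * np.mean(f(x) - g(x))  # cheap MC for what is left
    return exact_part + residual

# Example: f(x) = exp(x); surrogate g is a truncated Taylor series of f,
# with known anti-derivative G.
f = np.exp
g = lambda x: 1 + x + 0.5 * x**2
G = lambda x: x + 0.5 * x**2 + x**3 / 6
print(cv_integrate(f, g, G, 0.0, 1.0))  # close to e - 1 ~ 1.71828
```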
- Efficient Numerical Integration in Reproducing Kernel Hilbert Spaces via Leverage Scores Sampling [16.992480926905067]
We consider the problem of approximating integrals with respect to a target probability measure using only pointwise evaluations of the integrand.
We propose an efficient procedure which exploits a small i.i.d. random subset of $m \ll n$ samples drawn either uniformly or using approximate leverage scores from the initial observations.
arXiv Detail & Related papers (2023-11-22T17:44:18Z)
- Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z)
- Multi-block-Single-probe Variance Reduced Estimator for Coupled Compositional Optimization [49.58290066287418]
We propose a novel method named the Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of coupled compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on the strong convexity parameter.
arXiv Detail & Related papers (2022-07-18T12:03:26Z)
- Neural Inverse Transform Sampler [4.061135251278187]
We show that when modeling conditional densities with a neural network, the normalizing constant $Z$ can be computed exactly and efficiently.
We introduce the Neural Inverse Transform Sampler (NITS), a novel deep learning framework for modeling and sampling from general, multidimensional, compactly-supported probability densities; see the sketch below.
arXiv Detail & Related papers (2022-06-22T15:28:29Z)
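For context, the sampler in NITS rests on classical inverse transform sampling: push uniform draws through the inverse CDF. The 1-D sketch below tabulates and normalises the CDF on a grid, standing in for the network-parameterised density and the exactly computed $Z$; the grid size and all names are illustrative assumptions, not the paper's code.

```python
import numpy as np

def inverse_transform_sample(pdf, lo, hi, n, grid=4096, seed=0):
    """Draw n samples from an (unnormalised) 1-D density via inverse
    transform sampling: tabulate the CDF on a grid, normalise it, and
    map uniform draws through the inverse CDF by interpolation.
    """
    rng = np.random.default_rng(seed)
    xs = np.linspace(lo, hi, grid)
    cdf = np.cumsum(pdf(xs))
    cdf /= cdf[-1]                   # normalising constant handled here
    u = rng.uniform(size=n)
    return np.interp(u, cdf, xs)     # inverse CDF via interpolation

# Example: sample from an unnormalised bimodal density on [-4, 4].
pdf = lambda x: np.exp(-(x - 1.5) ** 2) + 0.5 * np.exp(-(x + 1.5) ** 2)
print(inverse_transform_sample(pdf, -4.0, 4.0, n=5))
```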
- Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes [60.479499225746295]
We introduce a new scalable approximation for Gaussian processes with provable guarantees which hold simultaneously over its entire parameter space.
Our approximation is obtained from an improved sample complexity analysis for sparse spectrum Gaussian processes (SSGPs).
arXiv Detail & Related papers (2020-11-17T05:41:50Z)
- Quantum Speedup of Monte Carlo Integration with respect to the Number of Dimensions and its Application to Finance [0.0]
In Monte Carlo integration, many random numbers are generated to evaluate the integrand.
In this paper, we point out that the number of such repeated operations can be reduced by combining nested quantum amplitude estimation (QAE) with the use of pseudorandom numbers.
We consider one use case of this method in finance, credit portfolio risk measurement, and estimate the extent to which the complexity is reduced.
arXiv Detail & Related papers (2020-11-04T07:40:20Z)
- Extending the average spectrum method: Grid points sampling and density averaging [0.0]
We show that sampling the grid points, instead of keeping them fixed, also changes the functional integral limit.
The remaining bias depends mainly on the width of the grid density, so we go one step further and average also over densities of different widths.
arXiv Detail & Related papers (2020-04-02T17:25:51Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost; a sketch follows below.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
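The decoupling referred to here is pathwise (Matheron-style) conditioning: draw a prior function, then apply a data-dependent update. Below is a minimal sketch for an RBF-kernel GP, with the prior draw approximated by random Fourier features; the hyperparameters and names are our assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def decoupled_gp_posterior_sample(X, y, x_star, lengthscale=0.5, noise=1e-2,
                                  n_features=1024, seed=0):
    """Pathwise (Matheron) posterior sample for an RBF-kernel GP:
    f_post(.) = f_prior(.) + k(., X) @ K^{-1} @ (y - f_prior(X) - eps),
    with the prior draw approximated by random Fourier features (RFF).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # RFF for the RBF kernel k(a, b) = exp(-||a - b||^2 / (2 * lengthscale^2)).
    W = rng.normal(scale=1.0 / lengthscale, size=(n_features, d))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    phi = lambda Z: np.sqrt(2.0 / n_features) * np.cos(Z @ W.T + b)
    w = rng.normal(size=n_features)      # weights of one prior function draw
    f_prior = lambda Z: phi(Z) @ w

    def k(A, B):  # exact RBF kernel matrix for the data-dependent update
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale ** 2)

    eps = rng.normal(scale=np.sqrt(noise), size=len(y))
    K = k(X, X) + noise * np.eye(len(y))
    update = np.linalg.solve(K, y - f_prior(X) - eps)
    return f_prior(x_star) + k(x_star, X) @ update

# Example: condition on three noisy 1-D observations, evaluate on a grid.
X = np.array([[0.1], [0.5], [0.9]])
y = np.sin(2 * np.pi * X[:, 0])
x_star = np.linspace(0, 1, 5)[:, None]
print(decoupled_gp_posterior_sample(X, y, x_star))
```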
- Distributed, partially collapsed MCMC for Bayesian Nonparametrics [68.5279360794418]
We exploit the fact that completely random measures, in terms of which commonly used models like the Dirichlet process and the beta-Bernoulli process can be expressed, decompose into independent sub-measures.
We use this decomposition to partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components.
The resulting hybrid algorithm can be applied to allow scalable inference without sacrificing convergence guarantees.
arXiv Detail & Related papers (2020-01-15T23:10:13Z)