Composite Spatial Monte Carlo Integration Based on Generalized Least
Squares
- URL: http://arxiv.org/abs/2204.03248v1
- Date: Thu, 7 Apr 2022 06:35:13 GMT
- Title: Composite Spatial Monte Carlo Integration Based on Generalized Least
Squares
- Authors: Kaiji Sekimoto, Muneki Yasuda
- Abstract summary: Spatial Monte Carlo integration (SMCI) is a sampling-based approximation.
A new effective method is proposed by combining multiple SMCI estimators.
The results indicate that the proposed method can be effective in the inverse Ising problem (or Boltzmann machine learning).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although evaluation of the expectations on the Ising model is essential in
various applications, this is frequently infeasible because of intractable
multiple summations (or integrations). Spatial Monte Carlo integration (SMCI)
is a sampling-based approximation, and can provide high-accuracy estimations
for such intractable expectations. To evaluate the expectation of a function of
variables in a specific region (called the target region), SMCI considers a larger
region containing the target region (called the sum region). In SMCI, the multiple
summation over the variables in the sum region is executed exactly, and that
over the outer region is evaluated by a sampling approximation such as
standard Monte Carlo integration. It is guaranteed that the accuracy of the
SMCI estimator is monotonically improved as the size of the sum region
increases. However, a haphazard expansion of the sum region could cause a
combinatorial explosion. We therefore aim to improve the accuracy without
such region expansion. In this study, based on the theory of generalized least
squares, a new effective method is proposed by combining multiple SMCI
estimators. The validity of the proposed method is demonstrated theoretically
and numerically. The results indicate that the proposed method can be effective
in the inverse Ising problem (or Boltzmann machine learning).
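The SMCI construction and the GLS combination described in the abstract can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the authors' implementation: a 4-spin Ising chain at inverse temperature 1, uniform coupling `J` and field `h`, a first-order sum region consisting of a single spin, and all function names.

```python
# Illustrative sketch only: first-order SMCI on a tiny Ising chain, plus a
# GLS combination of two estimators; parameters and names are assumptions.
import itertools
import math
import random

N, J, h = 4, 0.5, 0.1

def local_field(s, i):
    """Effective field on spin i given the other spins."""
    f = h
    if i > 0:
        f += J * s[i - 1]
    if i < N - 1:
        f += J * s[i + 1]
    return f

def gibbs_samples(n_sweeps, rng):
    """Draw spin configurations with single-site Gibbs updates."""
    s = [rng.choice((-1, 1)) for _ in range(N)]
    out = []
    for _ in range(n_sweeps):
        for i in range(N):
            p_up = 1.0 / (1.0 + math.exp(-2.0 * local_field(s, i)))
            s[i] = 1 if rng.random() < p_up else -1
        out.append(list(s))
    return out

rng = random.Random(0)
samples = gibbs_samples(5000, rng)
target = 1
M = len(samples)

# Two unbiased per-sample statistics for <s_target>:
# (a) standard MCI: the raw sampled spin;
# (b) first-order SMCI: the exact conditional expectation tanh(local field),
#     i.e. the summation over the sum region {target} is carried out exactly.
mci = [float(s[target]) for s in samples]
smci = [math.tanh(local_field(s, target)) for s in samples]
x = [sum(mci) / M, sum(smci) / M]

# GLS combination: weights w = C^{-1} 1 / (1' C^{-1} 1), with C the 2x2
# sample covariance of the two statistics (common scale factors cancel).
def cov(a, b):
    ma, mb = sum(a) / M, sum(b) / M
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (M - 1)

c11, c22, c12 = cov(mci, mci), cov(smci, smci), cov(mci, smci)
det = c11 * c22 - c12 * c12
w1, w2 = (c22 - c12) / det, (c11 - c12) / det
combined = (w1 * x[0] + w2 * x[1]) / (w1 + w2)

# Exact value by brute-force summation (feasible only for tiny N).
def energy(s):
    return -h * sum(s) - J * sum(s[k] * s[k + 1] for k in range(N - 1))

states = list(itertools.product((-1, 1), repeat=N))
Z = sum(math.exp(-energy(s)) for s in states)
exact = sum(s[target] * math.exp(-energy(s)) for s in states) / Z
```

The SMCI statistic replaces the raw spin with its exact conditional expectation given the sampled neighbours, which only reduces variance; the GLS weights then down-weight the noisier of the two estimators.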
Related papers
- Mean field initialization of the Annealed Importance Sampling algorithm for an efficient evaluation of the Partition Function of Restricted Boltzmann Machines [0.0]
Annealed Importance Sampling (AIS) is a tool to estimate the partition function of a system.
We show that both the quality of the estimation and the cost of the computation can be significantly improved by using a properly selected mean-field starting probability distribution.
We conclude that these are good starting points to estimate the partition function with AIS with a relatively low computational cost.
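The AIS procedure summarized above can be sketched on a toy one-dimensional example where the true partition function is known. The annealing schedule, proposal step size, and all names below are illustrative assumptions, not the paper's RBM setting.

```python
# Toy AIS sketch: estimate the partition function of the unnormalized target
# exp(-x^2 / (2 * SIGMA^2)) by annealing from N(0, 1), whose partition
# function Z0 is known analytically.
import math
import random

SIGMA = 2.0
Z0 = math.sqrt(2.0 * math.pi)               # for the base density exp(-x^2 / 2)
Z1_true = SIGMA * math.sqrt(2.0 * math.pi)  # known answer for the toy target

def log_f0(x):
    return -0.5 * x * x

def log_f1(x):
    return -0.5 * x * x / (SIGMA * SIGMA)

def ais_run(n_temps, rng):
    """One AIS chain from f0 to f1; returns the log importance weight."""
    x = rng.gauss(0.0, 1.0)                 # exact sample from the base
    betas = [k / n_temps for k in range(n_temps + 1)]
    logw = 0.0
    for b_prev, b in zip(betas, betas[1:]):
        logw += (b - b_prev) * (log_f1(x) - log_f0(x))

        def logp(y):                        # intermediate f0^(1-b) * f1^b
            return (1.0 - b) * log_f0(y) + b * log_f1(y)

        prop = x + rng.gauss(0.0, 1.0)      # one Metropolis step at beta = b
        delta = logp(prop) - logp(x)
        if delta >= 0.0 or rng.random() < math.exp(delta):
            x = prop
    return logw

rng = random.Random(0)
weights = [math.exp(ais_run(100, rng)) for _ in range(1000)]
# E[w] = Z1 / Z0, so Z0 * mean(w) is an unbiased estimate of Z1.
Z_est = Z0 * sum(weights) / len(weights)
```

A mean-field starting distribution, as proposed in the paper, would replace the `N(0, 1)` base here with a distribution closer to the target, reducing the variance of the weights.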
arXiv Detail & Related papers (2024-04-17T10:22:03Z)
- Less is More: Fewer Interpretable Region via Submodular Subset Selection [54.07758302264416]
This paper re-models the above image attribution problem as a submodular subset selection problem.
We construct a novel submodular function to discover more accurate small interpretation regions.
For correctly predicted samples, the proposed method improves the Deletion and Insertion scores with an average of 4.9% and 2.5% gain relative to HSIC-Attribution.
arXiv Detail & Related papers (2024-02-14T13:30:02Z)
- Efficient Numerical Integration in Reproducing Kernel Hilbert Spaces via Leverage Scores Sampling [16.992480926905067]
We consider the problem of approximating integrals with respect to a target probability measure using only pointwise evaluations of the integrand.
We propose an efficient procedure which exploits a small i.i.d. random subset of $m \ll n$ samples drawn either uniformly or using approximate leverage scores from the initial observations.
arXiv Detail & Related papers (2023-11-22T17:44:18Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Spatial Monte Carlo Integration with Annealed Importance Sampling [0.45687771576879593]
A new method is proposed to evaluate expectations on Ising models by combining AIS and SMCI.
The proposed method performs efficiently in both high- and low-temperature regions.
arXiv Detail & Related papers (2020-12-21T09:26:40Z)
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
- A Generalization of Spatial Monte Carlo Integration [0.0]
Spatial Monte Carlo integration (SMCI) is an extension of standard Monte Carlo integration and can approximate expectations on Markov random fields with high accuracy.
A new Boltzmann machine learning method based on SMCI is proposed, obtained by combining SMCI with persistent contrastive divergence.
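The persistent-contrastive-divergence ingredient mentioned here can be sketched for a tiny fully visible Boltzmann machine. The data set, learning rate, and all names below are illustrative assumptions; the paper's contribution of replacing the chain statistics with SMCI estimates is not reproduced here.

```python
# Illustrative PCD sketch for a fully visible Boltzmann machine over
# {-1, +1} spins; parameters and data are toy assumptions.
import math
import random

N = 3                                   # number of visible spins

def gibbs_sweep(s, W, b, rng):
    """One sequential Gibbs sweep over all spins."""
    for i in range(N):
        field = b[i] + sum(W[i][j] * s[j] for j in range(N) if j != i)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * field))
        s[i] = 1 if rng.random() < p_up else -1
    return s

rng = random.Random(0)
# Toy data: spins 0 and 1 are perfectly aligned, spin 2 is independent.
data = [[1, 1, -1], [1, 1, 1], [-1, -1, 1], [-1, -1, -1]] * 25

# Data statistics are fixed, so they are computed once.
d_mean = [sum(v[i] for v in data) / len(data) for i in range(N)]
d_corr = [[sum(v[i] * v[j] for v in data) / len(data) for j in range(N)]
          for i in range(N)]

W = [[0.0] * N for _ in range(N)]       # symmetric couplings, zero diagonal
b = [0.0] * N
chain = [rng.choice((-1, 1)) for _ in range(N)]  # persistent fantasy state
lr = 0.05

for step in range(300):
    # Model statistics from the persistent chain (one sweep per update);
    # the likelihood gradient is data statistics minus model statistics.
    chain = gibbs_sweep(chain, W, b, rng)
    for i in range(N):
        b[i] += lr * (d_mean[i] - chain[i])
        for j in range(i + 1, N):
            g = lr * (d_corr[i][j] - chain[i] * chain[j])
            W[i][j] += g
            W[j][i] += g
```

After training, the learned coupling between the two aligned spins should be clearly positive, while the couplings involving the independent spin stay small.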
arXiv Detail & Related papers (2020-09-04T13:02:58Z)
- Efficient Evaluation of the Partition Function of RBMs with Annealed Importance Sampling [0.30458514384586394]
The Annealed Importance Sampling (AIS) method provides a tool to estimate the partition function of a system.
We analyze the performance of AIS in both small- and large-sized problems, and show that in both cases a good estimation of Z can be obtained with little computational cost.
arXiv Detail & Related papers (2020-07-23T10:59:04Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.