Multilevel Generative Samplers for Investigating Critical Phenomena
- URL: http://arxiv.org/abs/2503.08918v2
- Date: Thu, 13 Mar 2025 14:13:52 GMT
- Title: Multilevel Generative Samplers for Investigating Critical Phenomena
- Authors: Ankur Singha, Elia Cellini, Kim A. Nicoli, Karl Jansen, Stefan Kühn, Shinichi Nakajima
- Abstract summary: Long-range correlations cause critical slowing down in Markov chain Monte Carlo. We propose a novel sampler specialized for near-critical systems. We show that the effective sample size of RiGCS is a few orders of magnitude higher than that of state-of-the-art generative model baselines.
- Score: 3.8160065878097797
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Investigating critical phenomena or phase transitions is of high interest in physics and chemistry, for which Monte Carlo (MC) simulations, a crucial tool for numerically analyzing macroscopic properties of given systems, are often hindered by an emerging divergence of correlation length -- known as scale invariance at criticality (SIC) in the renormalization group theory. SIC causes the system to behave the same at any length scale, from which many existing sampling methods suffer: long-range correlations cause critical slowing down in Markov chain Monte Carlo (MCMC), and require intractably large receptive fields for generative samplers. In this paper, we propose a Renormalization-informed Generative Critical Sampler (RiGCS) -- a novel sampler specialized for near-critical systems, where SIC is leveraged as an advantage rather than a nuisance. Specifically, RiGCS builds on MultiLevel Monte Carlo (MLMC) with Heat Bath (HB) algorithms, which perform ancestral sampling from low-resolution to high-resolution lattice configurations with site-wise-independent conditional HB sampling. Although MLMC-HB is highly efficient under exact SIC, it suffers from a low acceptance rate under slight SIC violation. Notably, SIC violation always occurs in finite-size systems, and may induce long-range and higher-order interactions in the renormalized distributions, which are not considered by independent HB samplers. RiGCS enhances MLMC-HB by replacing a part of the conditional HB sampler with generative models that capture those residual interactions and improve the sampling efficiency. Our experiments show that the effective sample size of RiGCS is a few orders of magnitude higher than state-of-the-art generative model baselines in sampling configurations for 128x128 two-dimensional Ising systems.
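As a concrete illustration of the multilevel ancestral-sampling backbone described in the abstract, here is a minimal sketch of MLMC-HB for the 2D Ising model: a coarse configuration is repeatedly upsampled, and the finer lattice is relaxed with site-wise heat-bath conditionals. The function names, the nearest-neighbour upsampling, and all parameters are illustrative assumptions, not the authors' implementation; RiGCS itself would further replace some of the conditional heat-bath stages with trained generative models that capture the residual interactions.

```python
import numpy as np

rng = np.random.default_rng(0)

def heat_bath_sweep(spins, beta):
    """One checkerboard heat-bath sweep for the 2D Ising model.
    Each site is resampled from its exact conditional given its four
    nearest neighbours (periodic boundaries)."""
    L = spins.shape[0]
    checker = np.add.outer(np.arange(L), np.arange(L)) % 2
    for parity in (0, 1):
        # local field: sum of the four neighbouring spins
        nbr = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
               np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nbr))  # P(s_i = +1 | nbrs)
        new = np.where(rng.random(spins.shape) < p_up, 1, -1)
        mask = checker == parity
        spins[mask] = new[mask]
    return spins

def multilevel_sample(L_target=128, beta=0.44, sweeps_per_level=5):
    """Ancestral sampling from coarse to fine lattices (MLMC-HB style):
    upsample the current configuration, then relax the new resolution
    with a few heat-bath sweeps, until the target size is reached."""
    spins = rng.choice([-1, 1], size=(4, 4))
    while spins.shape[0] < L_target:
        spins = np.kron(spins, np.ones((2, 2), dtype=int))  # 2x upsampling
        for _ in range(sweeps_per_level):
            spins = heat_bath_sweep(spins, beta)
    return spins

config = multilevel_sample()
print(config.shape, config.mean())
```

Near criticality (beta ≈ 0.44 here, close to the 2D Ising critical coupling), heat-bath sweeps applied only at the full resolution would decorrelate extremely slowly; the coarse-to-fine schedule is what sidesteps that slowing down.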
Related papers
- Parameter Expanded Stochastic Gradient Markov Chain Monte Carlo [32.46884330460211]
We propose a simple yet effective approach to enhance sample diversity in stochastic gradient Markov chain Monte Carlo (SGMCMC).
This approach produces a more diverse set of samples, allowing faster mixing within the same computational budget.
Our experiments on image classification tasks, including OOD robustness, diversity, loss surface analyses, and a comparative study with Hamiltonian Monte Carlo, demonstrate the superiority of the proposed approach.
arXiv Detail & Related papers (2025-03-02T02:42:50Z)
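For orientation, the baseline family this entry builds on is stochastic gradient MCMC; below is a minimal sketch of one SGLD update, the canonical SGMCMC step, assuming a generic log-posterior gradient oracle. The paper's parameter-expansion trick is not reproduced here.

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One stochastic gradient Langevin dynamics (SGLD) update.
    The injected Gaussian noise (variance = step_size) is what turns
    noisy gradient ascent into an approximate posterior sampler."""
    noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_post(theta) + noise

# Toy usage: sample a standard Gaussian posterior, grad log p(t) = -t.
rng = np.random.default_rng(0)
theta = np.zeros(2)
samples = []
for _ in range(5000):
    theta = sgld_step(theta, lambda t: -t, 1e-2, rng)
    samples.append(theta.copy())
print(np.std(samples, axis=0))  # ~1.0 in each coordinate
```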
- Scalable Equilibrium Sampling with Sequential Boltzmann Generators [60.00515282300297]
We extend the Boltzmann generator framework and introduce Sequential Boltzmann generators with two key improvements.
The first is a highly efficient non-equivariant Transformer-based normalizing flow operating directly on all-atom Cartesian coordinates.
We demonstrate the first equilibrium sampling in Cartesian coordinates of tri-, tetra-, and hexapeptides, which were so far intractable for prior Boltzmann generators.
arXiv Detail & Related papers (2025-02-25T18:59:13Z)
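The core Boltzmann-generator mechanism behind this line of work is to draw from a trained flow and correct toward the target with importance weights; the sketch below shows that reweighting step, and the effective-sample-size diagnostic, with a Gaussian stand-in for the flow and a hypothetical double-well target. All names are illustrative, not the paper's API.

```python
import numpy as np

rng = np.random.default_rng(1)

def boltzmann_reweight(sample_flow, log_q, log_p_target, n=10_000):
    """Draw from a generator q, then compute self-normalized importance
    weights w ~ p_target / q so that weighted averages are consistent
    estimates under the Boltzmann target."""
    x = sample_flow(n)
    log_w = log_p_target(x) - log_q(x)
    log_w -= log_w.max()                 # numerical stabilization
    w = np.exp(log_w)
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)           # effective sample size
    return x, w, ess

# Toy stand-ins: N(0,1) "flow", double-well Boltzmann factor as target.
sample_flow = lambda n: rng.normal(size=n)
log_q = lambda x: -0.5 * x**2            # unnormalized is fine here
log_p = lambda x: -(x**2 - 1.0)**2
x, w, ess = boltzmann_reweight(sample_flow, log_q, log_p)
print(f"ESS = {ess:.0f} of 10000")
```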
arXiv Detail & Related papers (2025-02-25T18:59:13Z) - Persistent Sampling: Enhancing the Efficiency of Sequential Monte Carlo [0.0]
Sequential Monte Carlo (SMC) samplers are powerful tools for Bayesian inference but suffer from high computational costs.
We introduce persistent sampling (PS), which retains and reuses particles from all prior SMC iterations.
arXiv Detail & Related papers (2024-07-30T10:34:40Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
We propose Iterated Denoising Energy Matching (iDEM), which alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2\times$ to $5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
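A heavily simplified, self-contained caricature of the (I)/(II) alternation follows: a toy two-well "model" is sampled, its samples are reweighted by the target's Boltzmann factor, and the model parameters are refit. The weighted moment update is an assumption standing in for iDEM's actual denoising energy-matching objective, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    """Toy double-well target energy (stand-in for a Boltzmann density)."""
    return (x ** 2 - 1.0) ** 2

# A two-parameter "model": symmetric mixture centred at +/- mu, width sigma.
mu, sigma = 0.2, 0.5

for _ in range(50):
    # (I) sample from the current model
    signs = rng.choice([-1.0, 1.0], size=512)
    x = signs * mu + sigma * rng.normal(size=512)
    # (II) toy stand-in for the matching step: reweight the samples by
    # their Boltzmann factor and refit the model parameters to them.
    w = np.exp(-energy(x))
    w /= w.sum()
    mu = np.sum(w * np.abs(x))
    sigma = np.sqrt(np.sum(w * (np.abs(x) - mu) ** 2))

print(f"fitted well locations: +/-{mu:.2f}")  # ~ +/-1, the well minima
```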
- Unbiasing time-dependent Variational Monte Carlo by projected quantum evolution [44.99833362998488]
We analyze the accuracy and sample complexity of variational Monte Carlo approaches to simulate quantum systems classically.
We prove that the most used scheme, the time-dependent Variational Monte Carlo (tVMC), is affected by a systematic statistical bias.
We show that a different scheme based on the solution of an optimization problem at each time step is free from such problems.
arXiv Detail & Related papers (2023-05-23T17:38:10Z)
- Hard-normal Example-aware Template Mutual Matching for Industrial Anomaly Detection [78.734927709231]
Anomaly detectors are widely used in industrial manufacturing to detect and localize unknown defects in query images.
These detectors are trained on anomaly-free samples and have successfully distinguished anomalies from most normal samples.
However, hard-normal examples are scattered and far apart from most normal samples, and thus they are often mistaken for anomalies by existing methods.
arXiv Detail & Related papers (2023-03-28T17:54:56Z)
- GANs and Closures: Micro-Macro Consistency in Multiscale Modeling [0.0]
We present an approach that couples physics-based simulations and biasing methods for sampling conditional distributions with Machine Learning-based conditional generative adversarial networks.
We show that this framework can improve sampling for multiscale SDE dynamical systems, and even shows promise for systems of increasing complexity.
arXiv Detail & Related papers (2022-08-23T03:45:39Z)
- Reconstructing the Universe with Variational self-Boosted Sampling [7.922637707393503]
Traditional algorithms such as Hamiltonian Monte Carlo (HMC) are computationally inefficient because they generate correlated samples.
Here we develop a hybrid scheme called variational self-boosted sampling (VBS) to mitigate the drawbacks of both algorithms.
VBS generates better-quality samples than simple VI approaches and reduces the correlation length in the sampling phase by a factor of 10-50 compared with using HMC alone.
arXiv Detail & Related papers (2022-06-28T21:30:32Z)
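A minimal sketch of the hybrid idea, under simplifying assumptions: a (here hand-specified) diagonal-Gaussian variational approximation supplies the starting point for a standard HMC chain, shortening burn-in. VBS's use of the variational map inside the sampler itself is not reproduced, and all names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def leapfrog(x, p, grad_U, eps, n_steps):
    """Standard leapfrog integrator used inside HMC."""
    p = p - 0.5 * eps * grad_U(x)
    for _ in range(n_steps - 1):
        x = x + eps * p
        p = p - eps * grad_U(x)
    x = x + eps * p
    p = p - 0.5 * eps * grad_U(x)
    return x, p

def hmc_from_vi(U, grad_U, vi_mean, vi_scale,
                n_samples=1000, eps=0.1, n_steps=20):
    """Initialize an HMC chain from a cheap variational approximation
    (a given diagonal Gaussian), then run standard HMC on U."""
    x = vi_mean + vi_scale * rng.normal(size=vi_mean.shape)  # VI draw
    out = []
    for _ in range(n_samples):
        p = rng.normal(size=x.shape)
        x_new, p_new = leapfrog(x, p, grad_U, eps, n_steps)
        # Metropolis correction on the full Hamiltonian
        log_acc = U(x) - U(x_new) + 0.5 * (p @ p) - 0.5 * (p_new @ p_new)
        if np.log(rng.random()) < log_acc:
            x = x_new
        out.append(x.copy())
    return np.array(out)

# Toy target: zero-mean Gaussian with mild correlation.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
prec = np.linalg.inv(cov)
U = lambda x: 0.5 * x @ prec @ x
grad_U = lambda x: prec @ x
samples = hmc_from_vi(U, grad_U, np.zeros(2), np.ones(2))
print(samples.mean(axis=0), np.cov(samples.T))
```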
- Stereographic Markov Chain Monte Carlo [2.9304381683255945]
High-dimensional distributions are notoriously difficult for off-the-shelf MCMC samplers.
We introduce a new class of MCMC samplers that map the original high-dimensional problem in Euclidean space onto a sphere.
In the best scenario, the proposed samplers can enjoy the "blessings of dimensionality": convergence is faster in higher dimensions.
arXiv Detail & Related papers (2022-05-24T14:43:23Z)
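The sphere map itself is easy to state; a minimal sketch of the stereographic projection between R^d and the unit sphere S^d in R^{d+1} follows. This shows only the reparametrization, not the samplers' dynamics or the Jacobian correction they require.

```python
import numpy as np

def to_sphere(z):
    """Inverse stereographic projection R^d -> S^d (unit sphere in
    R^{d+1}): distant tails in R^d land near the 'north pole'."""
    r2 = np.sum(z ** 2)
    return np.concatenate([2.0 * z, [r2 - 1.0]]) / (r2 + 1.0)

def to_plane(x):
    """Stereographic projection S^d -> R^d (inverse of to_sphere)."""
    return x[:-1] / (1.0 - x[-1])

z = np.array([3.0, -4.0])
x = to_sphere(z)
print(np.linalg.norm(x), np.allclose(to_plane(x), z))  # 1.0 True
```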
- What Are Bayesian Neural Network Posteriors Really Like? [63.950151520585024]
We show that Hamiltonian Monte Carlo can achieve significant performance gains over standard training and deep ensembles.
We also show that deep ensembles are about as close to HMC as standard SGLD, and closer than standard variational inference.
arXiv Detail & Related papers (2021-04-29T15:38:46Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Targeted stochastic gradient Markov chain Monte Carlo for hidden Markov models with rare latent states [48.705095800341944]
Markov chain Monte Carlo (MCMC) algorithms for hidden Markov models often rely on the forward-backward sampler.
This makes them computationally slow as the length of the time series increases, motivating the development of sub-sampling-based approaches.
We propose a targeted sub-sampling approach that over-samples observations corresponding to rare latent states when calculating the gradient of parameters associated with them.
arXiv Detail & Related papers (2018-10-31T17:44:20Z)
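The over-sample-and-reweight idea can be sketched generically: include each time step in the mini-batch with a non-uniform probability (high for steps tied to rare latent states) and divide its gradient contribution by that probability, a Horvitz-Thompson correction that keeps the estimator unbiased. All names and probabilities below are illustrative assumptions, not the paper's estimator for HMM parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def targeted_subsample_grad(grad_terms, incl_prob):
    """Unbiased subsampled gradient with non-uniform inclusion:
    time step t enters the mini-batch independently with probability
    incl_prob[t], and its per-step gradient term is reweighted by
    1 / incl_prob[t] (Horvitz-Thompson estimator of the full sum)."""
    keep = rng.random(len(incl_prob)) < incl_prob
    return np.sum(grad_terms[keep] / incl_prob[keep, None], axis=0)

# Toy check of unbiasedness: 1000 per-timestep gradient terms in R^3;
# ~5% of steps ("rare-state" observations) are always included.
g = rng.normal(size=(1000, 3))
p = np.full(1000, 0.1)
p[rng.random(1000) < 0.05] = 1.0
est = np.mean([targeted_subsample_grad(g, p) for _ in range(5000)], axis=0)
print(est - g.sum(axis=0))  # near zero, up to Monte Carlo noise
```

The variance of this estimator depends strongly on how well the inclusion probabilities track the informative (rare-state) observations, which is the tuning problem the paper addresses.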