Monte Carlo generation of localised particle trajectories
- URL: http://arxiv.org/abs/2304.10518v2
- Date: Thu, 5 Oct 2023 16:32:08 GMT
- Title: Monte Carlo generation of localised particle trajectories
- Authors: Ivan Ahumada and James P. Edwards
- Abstract summary: We introduce modifications to Monte Carlo simulations of the Feynman path integral that improve sampling of localised interactions.
The new algorithms generate trajectories in simple background potentials designed to concentrate them about the interaction region.
This improves statistical sampling of the system and overcomes a long-time "undersampling problem" caused by the spatial diffusion inherent in Brownian motion.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce modifications to Monte Carlo simulations of the Feynman path
integral that improve sampling of localised interactions. The new algorithms
generate trajectories in simple background potentials designed to concentrate
them about the interaction region, reminiscent of importance sampling. This
improves statistical sampling of the system and overcomes a long-time
"undersampling problem" caused by the spatial diffusion inherent in Brownian
motion. We prove the validity of our approach using previous analytic work on
the distribution of values of the Wilson line over path integral trajectories
and illustrate the improvements on some simple quantum mechanical systems.
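The setup the abstract modifies can be illustrated with a minimal worldline Monte Carlo sketch: generate free-particle trajectories as Brownian bridges and average the Wilson-line factor exp(-∫V dt) over them. The function names and the Gaussian-well potential below are illustrative assumptions, not the paper's actual code; in particular, this sketch shows the *unmodified* free diffusion whose undersampling of localised interactions the paper addresses.

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_bridge(T, n_steps, n_paths, rng):
    """Free-particle trajectories x(t) with x(0) = x(T) = 0, built by
    pinning standard Brownian motion (a Brownian bridge)."""
    dt = T / n_steps
    dW = rng.normal(scale=np.sqrt(dt), size=(n_paths, n_steps))
    W = np.cumsum(dW, axis=1)
    t = np.linspace(dt, T, n_steps)
    return W - (t / T) * W[:, -1:]   # subtract linear drift to pin the endpoint

def wilson_line_estimate(V, T=1.0, n_steps=200, n_paths=20_000, rng=rng):
    """Monte Carlo estimate of <exp(-integral_0^T V(x(t)) dt)> over bridges."""
    x = brownian_bridge(T, n_steps, n_paths, rng)
    dt = T / n_steps
    S = np.sum(V(x), axis=1) * dt        # discretised line integral of V
    return np.exp(-S).mean()

# A localised (Gaussian well) interaction: plain diffusion spreads paths
# as sqrt(T) away from the well, so few trajectories sample it at large T.
V = lambda x: -np.exp(-x**2)
est = wilson_line_estimate(V)
```

Since |V| ≤ 1 here and T = 1, each path's weight exp(-S) lies in (1, e], so the estimate is bounded; the variance problem the paper targets appears when T grows and most paths wander far from the well.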
Related papers
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- AdamMCMC: Combining Metropolis Adjusted Langevin with Momentum-based Optimization [0.0]
Uncertainty estimation is a key issue when considering the application of deep neural network methods in science and engineering.
We introduce a novel algorithm that quantifies uncertainty via Monte Carlo sampling from a tempered posterior distribution.
arXiv Detail & Related papers (2023-12-21T16:58:49Z)
- Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems [78.96969465641024]
We extend mean-field Langevin dynamics to minimax optimization over probability distributions for the first time with symmetric and provably convergent updates.
We also study time and particle discretization regimes and prove a new uniform-in-time propagation of chaos result.
arXiv Detail & Related papers (2023-12-02T13:01:29Z)
- Repelling Random Walks [42.75616308187867]
We present a novel quasi-Monte Carlo mechanism to improve graph-based sampling, coined repelling random walks.
We showcase the effectiveness of repelling random walks in a range of settings including estimation of graph kernels, the PageRank vector and graphlet concentrations.
To our knowledge, repelling random walks constitute the first rigorously studied quasi-Monte Carlo scheme correlating the directions of walkers on a graph, inviting new research in this exciting nascent domain.
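The idea in the blurb above can be sketched with a toy coupling: walkers that share a node draw their next neighbours without replacement from a shuffled neighbour list, so their moves are anti-correlated while each walker's marginal transition stays that of a simple random walk. This is an illustrative coupling in the spirit of repelling random walks, not the paper's exact mechanism; the graph and all names are assumptions.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

# Small undirected graph as an adjacency list (illustrative example).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}

def step_iid(positions, rng):
    """Baseline: independent walkers each pick a neighbour uniformly."""
    return [int(rng.choice(adj[v])) for v in positions]

def step_repelling(positions, rng):
    """Coupled walkers: those sharing a node cycle through a shuffled
    neighbour list, anti-correlating their moves; each walker's marginal
    next-step distribution remains uniform over the neighbours."""
    groups = defaultdict(list)
    for i, v in enumerate(positions):
        groups[v].append(i)
    new = list(positions)
    for v, idxs in groups.items():
        nbrs = adj[v]
        order = rng.permutation(len(nbrs))
        for k, i in enumerate(idxs):
            new[i] = nbrs[order[k % len(nbrs)]]
    return new

positions = [0] * 8                  # 8 coupled walkers started at node 0
visits = np.zeros(4)
for _ in range(2000):
    positions = step_repelling(positions, rng)
    for v in positions:
        visits[v] += 1
freq = visits / visits.sum()         # empirical visit frequencies
```

Because the marginal chain is unchanged, the visit frequencies still converge to the degree-proportional stationary distribution; the coupling only reduces the variance of ensemble estimates, which is the quasi-Monte Carlo flavour of the scheme.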
arXiv Detail & Related papers (2023-10-07T15:30:23Z)
- Transport meets Variational Inference: Controlled Monte Carlo Diffusions [5.5654189024307685]
We present a principled and systematic framework for sampling and generative modelling centred around divergences on path space.
Our work culminates in the development of the Controlled Monte Carlo Diffusion sampler (CMCD) for Bayesian computation.
arXiv Detail & Related papers (2023-07-03T14:28:36Z)
- Metropolis Monte Carlo sampling: convergence, localization transition and optimality [0.0]
We show that deviations from the target steady-state distribution can feature a localization transition.
We argue that the relaxation before and after the localization transition is limited by diffusion and rejection rates, respectively.
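The two relaxation regimes mentioned in this blurb are easy to see in a standard Metropolis sampler: with a small proposal jump, moves are almost always accepted but the chain relaxes slowly by diffusion, while with a large jump most moves are rejected. This is a textbook Metropolis sketch for a standard Gaussian target, not code from the paper; the jump sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def metropolis_gaussian(n_steps, jump, rng):
    """Metropolis sampling of a standard Gaussian with uniform proposals
    x -> x + U(-jump, jump).  Returns the chain and its acceptance rate."""
    log_p = lambda y: -0.5 * y * y       # log density up to a constant
    x, samples, accepted = 0.0, [], 0
    for _ in range(n_steps):
        prop = x + rng.uniform(-jump, jump)
        if np.log(rng.random()) < log_p(prop) - log_p(x):
            x, accepted = prop, accepted + 1
        samples.append(x)
    return np.array(samples), accepted / n_steps

# Small jumps: near-certain acceptance, diffusion-limited relaxation.
samples_small, acc_small = metropolis_gaussian(20_000, 0.1, rng)
# Large jumps: most proposals land in the tails and are rejected.
samples_large, acc_large = metropolis_gaussian(20_000, 20.0, rng)
```

Both chains target the same distribution; only the mixing mechanism differs, which is the diffusion-versus-rejection trade-off the paper analyses.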
arXiv Detail & Related papers (2022-07-21T14:06:04Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from Fokker-Planck equation.
Compared with existing schemes, Wasserstein gradient flow is a smoother and near-optimal numerical scheme to approximate real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.