Hamiltonian Monte Carlo Particle Swarm Optimizer
- URL: http://arxiv.org/abs/2206.14134v1
- Date: Sun, 8 May 2022 04:47:34 GMT
- Title: Hamiltonian Monte Carlo Particle Swarm Optimizer
- Authors: Omatharv Bharat Vaidya (1), Rithvik Terence DSouza (1), Snehanshu Saha
  (1), Soma Dhavala (2), Swagatam Das (3), ((1) BITS Pilani K K Birla Goa
  Campus, (2) MLSquare Bangalore, (3) ISI Kolkata)
- Abstract summary: The Hamiltonian Monte Carlo Particle Swarm Optimizer (HMC-PSO) is an optimization algorithm that reaps the benefits of both Exponentially Averaged Momentum PSO and HMC sampling, coupling each particle's position and velocity with Hamiltonian dynamics.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce the Hamiltonian Monte Carlo Particle Swarm Optimizer (HMC-PSO),
an optimization algorithm that reaps the benefits of both Exponentially
Averaged Momentum PSO and HMC sampling. The coupling of the position and
velocity of each particle with Hamiltonian dynamics in the simulation allows
for extensive freedom for exploration and exploitation of the search space. It
also provides an excellent technique to explore highly non-convex functions
while ensuring efficient sampling. We extend the method to approximate error
gradients in closed form for Deep Neural Network (DNN) settings. We discuss
possible methods of coupling and compare its performance to that of
state-of-the-art optimizers on the Golomb's Ruler problem and Classification
tasks.
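The coupling described in the abstract can be illustrated with a minimal, hypothetical sketch: each particle carries a position and a velocity (momentum) that are evolved by leapfrog steps under Hamiltonian dynamics on the objective, followed by a PSO-style pull toward the swarm's best-known position. The step size, leapfrog count, coupling weight `c`, and finite-difference gradient below are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def grad(f, x, eps=1e-5):
    """Central finite-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def hmc_pso_step(f, positions, velocities, best, step=0.05, n_leap=5, c=0.1, rng=None):
    """One sweep over the swarm: leapfrog Hamiltonian dynamics on f,
    then a PSO-style social pull toward the best-known position."""
    if rng is None:
        rng = np.random.default_rng()
    for k in range(len(positions)):
        x, v = positions[k].copy(), velocities[k].copy()
        for _ in range(n_leap):              # leapfrog integration of (x, v)
            v -= 0.5 * step * grad(f, x)     # half-step on the momentum
            x += step * v                    # full-step on the position
            v -= 0.5 * step * grad(f, x)     # half-step on the momentum
        v += c * rng.random() * (best - x)   # social (swarm-best) attraction
        positions[k], velocities[k] = x, v
        if f(x) < f(best):                   # track the swarm-best position
            best = x.copy()
    return positions, velocities, best
```

On a smooth test function, repeated sweeps drive the swarm-best value down monotonically, since `best` is only replaced when a particle improves on it.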
Related papers
- Dynamic Anisotropic Smoothing for Noisy Derivative-Free Optimization [0.0]
We propose a novel algorithm that extends the methods of ball smoothing and Gaussian smoothing for noisy derivative-free optimization.
The algorithm dynamically adapts the shape of the smoothing kernel to approximate the Hessian of the objective function around a local optimum.
arXiv Detail & Related papers (2024-05-02T21:04:20Z)
- Improving sample efficiency of high dimensional Bayesian optimization with MCMC [7.241485121318798]
We propose a new method based on Markov Chain Monte Carlo to efficiently sample from an approximated posterior.
We show experimentally that both the Metropolis-Hastings and the Langevin Dynamics version of our algorithm outperform state-of-the-art methods in high-dimensional sequential optimization and reinforcement learning benchmarks.
arXiv Detail & Related papers (2024-01-05T05:56:42Z)
- Momentum Particle Maximum Likelihood [2.4561590439700076]
We propose an analogous dynamical-systems-inspired approach to minimizing the free energy functional.
By discretizing the system, we obtain a practical algorithm for Maximum likelihood estimation in latent variable models.
The algorithm outperforms existing particle methods in numerical experiments and compares favourably with other MLE algorithms.
arXiv Detail & Related papers (2023-12-12T14:53:18Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- A Proximal Algorithm for Sampling from Non-smooth Potentials [10.980294435643398]
We propose a novel MCMC algorithm for sampling from non-smooth potentials.
Our method is based on the proximal bundle method and an alternating sampling framework.
One key contribution of this work is a fast algorithm that realizes the restricted Gaussian oracle for any convex non-smooth potential.
arXiv Detail & Related papers (2021-10-09T15:26:07Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- A Swarm Variant for the Schrödinger Solver [0.0]
This paper introduces the application of Exponentially Averaged Momentum Particle Swarm Optimization (EM-PSO) as a derivative-free optimizer for Neural Networks.
It retains PSO's major advantages, such as search space exploration and greater robustness to local minima, compared to gradient-based optimizers such as Adam.
arXiv Detail & Related papers (2021-04-10T15:51:36Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Fast Gravitational Approach for Rigid Point Set Registration with Ordinary Differential Equations [79.71184760864507]
This article introduces a new physics-based method for rigid point set alignment called Fast Gravitational Approach (FGA).
In FGA, the source and target point sets are interpreted as rigid particle swarms with masses interacting in a globally multiply-linked manner while moving in a simulated gravitational force field.
We show that the new method class has characteristics not found in previous alignment methods.
arXiv Detail & Related papers (2020-09-28T15:05:39Z)
- Sequential Subspace Search for Functional Bayesian Optimization Incorporating Experimenter Intuition [63.011641517977644]
Our algorithm generates a sequence of finite-dimensional random subspaces of functional space spanned by a set of draws from the experimenter's Gaussian Process.
Standard Bayesian optimisation is applied on each subspace, and the best solution found is used as the starting point (origin) for the next subspace.
We test our algorithm in simulated and real-world experiments, namely blind function matching, finding the optimal precipitation-strengthening function for an aluminium alloy, and learning rate schedule optimisation for deep networks.
arXiv Detail & Related papers (2020-09-08T06:54:11Z)
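The sequential-subspace idea from the last entry above can be sketched as follows: repeatedly optimize over a low-dimensional random affine subspace anchored at the current best point, then re-anchor there. Plain random search stands in for the paper's Gaussian-process-based Bayesian optimisation; the subspace dimension, round count, and sample budget are illustrative assumptions.

```python
import numpy as np

def sequential_subspace_search(f, dim, n_rounds=10, sub_dim=2, n_samples=50, rng=None):
    """Minimise f over R^dim by sampling within successive random subspaces."""
    if rng is None:
        rng = np.random.default_rng(0)
    origin = np.zeros(dim)                               # current anchor point
    best_val = f(origin)
    for _ in range(n_rounds):
        basis = rng.standard_normal((sub_dim, dim))      # random subspace directions
        basis /= np.linalg.norm(basis, axis=1, keepdims=True)
        coeffs = rng.uniform(-1.0, 1.0, (n_samples, sub_dim))
        candidates = origin + coeffs @ basis             # points in the affine subspace
        vals = np.array([f(x) for x in candidates])
        i = int(vals.argmin())
        if vals[i] < best_val:                           # re-anchor at the best point found
            best_val, origin = float(vals[i]), candidates[i]
    return origin, best_val
```

Because the anchor only moves when a candidate improves on the incumbent, the returned value never exceeds the objective at the initial origin.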
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.