Exploring Maximum Entropy Distributions with Evolutionary Algorithms
- URL: http://arxiv.org/abs/2002.01973v1
- Date: Wed, 5 Feb 2020 19:52:05 GMT
- Title: Exploring Maximum Entropy Distributions with Evolutionary Algorithms
- Authors: Raul Rojas
- Abstract summary: We show how to evolve numerically the maximum entropy probability distributions for a given set of constraints.
An evolutionary algorithm can obtain approximations to some well-known analytical results.
We explain why many of the distributions are symmetrical and continuous, but some are not.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper shows how to evolve numerically the maximum entropy probability
distributions for a given set of constraints, which is a variational calculus
problem. An evolutionary algorithm can obtain approximations to some well-known
analytical results, but is even more flexible and can find distributions for
which a closed formula cannot be readily stated. The numerical approach handles
distributions over finite intervals. We show that there are two ways of
conducting the procedure: by direct optimization of the Lagrangian of the
constrained problem, or by optimizing the entropy among the subset of
distributions which fulfill the constraints. An incremental evolutionary
strategy easily obtains the uniform, the exponential, the Gaussian, the
log-normal, the Laplace, among other distributions, once the constrained
problem is solved with either of the two methods. Solutions for mixed ("chimera")
distributions can also be found. We explain why many of the distributions are
symmetrical and continuous, but some are not.
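
For orientation, the underlying variational problem the abstract refers to can be stated in standard textbook notation (the paper's own symbols may differ). Its Lagrangian is the object optimized directly in the first of the two procedures, and its stationary points have the classical exponential-family form:

```latex
% Maximum entropy on a finite interval [a, b] with moment constraints:
\max_{p}\; H[p] = -\int_a^b p(x)\,\log p(x)\,dx
\quad\text{s.t.}\quad \int_a^b p(x)\,dx = 1,\qquad
\int_a^b f_i(x)\,p(x)\,dx = c_i,\quad i = 1,\dots,m.

% Lagrangian; stationary points take the exponential-family form:
L[p,\lambda] = H[p] + \lambda_0\Big(\int_a^b p\,dx - 1\Big)
  + \sum_{i=1}^m \lambda_i\Big(\int_a^b f_i\,p\,dx - c_i\Big),
\qquad p^\ast(x) \propto \exp\Big(\sum_{i=1}^m \lambda_i f_i(x)\Big).
```

The sketch below illustrates the second procedure in the simplest possible setting: a (1+1) evolution strategy evolves a discretized density on [0, 1] to maximize entropy subject to a fixed mean, with constraint satisfaction approximated by a quadratic penalty. The discretization, the (1+1)-ES, the penalty, and all parameter values are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Minimal sketch under stated assumptions (not the paper's code): evolve a
# histogram density on [0, 1] toward maximum entropy given E[X] = 0.3,
# i.e. "optimize entropy among the subset of distributions fulfilling the
# constraints", with the subset approximated by a penalty term.

N = 100                          # histogram bins on [0, 1]
x = (np.arange(N) + 0.5) / N     # bin midpoints
TARGET_MEAN = 0.3                # example moment constraint (assumed)
PENALTY = 1e4                    # constraint-violation weight (assumed)

def density(theta):
    """Softmax parameterization: any real vector maps to a valid density."""
    e = np.exp(theta - theta.max())
    return e / e.sum()

def fitness(p):
    # Discrete entropy; it differs from the differential entropy of the
    # underlying density only by the constant log N, so the maximizer on
    # the grid is unchanged.
    entropy = -np.sum(p * np.log(p))
    violation = (np.dot(p, x) - TARGET_MEAN) ** 2
    return entropy - PENALTY * violation

rng = np.random.default_rng(0)
theta = np.zeros(N)              # start at the uniform distribution
best = fitness(density(theta))
sigma = 0.05                     # mutation step size (assumed)

for _ in range(100_000):
    cand = theta + sigma * rng.standard_normal(N)
    f = fitness(density(cand))
    if f >= best:                # (1+1)-ES: keep the child if not worse
        theta, best = cand, f

p = density(theta)
print("evolved mean:", float(np.dot(p, x)))   # should be close to 0.3
```

With a mean constraint on a finite interval, the known maximizer is a truncated exponential, so the evolved histogram can be checked against a closed-form answer. Replacing the fixed penalty with lambda-weighted constraint terms evolved jointly with the density would correspond to the Lagrangian variant.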
Related papers
- Non-asymptotic Convergence of Discrete-time Diffusion Models: New Approach and Improved Rate [49.97755400231656]
We establish convergence guarantees for substantially larger classes of distributions under discrete-time (DT) diffusion processes.
We then specialize our results to a number of interesting classes of distributions with explicit parameter dependencies.
We propose a novel accelerated sampler and show that it improves the convergence rates of the corresponding regular sampler by orders of magnitude with respect to all system parameters.
arXiv Detail & Related papers (2024-02-21T16:11:47Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution of the particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
- Gaussian Process Regression for Maximum Entropy Distribution [0.0]
We investigate the suitability of Gaussian priors to approximate the Lagrange multipliers as a map from a given set of moments.
The performance of the devised data-driven Maximum-Entropy closure is studied for a couple of test cases.
arXiv Detail & Related papers (2023-08-11T14:26:29Z)
- Approximating a RUM from Distributions on k-Slates [88.32814292632675]
We give a polynomial-time algorithm that finds the RUM that best approximates the given distribution on average.
Our theoretical result can also be made practical: we obtain an algorithm that is effective and scales to real-world datasets.
arXiv Detail & Related papers (2023-05-22T17:43:34Z)
- Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation [13.349005662077403]
We develop a gradient-like proposal for any discrete distribution without requiring differentiability of the target.
Our method efficiently approximates the discrete likelihood ratio via Newton's series expansion.
We prove that our method has a guaranteed convergence rate with or without the Metropolis-Hastings step.
arXiv Detail & Related papers (2023-02-27T16:28:23Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms using both log-likelihood maximization and optimization of the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- Evidential Softmax for Sparse Multimodal Distributions in Deep Generative Models [38.26333732364642]
We present ev-softmax, a sparse normalization function that preserves the multimodality of probability distributions.
We evaluate our method on a variety of generative models, including variational autoencoders and auto-regressive architectures.
arXiv Detail & Related papers (2021-10-27T05:32:25Z)
- Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective functional satisfies a functional version of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) and a smoothness condition, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z)
- Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z)
- Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
Exploiting the structure of the problem, we obtain a convex-concave saddle-point reformulation.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.