On the Kullback-Leibler divergence between pairwise isotropic
Gaussian-Markov random fields
- URL: http://arxiv.org/abs/2203.13164v1
- Date: Thu, 24 Mar 2022 16:37:24 GMT
- Authors: Alexandre L. M. Levada
- Abstract summary: We derive expressions for the Kullback-Leibler divergence between two pairwise isotropic Gaussian-Markov random fields.
The proposed equation allows the development of novel similarity measures in image processing and machine learning applications.
- Score: 93.35534658875731
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Kullback-Leibler divergence, or relative entropy, is an
information-theoretic measure between statistical models that plays an
important role in quantifying a distance between random variables. In the study
of complex systems, random fields are mathematical structures that model the
interaction between these variables by means of an inverse temperature
parameter, which controls the spatial dependence structure along the field.
In this paper, we derive closed-form expressions for the Kullback-Leibler
divergence between two pairwise isotropic Gaussian-Markov random fields in both
univariate and multivariate cases. The proposed equation allows the development
of novel similarity measures in image processing and machine learning
applications, such as image denoising and unsupervised metric learning.
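The standard closed-form KL divergence between two multivariate Gaussian densities is the kind of expression the paper builds on. A minimal sketch of that well-known formula (not the paper's field-specific result, which additionally accounts for the inverse temperature parameter of the random field; the function name is illustrative):

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence D(N0 || N1) between two multivariate Gaussians.

    D = 0.5 * [ tr(S1^-1 S0) + (m1-m0)^T S1^-1 (m1-m0) - k + ln(det S1 / det S0) ]
    """
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    term_trace = np.trace(cov1_inv @ cov0)          # tr(S1^-1 S0)
    term_quad = diff @ cov1_inv @ diff              # Mahalanobis-type term
    _, logdet0 = np.linalg.slogdet(cov0)            # stable log-determinants
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (term_trace + term_quad - k + logdet1 - logdet0)
```

For identical Gaussians the divergence is zero, and for two unit-covariance Gaussians it reduces to half the squared distance between the means.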
Related papers
- Relaxation Fluctuations of Correlation Functions: Spin and Random Matrix Models [0.0]
We study the fluctuation average and variance of certain correlation functions as a diagnostic measure of quantum chaos.
We identify the three distinct phases of the models: the ergodic, the fractal, and the localized phases.
arXiv Detail & Related papers (2024-07-31T14:45:46Z)
- Kinetic Interacting Particle Langevin Monte Carlo [0.0]
This paper introduces and analyses interacting underdamped Langevin algorithms, for statistical inference in latent variable models.
We propose a diffusion process that evolves jointly in the space of parameters and latent variables.
We provide two explicit discretisations of this diffusion as practical algorithms to estimate parameters of statistical models.
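As a rough illustration of what an explicit discretisation of underdamped Langevin dynamics looks like, here is a hypothetical single-particle sketch (the paper's algorithms are interacting-particle schemes for latent variable models, which this toy version does not implement; all names and step sizes are illustrative):

```python
import numpy as np

def kinetic_langevin_sample(grad_u, x0, n_steps=50_000, h=0.05, gamma=1.0, seed=0):
    """Semi-implicit Euler discretisation of the underdamped (kinetic)
    Langevin diffusion targeting exp(-U(x)) in one dimension."""
    rng = np.random.default_rng(seed)
    x, v = float(x0), 0.0
    xs = np.empty(n_steps)
    for t in range(n_steps):
        # velocity update: friction, force, and Gaussian noise
        v += -gamma * v * h - grad_u(x) * h + np.sqrt(2.0 * gamma * h) * rng.standard_normal()
        # position update with the refreshed velocity
        x += v * h
        xs[t] = x
    return xs
```

With `grad_u = lambda x: x` (i.e. U(x) = x^2/2), the chain approximately samples a standard Gaussian after a burn-in period.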
arXiv Detail & Related papers (2024-07-08T09:52:46Z)
- Statistical Mechanics of Dynamical System Identification [3.1484174280822845]
We develop a statistical mechanical approach to analyze sparse equation discovery algorithms.
In this framework, statistical mechanics offers tools to analyze the interplay between complexity and fitness.
arXiv Detail & Related papers (2024-03-04T04:32:28Z)
- Applications of flow models to the generation of correlated lattice QCD ensembles [69.18453821764075]
Machine-learned normalizing flows can be used in the context of lattice quantum field theory to generate statistically correlated ensembles of lattice gauge fields at different action parameters.
This work demonstrates how these correlations can be exploited for variance reduction in the computation of observables.
arXiv Detail & Related papers (2024-01-19T18:33:52Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Equivariance Discovery by Learned Parameter-Sharing [153.41877129746223]
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes.
Also, we theoretically analyze the method for Gaussian data and provide a bound on the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z)
- Moment evolution equations and moment matching for stochastic image EPDiff [68.97335984455059]
Models of image deformation allow study of time-continuous effects transforming images by deforming the image domain.
Applications include medical image analysis with both population trends and random subject specific variation.
We use moment approximations of the corresponding Ito diffusion to construct estimators for statistical inference in the parameters of the full model.
arXiv Detail & Related papers (2021-10-07T11:08:11Z)
- Coarse-grained and emergent distributed parameter systems from data [0.6117371161379209]
We derive PDEs from computational system data.
In particular, we focus here on the use of manifold learning techniques.
We demonstrate each approach through an established PDE example.
arXiv Detail & Related papers (2020-11-16T18:02:01Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
- The entanglement membrane in chaotic many-body systems [0.0]
In certain analytically tractable quantum chaotic systems, the calculation of out-of-time-order correlation functions, entanglement entropies after a quench, and other related dynamical observables reduces to an effective theory of an "entanglement membrane" in spacetime.
We show here how to make sense of this membrane in more realistic models, which do not involve an average over random unitaries.
arXiv Detail & Related papers (2019-12-27T19:01:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.