Mutual Information Estimation via Normalizing Flows
- URL: http://arxiv.org/abs/2403.02187v3
- Date: Sat, 25 May 2024 09:37:21 GMT
- Title: Mutual Information Estimation via Normalizing Flows
- Authors: Ivan Butakov, Alexander Tolmachev, Sofia Malanchuk, Anna Neopryatnaya, Alexey Frolov
- Abstract summary: We propose a novel approach to the problem of mutual information estimation.
The estimator maps original data to the target distribution, for which MI is easier to estimate.
We additionally explore the target distributions with known closed-form expressions for MI.
- Score: 39.58317527488534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel approach to the problem of mutual information (MI) estimation via introducing a family of estimators based on normalizing flows. The estimator maps original data to the target distribution, for which MI is easier to estimate. We additionally explore the target distributions with known closed-form expressions for MI. Theoretical guarantees are provided to demonstrate that our approach yields MI estimates for the original data. Experiments with high-dimensional data are conducted to highlight the practical advantages of the proposed method.
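To make the idea concrete, here is a minimal sketch (an illustration under stated assumptions, not the authors' implementation): MI is invariant under invertible per-variable transforms, so mapping each coordinate toward a Gaussian and applying the closed-form Gaussian MI yields an estimate. A rank-based probability integral transform stands in for the learned normalizing flow, and the transformed joint is assumed approximately Gaussian; the paper instead trains expressive flows and proves guarantees.
```python
# Minimal sketch of flow-based MI estimation (not the authors' code).
# A rank-based Gaussianization of each coordinate stands in for a learned
# normalizing flow; MI is invariant under invertible per-variable maps.
# We then ASSUME the transformed joint is approximately Gaussian and apply
# the closed-form Gaussian MI.
import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(a):
    """Map each column to roughly N(0,1) via the probability integral transform."""
    ranks = np.apply_along_axis(rankdata, 0, a)
    return norm.ppf(ranks / (a.shape[0] + 1))

def gaussian_mi(x, y):
    """Closed-form MI of a jointly Gaussian pair: 0.5*log(det(Sxx)det(Syy)/det(S))."""
    s = np.cov(np.hstack([x, y]), rowvar=False)
    dx = x.shape[1]
    sxx, syy = s[:dx, :dx], s[dx:, dx:]
    return 0.5 * np.log(np.linalg.det(sxx) * np.linalg.det(syy) / np.linalg.det(s))

def flow_mi_estimate(x, y):
    return gaussian_mi(gaussianize(x), gaussianize(y))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(20000, 1))
    y = x + rng.normal(size=(20000, 1))           # true I(X;Y) = 0.5*log(2) ~ 0.347
    print(flow_mi_estimate(np.exp(x), y ** 3))    # invariant to the monotone maps
```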
Related papers
- Mutual Information Multinomial Estimation [53.58005108981247]
Estimating mutual information (MI) is a fundamental yet challenging task in data science and machine learning.
Our main discovery is that a preliminary estimate of the data distribution can dramatically help estimate MI.
Experiments on diverse tasks including non-Gaussian synthetic problems with known ground-truth and real-world applications demonstrate the advantages of our method.
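As a hedged sketch of the plug-in principle described above (not the paper's estimator), one can form a preliminary histogram estimate of the joint distribution and evaluate MI on it directly:
```python
# Plug-in ("multinomial") MI sketch: estimate the joint pmf by a histogram
# (a preliminary estimate of the data distribution), then evaluate MI on it.
# This only illustrates the plug-in principle, not the paper's estimator.
import numpy as np

def plugin_mi(x, y, bins=20):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)           # marginal of X
    py = pxy.sum(axis=0, keepdims=True)           # marginal of Y
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
x = rng.normal(size=50000)
print(plugin_mi(x, x + rng.normal(size=50000)))   # ~0.347 up to binning bias
```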
arXiv Detail & Related papers (2024-08-18T06:27:30Z)
- On the Properties and Estimation of Pointwise Mutual Information Profiles [49.877314063833296]
The pointwise mutual information profile, or simply profile, is the distribution of pointwise mutual information for a given pair of random variables.
We introduce a novel family of distributions, Bend and Mix Models, for which the profile can be accurately estimated using Monte Carlo methods.
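A hedged illustration for a case where everything is available in closed form (a bivariate Gaussian; the paper's Bend and Mix Models generalize well beyond this): sample the joint, evaluate the pointwise mutual information at each sample, and read the profile off the resulting values.
```python
# Monte Carlo sketch of the PMI profile for a bivariate Gaussian (an
# illustrative special case, not the paper's Bend and Mix Models).
# pmi(x, y) = log p(x, y) - log p(x) - log p(y); its distribution under
# (X, Y) ~ p(x, y) is the profile, and its mean is the MI.
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([0.0, 0.0], cov, size=100000)

pmi = (multivariate_normal.logpdf(xy, mean=[0.0, 0.0], cov=cov)
       - norm.logpdf(xy[:, 0]) - norm.logpdf(xy[:, 1]))
print(pmi.mean(), -0.5 * np.log(1 - rho ** 2))    # profile mean vs. true MI
```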
arXiv Detail & Related papers (2023-10-16T10:02:24Z)
- MINDE: Mutual Information Neural Diffusion Estimation [7.399561232927219]
We present a new method for the estimation of Mutual Information (MI) between random variables.
We use score-based diffusion models to estimate the Kullback-Leibler divergence between two densities as a difference between their score functions.
As a by-product, our method also enables the estimation of the entropy of random variables.
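Schematically, and as our paraphrase rather than the paper's exact statement, the Girsanov-type identity behind such score-based divergence estimators is:
```latex
% Hedged paraphrase: run the same forward diffusion, with diffusion
% coefficient g(t), on both densities p and q; then
\mathrm{KL}(p \,\|\, q)
  = \mathrm{KL}(p_T \,\|\, q_T)
  + \frac{1}{2}\int_0^T g(t)^2\,
    \mathbb{E}_{x\sim p_t}\!\left[\lVert \nabla_x\log p_t(x) - \nabla_x\log q_t(x)\rVert^2\right]\mathrm{d}t .
% Taking q to be the product of marginals turns this into an MI estimator:
I(X;Y) = \mathrm{KL}\!\left(p_{XY} \,\|\, p_X\,p_Y\right).
```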
arXiv Detail & Related papers (2023-10-13T11:47:41Z)
- Beyond Normal: On the Evaluation of Mutual Information Estimators [52.85079110699378]
We show how to construct a diverse family of distributions with known ground-truth mutual information.
We provide guidelines for practitioners on how to select an estimator appropriate to the difficulty of the problem considered.
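The construction leans on the fact that MI is invariant under invertible transforms applied separately to each variable; a hedged toy version of the idea (not the paper's full family of distributions):
```python
# Toy version of the benchmark construction (not the paper's full family):
# push a Gaussian pair with known MI through smooth invertible maps. The
# resulting pair is non-Gaussian, yet its ground-truth MI is unchanged.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.7
xy = rng.multivariate_normal([0.0, 0.0], [[1, rho], [rho, 1]], size=10000)
x, y = xy[:, 0], xy[:, 1]

x_new = np.sinh(x)                      # invertible, heavy-tailed marginal
y_new = y + y ** 3                      # strictly increasing, hence invertible
true_mi = -0.5 * np.log(1 - rho ** 2)   # still the ground truth for (x_new, y_new)
```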
arXiv Detail & Related papers (2023-06-19T17:26:34Z)
- Improving Mutual Information Estimation with Annealed and Energy-Based Bounds [20.940022170594816]
Mutual information (MI) is a fundamental quantity in information theory and machine learning.
We present a unifying view of existing MI bounds from the perspective of importance sampling.
We propose three novel bounds based on this approach.
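For context, here is one classical member of the family being unified, the Donsker-Varadhan lower bound I(X;Y) >= E_p[T] - log E_{p x p}[e^T], evaluated with the oracle-optimal critic (the true PMI) on a Gaussian pair. This is a hedged illustration only; the paper's three new bounds are not reproduced here.
```python
# Donsker-Varadhan lower bound with the oracle critic T = pmi (illustration
# of the family of bounds, not the paper's new bounds).
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 50000
xy = rng.multivariate_normal([0.0, 0.0], [[1, rho], [rho, 1]], size=n)
x, y = xy[:, 0], xy[:, 1]
y_shuf = rng.permutation(y)                       # samples from p(x)p(y)

def critic(a, b):                                 # true PMI of the Gaussian pair
    return (-0.5 * np.log(1 - rho ** 2)
            + (2 * rho * a * b - rho ** 2 * (a ** 2 + b ** 2)) / (2 * (1 - rho ** 2)))

dv = critic(x, y).mean() - np.log(np.exp(critic(x, y_shuf)).mean())
print(dv, -0.5 * np.log(1 - rho ** 2))            # bound is tight for this critic
```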
arXiv Detail & Related papers (2023-03-13T10:47:24Z)
- STEERING: Stein Information Directed Exploration for Model-Based Reinforcement Learning [111.75423966239092]
We propose an exploration incentive in terms of the integral probability metric (IPM) between a current estimate of the transition model and the unknown optimal one.
Based on the kernelized Stein discrepancy (KSD), we develop a novel algorithm, STEERING: STEin information dirEcted exploration for model-based Reinforcement LearnING.
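For reference, a generic, self-contained sketch of the kernelized Stein discrepancy with an RBF kernel, the quantity STEERING builds its exploration bonus on (none of the paper's RL machinery is shown; this is a standalone illustration):
```python
# Kernelized Stein discrepancy (KSD) with an RBF kernel: measures how far a
# sample sits from a density p using only the score grad log p. Standalone
# illustration; not the paper's model-based RL algorithm.
import numpy as np

def ksd_rbf(x, score, h=1.0):
    """V-statistic estimate of KSD^2 for sample x (n, d) against score(x) = grad log p(x)."""
    n, d = x.shape
    s = score(x)                                    # (n, d)
    diff = x[:, None, :] - x[None, :, :]            # (n, n, d): x_i - x_j
    sq = (diff ** 2).sum(-1)
    k = np.exp(-sq / (2 * h ** 2))                  # RBF kernel matrix
    t1 = (s @ s.T) * k                              # s_i . s_j k
    t2 = np.einsum('id,ijd->ij', s, diff) / h ** 2 * k     # s_i . grad_{x_j} k
    t3 = -np.einsum('jd,ijd->ij', s, diff) / h ** 2 * k    # s_j . grad_{x_i} k
    t4 = (d / h ** 2 - sq / h ** 4) * k             # trace of mixed second derivative
    return (t1 + t2 + t3 + t4).mean()

def std_normal_score(x):
    return -x                                       # score of N(0, I)

rng = np.random.default_rng(0)
print(ksd_rbf(rng.normal(size=(200, 2)), std_normal_score))             # ~ 0
print(ksd_rbf(rng.normal(2.0, 1.0, size=(200, 2)), std_normal_score))   # clearly > 0
```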
arXiv Detail & Related papers (2023-01-28T00:49:28Z)
- Mixture of von Mises-Fisher distribution with sparse prototypes [0.0]
Mixtures of von Mises-Fisher distributions can be used to cluster data on the unit hypersphere.
In this article, we propose to estimate a von Mises-Fisher mixture using an l1-penalized likelihood.
arXiv Detail & Related papers (2022-12-30T08:00:38Z)
- Learning Bias-Invariant Representation by Cross-Sample Mutual Information Minimization [77.8735802150511]
We propose a cross-sample adversarial debiasing (CSAD) method to remove the bias information misused by the target task.
The correlation measurement plays a critical role in adversarial debiasing and is conducted by a cross-sample neural mutual information estimator.
We conduct thorough experiments on publicly available datasets to validate the advantages of the proposed method over state-of-the-art approaches.
arXiv Detail & Related papers (2021-08-11T21:17:02Z)
- Meta-Learning Conjugate Priors for Few-Shot Bayesian Optimization [0.0]
We propose a novel approach that utilizes meta-learning to automate the estimation of informative conjugate prior distributions.
From this process, we generate priors that require only a few data points to estimate the shape parameters of the original data distribution.
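A hedged, textbook illustration of why conjugacy suits the few-shot regime (a Beta-Bernoulli update, not the paper's meta-learned priors): the posterior is available in closed form, so a handful of observations already updates the shape parameters.
```python
# Beta-Bernoulli conjugate update (textbook example, not the paper's method):
# prior Beta(alpha, beta); after k successes in n trials the posterior is
# Beta(alpha + k, beta + n - k) -- no fitting required.
alpha, beta = 2.0, 2.0                  # prior shape parameters
observations = [1, 0, 1, 1]             # few-shot data
k, n = sum(observations), len(observations)
alpha_post, beta_post = alpha + k, beta + n - k
posterior_mean = alpha_post / (alpha_post + beta_post)   # 5/8
```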
arXiv Detail & Related papers (2021-01-03T23:58:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.