Understanding the Behavior of Belief Propagation
- URL: http://arxiv.org/abs/2209.05464v1
- Date: Mon, 5 Sep 2022 14:24:52 GMT
- Title: Understanding the Behavior of Belief Propagation
- Authors: Christian Knoll
- Abstract summary: This thesis investigates how the model parameters influence the performance of belief propagation.
We are particularly interested in their influence on (i) the number of fixed points, (ii) the convergence properties, and (iii) the approximation quality.
- Score: 0.7614628596146599
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic graphical models are a powerful concept for modeling
high-dimensional distributions. Besides modeling distributions, probabilistic
graphical models also provide an elegant framework for performing statistical
inference; because of the high-dimensional nature, however, one must often use
approximate methods for this purpose. Belief propagation performs approximate
inference, is efficient, and looks back on a long success story. Yet, in most
cases, belief propagation lacks any performance and convergence guarantees.
Many realistic problems, however, are represented by graphical models with
loops, in which case belief propagation is guaranteed neither to provide
accurate estimates nor to converge at all. This thesis investigates how the model
parameters influence the performance of belief propagation. We are particularly
interested in their influence on (i) the number of fixed points, (ii) the
convergence properties, and (iii) the approximation quality.
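As a concrete illustration of the setting the abstract describes, the sketch below runs sum-product belief propagation on a tiny pairwise model with a single loop and compares its beliefs against brute-force exact marginals. The model, potentials, and function names are illustrative choices for this sketch, not taken from the thesis.

```python
import itertools

def loopy_bp(unaries, pairwise, edges, n_vars, max_iters=500, tol=1e-9):
    # Sum-product loopy belief propagation on a pairwise MRF with binary
    # variables. unaries[i] is [phi_i(0), phi_i(1)]; pairwise[(i, j)] is a
    # 2x2 table phi_ij(x_i, x_j) for each (i, j) in edges.
    msgs, neighbours = {}, {v: [] for v in range(n_vars)}
    for (i, j) in edges:
        msgs[(i, j)] = [0.5, 0.5]
        msgs[(j, i)] = [0.5, 0.5]
        neighbours[i].append(j)
        neighbours[j].append(i)

    def phi(i, j, xi, xj):  # edge potential, regardless of stored orientation
        if (i, j) in pairwise:
            return pairwise[(i, j)][xi][xj]
        return pairwise[(j, i)][xj][xi]

    converged = False
    for _ in range(max_iters):
        delta = 0.0
        for (i, j) in list(msgs):  # update message from variable i to j
            new = []
            for xj in (0, 1):
                total = 0.0
                for xi in (0, 1):
                    prod = unaries[i][xi] * phi(i, j, xi, xj)
                    for k in neighbours[i]:
                        if k != j:  # incoming messages, excluding the target j
                            prod *= msgs[(k, i)][xi]
                    total += prod
                new.append(total)
            z = sum(new)
            new = [v / z for v in new]
            delta = max(delta, max(abs(a - b) for a, b in zip(new, msgs[(i, j)])))
            msgs[(i, j)] = new
        if delta < tol:  # messages reached a fixed point
            converged = True
            break

    beliefs = []  # belief: unary potential times all incoming messages
    for i in range(n_vars):
        b = [unaries[i][x] for x in (0, 1)]
        for k in neighbours[i]:
            b = [b[x] * msgs[(k, i)][x] for x in (0, 1)]
        z = sum(b)
        beliefs.append([v / z for v in b])
    return beliefs, converged

def exact_marginals(unaries, pairwise, edges, n_vars):
    # Brute-force marginals for comparison (feasible only for tiny models).
    z, marg = 0.0, [[0.0, 0.0] for _ in range(n_vars)]
    for x in itertools.product((0, 1), repeat=n_vars):
        p = 1.0
        for i in range(n_vars):
            p *= unaries[i][x[i]]
        for (i, j) in edges:
            p *= pairwise[(i, j)][x[i]][x[j]]
        z += p
        for i in range(n_vars):
            marg[i][x[i]] += p
    return [[m / z for m in row] for row in marg]

# A single loop (triangle) with weakly attractive couplings: here BP
# converges, and its beliefs approximate, but do not equal, the exact
# marginals. Stronger couplings can break both properties.
edges = [(0, 1), (1, 2), (0, 2)]
unaries = [[1.0, 2.0], [1.0, 1.0], [1.0, 1.0]]
attract = [[2.0, 1.0], [1.0, 2.0]]
pairwise = {e: attract for e in edges}
beliefs, ok = loopy_bp(unaries, pairwise, edges, 3)
```

With the weak couplings chosen here the updates reach a fixed point; scaling up the diagonal of `attract` illustrates the abstract's point that the model parameters govern whether BP converges and how close its beliefs come to the true marginals.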
Related papers
- Robust Weighted Triangulation of Causal Effects Under Model Uncertainty [2.1793134762413433]
We develop a framework for causal effect triangulation that combines model testability methods with statistical inference methods. We provide a bound on the distance of the functional from the true causal effect along with conditions under which this distance can be taken to zero. Our framework formalizes robustness under causal pluralism without requiring agreement across models or commitment to a single specification.
arXiv Detail & Related papers (2026-03-01T14:09:34Z)
- Bias and Identifiability in the Bounded Confidence Model [4.660328753262075]
Bounded confidence models describe how a population can reach consensus, fragmentation, or polarization. Estimation of model parameters is a key aspect, and maximum likelihood estimation provides a principled way to tackle it. Our results show how the analysis of the likelihood function is a fruitful approach for better understanding the pitfalls and possibilities of estimating the parameters of opinion dynamics models.
arXiv Detail & Related papers (2025-06-13T13:04:29Z)
- Statistical Inference for Generative Model Comparison [6.653749938600871]
We propose a method to compare two generative models with statistical confidence based on an unbiased estimator of their relative performance gap. Theoretically, our estimator achieves parametric convergence rates and admits asymptotic normality, which enables valid inference.
arXiv Detail & Related papers (2025-01-31T05:31:05Z)
- Identifiable Latent Neural Causal Models [82.14087963690561]
Causal representation learning seeks to uncover latent, high-level causal representations from low-level observed data.
We determine the types of distribution shifts that do contribute to the identifiability of causal representations.
We translate our findings into a practical algorithm, allowing for the acquisition of reliable latent causal representations.
arXiv Detail & Related papers (2024-03-23T04:13:55Z)
- A performance characteristic curve for model evaluation: the application in information diffusion prediction [3.8711489380602804]
We propose a metric based on information entropy to quantify the randomness in diffusion data, then identify a scaling pattern between the randomness and the prediction accuracy of the model.
Data points in the patterns by different sequence lengths, system sizes, and randomness all collapse into a single curve, capturing a model's inherent capability of making correct predictions.
The validity of the curve is tested by three prediction models in the same family, reaching conclusions in line with existing studies.
arXiv Detail & Related papers (2023-09-18T07:32:57Z)
- Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results conducted on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z)
- Modeling Uncertain Feature Representation for Domain Generalization [49.129544670700525]
We show that our method consistently improves the network generalization ability on multiple vision tasks.
Our methods are simple yet effective and can be readily integrated into networks without additional trainable parameters or loss constraints.
arXiv Detail & Related papers (2023-01-16T14:25:02Z)
- BayesFlow can reliably detect Model Misspecification and Posterior Errors in Amortized Bayesian Inference [0.0]
We conceptualize the types of model misspecification arising in simulation-based inference and systematically investigate the performance of the BayesFlow framework under these misspecifications.
We propose an augmented optimization objective which imposes a probabilistic structure on the latent data space and utilize maximum mean discrepancy (MMD) to detect potentially catastrophic misspecifications.
arXiv Detail & Related papers (2021-12-16T13:25:27Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present the Enhanced Probabilistic Dense Correspondence Network, PDC-Net+, capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- PSD Representations for Effective Probability Models [117.35298398434628]
We show that a recently proposed class of positive semi-definite (PSD) models for non-negative functions is particularly suited to this end.
We characterize both approximation and generalization capabilities of PSD models, showing that they enjoy strong theoretical guarantees.
Our results open the way to applications of PSD models to density estimation, decision theory and inference.
arXiv Detail & Related papers (2021-06-30T15:13:39Z)
- The Evolution of Out-of-Distribution Robustness Throughout Fine-Tuning [25.85044477227461]
Models that are more accurate on the out-of-distribution data relative to this baseline exhibit "effective robustness".
We find that models pre-trained on larger datasets exhibit effective robustness during training that vanishes at convergence.
We discuss several strategies for scaling effective robustness to the high-accuracy regime to improve the out-of-distribution accuracy of state-of-the-art models.
arXiv Detail & Related papers (2021-06-30T06:21:42Z)
- Learning Accurate Dense Correspondences and When to Trust Them [161.76275845530964]
We aim to estimate a dense flow field relating two images, coupled with a robust pixel-wise confidence map.
We develop a flexible probabilistic approach that jointly learns the flow prediction and its uncertainty.
Our approach obtains state-of-the-art results on challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-01-05T18:54:11Z)
- Robust Bayesian Inference for Discrete Outcomes with the Total Variation Distance [5.139874302398955]
Models of discrete-valued outcomes are easily misspecified if the data exhibit zero-inflation, overdispersion or contamination.
Here, we introduce a robust discrepancy-based Bayesian approach using the Total Variation Distance (TVD).
We empirically demonstrate that our approach is robust and significantly improves predictive performance on a range of simulated and real world data.
arXiv Detail & Related papers (2020-10-26T09:53:06Z)
- Modeling Score Distributions and Continuous Covariates: A Bayesian Approach [8.772459063453285]
We develop a generative model of the match and non-match score distributions over continuous covariates.
We use mixture models to capture arbitrary distributions and local basis functions.
Three experiments demonstrate the accuracy and effectiveness of our approach.
arXiv Detail & Related papers (2020-09-21T02:41:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.