Exact Asymptotics for Learning Tree-Structured Graphical Models with
Side Information: Noiseless and Noisy Samples
- URL: http://arxiv.org/abs/2005.04354v1
- Date: Sat, 9 May 2020 02:42:40 GMT
- Authors: Anshoo Tandon and Vincent Y. F. Tan and Shiyao Zhu
- Abstract summary: We derive the exact asymptotics of learning an Ising tree-structured graphical model from independently drawn samples.
We extend our results to the scenario in which the samples are observed in random noise.
Our theoretical results demonstrate keen agreement with experimental results for sample sizes as small as a few hundred.
- Score: 75.97774982432976
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Given side information that an Ising tree-structured graphical model is
homogeneous and has no external field, we derive the exact asymptotics of
learning its structure from independently drawn samples. Our results, which
leverage the use of probabilistic tools from the theory of strong large
deviations, refine the large deviation (error exponents) results of Tan,
Anandkumar, Tong, and Willsky [IEEE Trans. on Inform. Th., 57(3):1714--1735,
2011] and strictly improve those of Bresler and Karzand [Ann. Statist., 2020].
In addition, we extend our results to the scenario in which the samples are
observed in random noise. In this case, we show that they strictly improve on
the recent results of Nikolakakis, Kalogerias, and Sarwate [Proc. AISTATS,
1771--1782, 2019]. Our theoretical results demonstrate keen agreement with
experimental results for sample sizes as small as a few hundred.
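The structure learner whose error asymptotics the paper refines is the classical Chow-Liu procedure: compute pairwise empirical mutual information from the samples and return a maximum-weight spanning tree. A minimal sketch of that baseline for ±1-valued (Ising) samples follows; it is illustrative only, and the function names and the chain-model usage below are not from the paper.

```python
import numpy as np

def empirical_mi(x, y):
    """Empirical mutual information (nats) between two +/-1 valued sample vectors."""
    mi = 0.0
    for a in (-1, 1):
        for b in (-1, 1):
            p_ab = np.mean((x == a) & (y == b))
            p_a = np.mean(x == a)
            p_b = np.mean(y == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(samples):
    """Max-weight spanning tree over pairwise empirical MI (Kruskal + union-find).

    samples: (n, d) array with entries in {-1, +1}, one row per sample.
    Returns a list of (i, j) edges of the estimated tree.
    """
    n, d = samples.shape
    # Weight every candidate edge by its empirical mutual information.
    edges = [(empirical_mi(samples[:, i], samples[:, j]), i, j)
             for i in range(d) for j in range(i + 1, d)]
    edges.sort(reverse=True)          # heaviest edges first
    parent = list(range(d))           # union-find forest
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                  # keep edge only if it joins two components
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

For example, drawing samples from a three-node Markov chain (each child equal to its parent with probability 0.9) and running `chow_liu_tree` recovers the chain edges (0, 1) and (1, 2) rather than the weaker shortcut (0, 2), since mutual information decays along the chain.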
Related papers
- Evaluating the design space of diffusion-based generative models [21.483299796597404]
This article provides a first quantitative understanding of the whole generation process.
It conducts a non-asymptotic convergence analysis of denoising score matching under gradient descent.
A refined sampling error analysis for variance exploding models is also provided.
arXiv Detail & Related papers (2024-06-18T17:56:10Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Learning Causal Graphs via Monotone Triangular Transport Maps [1.6752182911522522]
We study the problem of causal structure learning from data using optimal transport (OT).
We provide an algorithm for causal discovery up to Markov Equivalence with no assumptions on the structural equations/noise distributions.
We provide experimental results to compare the proposed approach with the state of the art on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-26T13:24:17Z)
- Preserving Knowledge Invariance: Rethinking Robustness Evaluation of Open Information Extraction [50.62245481416744]
We present the first benchmark that simulates the evaluation of open information extraction models in the real world.
We design and annotate a large-scale testbed in which each example is a knowledge-invariant clique.
By further elaborating the robustness metric, a model is judged to be robust if its performance is consistently accurate across all cliques.
arXiv Detail & Related papers (2023-05-23T12:05:09Z)
- Decentralized Learning of Tree-Structured Gaussian Graphical Models from Noisy Data [1.52292571922932]
This paper studies the decentralized learning of tree-structured Gaussian graphical models (GGMs) from noisy data.
GGMs are widely used to model complex networks such as gene regulatory networks and social networks.
The proposed decentralized learning uses the Chow-Liu algorithm for estimating the tree-structured GGM.
arXiv Detail & Related papers (2021-09-22T10:41:18Z)
- SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples [75.32013242448151]
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise.
Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure.
We propose Symmetrized Geometric Averaging (SGA), a more statistically robust algorithm for partial tree recovery.
arXiv Detail & Related papers (2021-01-22T01:57:35Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Handling Missing Data in Decision Trees: A Probabilistic Approach [41.259097100704324]
We tackle the problem of handling missing data in decision trees by taking a probabilistic approach.
We use tractable density estimators to compute the "expected prediction" of our models.
At learning time, we fine-tune parameters of already learned trees by minimizing their "expected prediction loss".
arXiv Detail & Related papers (2020-06-29T19:54:54Z) - Denoising Diffusion Probabilistic Models [91.94962645056896]
We present high quality image synthesis results using diffusion probabilistic models.
Our best results are obtained by training on a weighted variational bound designed according to a novel connection between diffusion probabilistic models and denoising score matching with Langevin dynamics.
arXiv Detail & Related papers (2020-06-19T17:24:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.