On the Foundations of Cycles in Bayesian Networks
- URL: http://arxiv.org/abs/2301.08608v1
- Date: Fri, 20 Jan 2023 14:40:17 GMT
- Title: On the Foundations of Cycles in Bayesian Networks
- Authors: Christel Baier, Clemens Dubslaff, Holger Hermanns, and Nikolai Käfer
- Abstract summary: We present a foundational study regarding semantics for cyclic BNs that are generic and conservatively extend the cycle-free setting.
First, we propose constraint-based semantics that specify requirements for full joint distributions over a BN to be consistent with the local conditional probabilities and independencies.
Second, two kinds of limit semantics that formalize infinite unfolding approaches are introduced and shown to be computable by a Markov chain construction.
- Score: 4.312746668772342
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian networks (BNs) are probabilistic graphical models widely used for
representing expert knowledge and reasoning under uncertainty. Traditionally,
they are based on directed acyclic graphs that capture dependencies between
random variables. However, directed cycles can naturally arise when
cross-dependencies between random variables exist, e.g., for modeling feedback
loops. Existing methods to deal with such cross-dependencies usually rely on
reductions to BNs without cycles. These approaches are difficult to generalize,
since their justifications are intermingled with additional knowledge about the
application context. In this paper, we present a foundational study regarding
semantics for cyclic BNs that are generic and conservatively extend the
cycle-free setting. First, we propose constraint-based semantics that specify
requirements for full joint distributions over a BN to be consistent with the
local conditional probabilities and independencies. Second, two kinds of limit
semantics that formalize infinite unfolding approaches are introduced and shown
to be computable by a Markov chain construction.
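To make the two notions above concrete on a toy example: the sketch below is a minimal illustration under assumptions that are not taken from the paper (a two-node cycle between binary variables X and Y, specific CPTs, and a Gibbs-style "resample X, then Y" unfolding order). The `consistent` function checks a candidate joint distribution against the local CPTs in the spirit of the constraint-based semantics, and `unfold_limit` iterates a small Markov chain over joint states as a rough stand-in for the infinite-unfolding limit semantics.

```python
# Minimal, self-contained sketch (not the paper's formal constructions):
# a two-node feedback loop X <-> Y with binary variables. The specific CPTs,
# the Gibbs-style "resample X, then Y" update order, and all names below are
# assumptions made only for illustration.
import numpy as np

# Local conditional probability tables:
#   p_x_given_y[y, x] = p(X = x | Y = y),   p_y_given_x[x, y] = p(Y = y | X = x)
p_x_given_y = np.array([[2 / 3, 1 / 3],
                        [0.25, 0.75]])
p_y_given_x = np.array([[0.8, 0.2],
                        [0.4, 0.6]])

def consistent(joint, tol=1e-6):
    """Constraint-style check: the candidate joint P(x, y) must reproduce both
    local CPTs wherever the conditioning event has positive probability.
    (Independence requirements are vacuous for this two-node loop.)"""
    p_y = joint.sum(axis=0)  # marginal P(Y = y)
    p_x = joint.sum(axis=1)  # marginal P(X = x)
    for y in range(2):
        if p_y[y] > tol and any(
                abs(joint[x, y] / p_y[y] - p_x_given_y[y, x]) > tol for x in range(2)):
            return False
    for x in range(2):
        if p_x[x] > tol and any(
                abs(joint[x, y] / p_x[x] - p_y_given_x[x, y]) > tol for y in range(2)):
            return False
    return True

def unfold_limit(n_steps=1000):
    """Rough stand-in for an infinite unfolding: build a Markov chain on joint
    states (x, y) that resamples X from p(X | Y) and then Y from p(Y | X), and
    iterate its transition matrix until the state distribution stabilises."""
    states = [(x, y) for x in range(2) for y in range(2)]
    T = np.zeros((4, 4))
    for i, (_, y) in enumerate(states):
        for j, (x2, y2) in enumerate(states):
            # one unfolding step: X' ~ p(. | y), then Y' ~ p(. | X')
            T[i, j] = p_x_given_y[y, x2] * p_y_given_x[x2, y2]
    dist = np.full(4, 0.25)  # arbitrary initial distribution
    for _ in range(n_steps):
        dist = dist @ T
    joint = np.zeros((2, 2))
    for (x, y), p in zip(states, dist):
        joint[x, y] = p
    return joint

if __name__ == "__main__":
    joint = unfold_limit()
    print("limit distribution P(x, y):\n", joint)
    print("consistent with local CPTs:", consistent(joint))
```

For the compatible CPTs chosen here, the chain's limit happens to satisfy the constraint-style check; with incompatible CPTs the constraint set can be empty while the chain still converges, so the two notions need not agree in general. The paper's formal definitions and its Markov chain construction should be consulted for the precise statements.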
Related papers
- Cyclic quantum causal modelling with a graph separation theorem [0.0]
We introduce a robust probability rule and a novel graph-separation property, p-separation, which we prove to be sound and complete for all such models.
Our approach maps cyclic causal models to acyclic ones with post-selection, leveraging the post-selected quantum teleportation protocol.
arXiv Detail & Related papers (2025-02-06T15:51:15Z)
- Structural Entropy Guided Probabilistic Coding [52.01765333755793]
We propose a novel structural entropy-guided probabilistic coding model, named SEPC.
We incorporate the relationship between latent variables into the optimization by proposing a structural entropy regularization loss.
Experimental results across 12 natural language understanding tasks, including both classification and regression tasks, demonstrate the superior performance of SEPC.
arXiv Detail & Related papers (2024-12-12T00:37:53Z)
- Nonparametric Partial Disentanglement via Mechanism Sparsity: Sparse Actions, Interventions and Sparse Temporal Dependencies [58.179981892921056]
This work introduces a novel principle for disentanglement we call mechanism sparsity regularization.
We propose a representation learning method that induces disentanglement by simultaneously learning the latent factors.
We show that the latent factors can be recovered by regularizing the learned causal graph to be sparse.
arXiv Detail & Related papers (2024-01-10T02:38:21Z)
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose in this paper to approximate the joint posterior over the structure and parameters of a Bayesian network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs; a generic toy illustration of such a posterior over DAGs is sketched after this list.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Recursive Bayesian Networks: Generalising and Unifying Probabilistic Context-Free Grammars and Dynamic Bayesian Networks [0.0]
Probabilistic context-free grammars (PCFGs) and dynamic Bayesian networks (DBNs) are widely used sequence models with complementary strengths and limitations.
We present Recursive Bayesian Networks (RBNs), which generalise and unify PCFGs and DBNs, combining their strengths and containing both as special cases.
arXiv Detail & Related papers (2021-11-02T19:21:15Z)
- NOTMAD: Estimating Bayesian Networks with Sample-Specific Structures and Parameters [70.55488722439239]
We present NOTMAD, which learns to mix archetypal networks according to sample context.
We demonstrate the utility of NOTMAD and sample-specific network inference through analysis and experiments, including patient-specific gene expression networks.
arXiv Detail & Related papers (2021-11-01T17:17:34Z)
- Implicit Generative Copulas [0.0]
We propose a flexible, yet conceptually simple alternative based on implicit generative neural networks.
Experiments on synthetic and real data from finance, physics, and image generation demonstrate the performance of this approach.
arXiv Detail & Related papers (2021-09-29T17:05:30Z)
- Staged trees and asymmetry-labeled DAGs [2.66269503676104]
We introduce a minimal Bayesian network representation of the staged tree, which can be used to read conditional independences in an intuitive way.
We also define a new labeled graph, termed asymmetry-labeled directed acyclic graph, whose edges are labeled to denote the type of dependence existing between any two random variables.
arXiv Detail & Related papers (2021-08-04T12:20:47Z)
- Continuous-Time Bayesian Networks with Clocks [33.774970857450086]
We introduce a set of node-wise clocks to construct a collection of graph-coupled semi-Markov chains.
We provide algorithms for parameter and structure inference, which make use of local dependencies.
arXiv Detail & Related papers (2020-07-01T09:33:39Z)
- GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)
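As generic context for the "Bayesian Structure Learning with Generative Flow Networks" entry above, which concerns approximating a posterior over DAGs: for very small problems that posterior can simply be enumerated. The sketch below is such a brute-force toy illustration and is not the DAG-GFlowNet method; the Beta(1,1) parameter priors, the uniform structure prior, the toy data, and all names are assumptions for illustration only.

```python
# Generic toy illustration (not DAG-GFlowNet): brute-force the exact Bayesian
# posterior over the three DAGs on two binary variables (no edge, X -> Y,
# Y -> X) under Beta(1,1) parameter priors and a uniform structure prior.
from math import exp, lgamma

def log_ml_binary(child, parent=None):
    """Log marginal likelihood of a binary node with a Beta(1,1) prior,
    split by the configurations of an optional single binary parent."""
    def log_beta_binom(values):
        n1 = sum(values)
        n0 = len(values) - n1
        # integral of the Bernoulli likelihood over Beta(1,1): n1! n0! / (n1 + n0 + 1)!
        return lgamma(n1 + 1) + lgamma(n0 + 1) - lgamma(n1 + n0 + 2)
    if parent is None:
        return log_beta_binom(child)
    return sum(log_beta_binom([c for c, p in zip(child, parent) if p == pv])
               for pv in (0, 1))

def dag_posterior(xs, ys):
    """Exact posterior over the three candidate DAGs for variables X and Y."""
    log_scores = {
        "no edge": log_ml_binary(xs) + log_ml_binary(ys),
        "X -> Y":  log_ml_binary(xs) + log_ml_binary(ys, xs),
        "Y -> X":  log_ml_binary(ys) + log_ml_binary(xs, ys),
    }
    m = max(log_scores.values())
    unnorm = {g: exp(s - m) for g, s in log_scores.items()}
    z = sum(unnorm.values())
    return {g: w / z for g, w in unnorm.items()}

if __name__ == "__main__":
    # toy data in which Y mostly copies X
    xs = [0, 0, 1, 1, 1, 0, 1, 0, 1, 1]
    ys = [0, 0, 1, 1, 1, 0, 1, 1, 1, 0]
    print(dag_posterior(xs, ys))
```

Since the number of DAGs grows super-exponentially with the number of variables, enumeration like this quickly becomes infeasible, which is why approximate samplers over graph structures are of interest.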
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.