A Discrete Perspective Towards the Construction of Sparse Probabilistic Boolean Networks
- URL: http://arxiv.org/abs/2407.11543v1
- Date: Tue, 16 Jul 2024 09:50:04 GMT
- Title: A Discrete Perspective Towards the Construction of Sparse Probabilistic Boolean Networks
- Authors: Christopher H. Fok, Chi-Wing Wong, Wai-Ki Ching
- Abstract summary: We propose a novel Greedy Entry Removal (GER) algorithm for constructing sparse PBNs.
GER gives the best performance among state-of-the-art sparse PBN construction algorithms.
- Score: 3.807361298718093
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Boolean Networks (BNs) and their extension, Probabilistic Boolean Networks (PBNs), are popular mathematical models for studying genetic regulatory networks. BNs and PBNs are also applied to model manufacturing systems, financial risk and healthcare service systems. In this paper, we propose a novel Greedy Entry Removal (GER) algorithm for constructing sparse PBNs. We derive theoretical upper bounds for both existing algorithms and the GER algorithm. Furthermore, we are the first to study the lower bound problem of the construction of sparse PBNs, and to derive a series of related theoretical results. In our numerical experiments based on both synthetic and practical data, GER gives the best performance among state-of-the-art sparse PBN construction algorithms and outputs the sparsest possible decompositions on most of the transition probability matrices tested.
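To make the construction problem concrete: under the column-stochastic convention common in this line of work, a PBN's transition probability matrix $P$ decomposes as a convex combination $P = \sum_j c_j A_j$ of deterministic BN matrices $A_j$ (0/1 matrices with exactly one 1 per column), and sparsity means using as few terms as possible. The Python sketch below is a minimal greedy peeling illustration of such a decomposition, assuming column-stochastic input; it is not the GER algorithm, which selects the entries to remove far more carefully to reduce the number of terms.

```python
import numpy as np

def greedy_pbn_decomposition(P, tol=1e-12):
    """Peel a column-stochastic matrix P into a convex combination of
    Boolean-network matrices (0/1 matrices with exactly one 1 per column).

    Naive greedy illustration only -- NOT the paper's GER algorithm,
    which removes entries far more carefully to minimise the number of
    BN terms in the decomposition.
    """
    R = np.array(P, dtype=float)        # residual mass still to be assigned
    n = R.shape[1]
    cols = np.arange(n)
    weights, bns = [], []
    while R.max() > tol:
        rows = R.argmax(axis=0)         # per column, take the heaviest entry
        c = R[rows, cols].min()         # weight limited by the smallest pick
        A = np.zeros_like(R)
        A[rows, cols] = 1.0             # deterministic BN transition matrix
        weights.append(c)
        bns.append(A)
        R[rows, cols] -= c              # subtract this BN's contribution
    return np.array(weights), bns

# Tiny 2-state example with a column-stochastic P.
P = np.array([[0.7, 0.4],
              [0.3, 0.6]])
w, As = greedy_pbn_decomposition(P)
assert np.allclose(sum(c * A for c, A in zip(w, As)), P)
print(len(As), "BN matrices, weights:", w)
```

On this example the greedy peel uses three BN matrices; minimising that count over all valid decompositions is exactly the sparsity objective the paper studies.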
Related papers
- Improving Tree Probability Estimation with Stochastic Optimization and Variance Reduction [11.417249588622926]
Subsplit Bayesian networks (SBNs) provide a powerful probabilistic graphical model for tree probability estimation.
The expectation-maximization (EM) method currently used for learning SBN parameters does not scale up to large data sets.
We introduce several computationally efficient methods for training SBNs and show that variance reduction could be the key for better performance.
arXiv Detail & Related papers (2024-09-09T02:22:52Z) - Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation [0.0]
We show how to compute Shannon's entropy and the Kullback-Leibler divergence for BNs under their most common distributional assumptions.
We show it is possible to reduce the computational complexity of KL from cubic to quadratic for Gaussian BNs.
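For context, the cubic cost for Gaussian BNs comes from the determinant and matrix-inverse terms in the closed-form KL divergence between two $k$-dimensional Gaussians, a textbook identity (not specific to this paper):

```latex
\mathrm{KL}\bigl(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)\bigr)
  = \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_1^{-1}\Sigma_0\bigr)
  + (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0)
  - k + \ln\tfrac{\det\Sigma_1}{\det\Sigma_0}\Bigr]
```

Evaluating this naively costs $O(k^3)$ because of the inversion and determinants; the paper shows the cost can be brought down to quadratic for Gaussian BNs.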
arXiv Detail & Related papers (2023-11-29T15:51:04Z) - Testing Sparsity Assumptions in Bayesian Networks [0.0]
This paper provides the properties of, and a debiasing procedure for, a hypothesis test that may be used to determine if the BN has max in-degree greater than 1.
A linear BN structure discovery workflow is suggested in which the investigator uses this hypothesis test to aid in selecting an appropriate structure discovery algorithm.
arXiv Detail & Related papers (2023-07-12T18:46:22Z) - Robust Training and Verification of Implicit Neural Networks: A Non-Euclidean Contractive Approach [64.23331120621118]
This paper proposes a theoretical and computational framework for training and robustness verification of implicit neural networks.
We introduce a related embedded network and show that the embedded network can be used to provide an $\ell_\infty$-norm box over-approximation of the reachable sets of the original network.
We apply our algorithms to train implicit neural networks on the MNIST dataset and compare the robustness of our models with the models trained via existing approaches in the literature.
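As a point of reference, the simplest way to obtain such a box is plain interval bound propagation through each layer. The Python sketch below shows this generic (and typically much looser) technique; it is not the paper's embedded-network construction, which rests on non-Euclidean contraction theory.

```python
import numpy as np

def relu_layer_box(lo, hi, W, b):
    """Propagate an l_inf box [lo, hi] through x -> relu(W @ x + b).

    Plain interval bound propagation: a generic, much looser
    over-approximation than the embedded-network bounds in the paper.
    """
    Wp, Wn = np.maximum(W, 0), np.minimum(W, 0)
    out_lo = Wp @ lo + Wn @ hi + b   # minimise W @ x over the input box
    out_hi = Wp @ hi + Wn @ lo + b   # maximise W @ x over the input box
    return np.maximum(out_lo, 0), np.maximum(out_hi, 0)

# Every input in the box maps into the returned output box.
W = np.array([[1.0, -2.0], [0.5, 1.0]])
b = np.array([0.0, -1.0])
lo, hi = relu_layer_box(np.array([-0.1, -0.1]), np.array([0.1, 0.1]), W, b)
print(lo, hi)
```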
arXiv Detail & Related papers (2022-08-08T03:13:24Z) - Using the Projected Belief Network at High Dimensions [13.554038901140949]
The projected belief network (PBN) is a layered generative network (LGN) with a tractable likelihood function.
We apply the discriminatively aligned PBN to classifying and auto-encoding high-dimensional spectrograms of acoustic events.
arXiv Detail & Related papers (2022-04-25T19:54:52Z) - BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z) - Improved Acyclicity Reasoning for Bayesian Network Structure Learning with Constraint Programming [0.0]
Learning the structure of a Bayesian network from discrete data (the BNSL problem) is known to be NP-hard.
In this work, we propose a new polynomial-time algorithm for discovering a subset of all possible cluster cuts.
We show that, despite being suboptimal, these cuts improve performance by orders of magnitude.
The resulting solver also compares favourably with GOBNILP, a state-of-the-art solver for the BNSL problem.
arXiv Detail & Related papers (2021-06-23T09:46:11Z) - Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of a stochastic optimization algorithm can be bounded in terms of the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z) - Improved Branch and Bound for Neural Network Verification via Lagrangian Decomposition [161.09660864941603]
We improve the scalability of Branch and Bound (BaB) algorithms for formally proving input-output properties of neural networks.
We present a novel activation-based branching strategy and a BaB framework, named Branch and Dual Network Bound (BaDNB).
BaDNB outperforms previous complete verification systems by a large margin, cutting average verification times by factors up to 50 on adversarial properties.
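To show the shape of such a procedure, here is a toy branch-and-bound verifier for a one-dimensional property using crude interval bounds, in Python. It is only the generic BaB skeleton; BaDNB's contributions, the Lagrangian-decomposition dual bounds and the activation-based branching rule, are exactly the parts this sketch replaces with naive choices.

```python
def f_bounds(l, u):
    """Crude interval bounds of f(x) = x**2 - 2*x + 2 on [l, u]."""
    sq_lo = 0.0 if l <= 0.0 <= u else min(l * l, u * u)
    sq_hi = max(l * l, u * u)
    return sq_lo - 2 * u + 2, sq_hi - 2 * l + 2  # (lower, upper) for f

def verify_positive(l, u, eps=1e-3):
    """Prove f(x) > 0 for all x in [l, u] by branch and bound."""
    stack = [(l, u)]
    while stack:
        a, b = stack.pop()
        lo, _ = f_bounds(a, b)
        if lo > 0:                # bound is conclusive: prune this branch
            continue
        if b - a < eps:           # bounds never became tight enough
            return False
        m = 0.5 * (a + b)         # branch: split the domain and refine
        stack += [(a, m), (m, b)]
    return True

print(verify_positive(-1.0, 3.0))  # True: f(x) = (x - 1)**2 + 1 >= 1
```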
arXiv Detail & Related papers (2021-04-14T09:22:42Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) for the efficient representation of probability generating polynomials.
PGCs are not just a theoretical framework that unifies vastly different existing models, but also show huge potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of probabilistic circuits (PCs) and determinantal point processes (DPPs), and obtain competitive performance on a suite of density estimation benchmarks.
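For intuition about the representation, a probability generating polynomial encodes a distribution over binary variables so that simple evaluations recover probabilities. The toy Python example below (a hypothetical distribution, for illustration only) shows the polynomial itself; a PGC is a circuit that represents such a polynomial compactly, which this flat dictionary does not attempt.

```python
# Probability generating polynomial over two binary variables:
# g(z1, z2) = sum_x Pr(x1, x2) * z1**x1 * z2**x2.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def g(z1, z2):
    return sum(pr * z1**x1 * z2**x2 for (x1, x2), pr in p.items())

print(g(1, 1))  # 1.0: total probability mass
print(g(1, 0))  # 0.4 = Pr(X2 = 0): z2 = 0 zeroes out terms with x2 = 1
print(g(0, 1))  # 0.3 = Pr(X1 = 0)
```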
arXiv Detail & Related papers (2021-02-19T07:06:53Z)