BaMANI: Bayesian Multi-Algorithm causal Network Inference
- URL: http://arxiv.org/abs/2508.11741v1
- Date: Fri, 15 Aug 2025 17:38:51 GMT
- Title: BaMANI: Bayesian Multi-Algorithm causal Network Inference
- Authors: Habibolla Latifizadeh, Anika C. Pirkey, Alanna Gould, David J. Klinke II,
- Abstract summary: We develop an ensemble learning approach to marginalize the impact of a single algorithm on causal network inference. We present a comprehensive implementation of the framework in terms of a new software tool called BaMANI. We describe a BaMANI use-case from biology, particularly within human breast cancer studies.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Improved computational power has enabled different disciplines to predict causal relationships among modeled variables using Bayesian network inference. While many alternative algorithms have been proposed to improve the efficiency and reliability of network prediction, the predicted causal networks reflect the generative process but also bear an opaque imprint of the specific computational algorithm used. Following a "wisdom of the crowds" strategy, we developed an ensemble learning approach to marginalize the impact of a single algorithm on Bayesian causal network inference. To introduce the approach, we first present the theoretical foundation of this framework. Next, we present a comprehensive implementation of the framework in terms of a new software tool called BaMANI (Bayesian Multi-Algorithm causal Network Inference). Finally, we describe a BaMANI use-case from biology, particularly within human breast cancer studies.
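The ensemble idea described in the abstract can be sketched in a few lines: run several structure-learning algorithms on the same data, then keep only the edges that a sufficient fraction of them agree on, so no single algorithm's bias dominates. The sketch below is a minimal illustration of that consensus step, not BaMANI's actual implementation; the three algorithm outputs are hypothetical placeholders standing in for real structure learners.

```python
from collections import Counter

def ensemble_consensus(edge_sets, threshold=0.5):
    """Keep directed edges proposed by at least `threshold`
    fraction of the participating algorithms."""
    counts = Counter(edge for edges in edge_sets for edge in edges)
    n = len(edge_sets)
    return {edge for edge, c in counts.items() if c / n >= threshold}

# Hypothetical edge sets from three structure-learning algorithms
algo_outputs = [
    {("A", "B"), ("B", "C")},              # e.g. score-based hill climbing
    {("A", "B"), ("B", "C"), ("A", "C")},  # e.g. constraint-based search
    {("A", "B"), ("A", "C")},              # e.g. tabu search
]

# Require agreement from at least 2 of the 3 algorithms
consensus = ensemble_consensus(algo_outputs, threshold=2 / 3)
```

Here ("A", "B") is proposed by all three algorithms and the other two edges by two of three, so all survive a 2/3 consensus threshold; a stricter threshold of 1.0 would retain only ("A", "B").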
Related papers
- Faster Predictive Coding Networks via Better Initialization [52.419343840654186]
We propose a new technique for predictive coding networks that aims to preserve the iterative progress made on previous training samples. Our experiments demonstrate substantial improvements in convergence speed and final test loss in both supervised and unsupervised settings.
arXiv Detail & Related papers (2026-01-28T08:52:19Z)
- The Cascaded Forward Algorithm for Neural Network Training [61.06444586991505]
We propose a new learning framework for neural networks, the Cascaded Forward (CaFo) algorithm, which does not rely on BP optimization as FF does. Unlike FF, our framework directly outputs label distributions at each cascaded block and does not require the generation of additional negative samples. Each block can be trained independently, so the framework can be easily deployed on parallel acceleration systems.
arXiv Detail & Related papers (2023-03-17T02:01:11Z)
- Bayesian Learning for Neural Networks: an algorithmic survey [95.42181254494287]
This self-contained survey introduces readers to the principles and algorithms of Bayesian learning for neural networks, from an accessible, practical-algorithmic perspective.
arXiv Detail & Related papers (2022-11-21T21:36:58Z)
- Scalable computation of prediction intervals for neural networks via matrix sketching [79.44177623781043]
Existing algorithms for uncertainty estimation require modifying the model architecture and training procedure. This work proposes a new algorithm that can be applied to a given trained neural network to produce approximate prediction intervals.
arXiv Detail & Related papers (2022-05-06T13:18:31Z)
- Reconstructing Bayesian Networks on a Quantum Annealer [0.0]
O'Gorman et al. proposed an algorithm to encode this task but did not provide an experimental evaluation of it. We present (i) a Python implementation of O'Gorman's algorithm and (ii) a divide et impera approach that allows addressing larger BNSL problems. Results show the effectiveness of O'Gorman's formulation for small BNSL instances and the superiority of the divide et impera approach over direct execution of O'Gorman's algorithm.
arXiv Detail & Related papers (2022-04-07T15:53:05Z)
- Quantum Approximate Optimization Algorithm for Bayesian network structure learning [1.332091725929965]
In this work, a specific type of variational quantum algorithm, the quantum approximate optimization algorithm, was used to solve the Bayesian network structure learning problem. Results showed that this approach offers results competitive with state-of-the-art methods and quantitative resilience to quantum noise.
arXiv Detail & Related papers (2022-03-04T16:11:34Z)
- Robustification of Online Graph Exploration Methods [59.50307752165016]
We study a learning-augmented variant of the classical, notoriously hard online graph exploration problem. We propose an algorithm that naturally integrates predictions into the well-known Nearest Neighbor (NN) algorithm.
arXiv Detail & Related papers (2021-12-10T10:02:31Z)
- Deep learning via message passing algorithms based on belief propagation [2.931240348160871]
We present a family of BP-based message-passing algorithms with a reinforcement field that biases toward locally entropic distributions. These algorithms can train multi-layer neural networks with discrete weights and activations, with performance comparable to SGD-inspired solutions.
arXiv Detail & Related papers (2021-10-27T16:52:26Z)
- MODEL: Motif-based Deep Feature Learning for Link Prediction [23.12527010960999]
We propose a novel embedding algorithm that incorporates network motifs to capture higher-order structures in the network. Experiments were conducted on three types of networks: social, biological, and academic. Our algorithm outperforms traditional similarity-based algorithms by 20% and state-of-the-art embedding-based algorithms by 19%.
arXiv Detail & Related papers (2020-08-09T03:39:28Z)
- A Constraint-Based Algorithm for the Structural Learning of Continuous-Time Bayesian Networks [70.88503833248159]
We propose the first constraint-based algorithm for learning the structure of continuous-time Bayesian networks. We discuss the different statistical tests and the underlying hypotheses used by our proposal to establish conditional independence.
arXiv Detail & Related papers (2020-07-07T07:34:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.