Bayesian Neural Networks with Domain Knowledge Priors
- URL: http://arxiv.org/abs/2402.13410v1
- Date: Tue, 20 Feb 2024 22:34:53 GMT
- Title: Bayesian Neural Networks with Domain Knowledge Priors
- Authors: Dylan Sam, Rattana Pukdee, Daniel P. Jeong, Yewon Byun, J. Zico Kolter
- Abstract summary: We propose a framework for integrating general forms of domain knowledge into a BNN prior.
We show that BNNs using our proposed domain knowledge priors outperform those with standard priors.
- Score: 52.80929437592308
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian neural networks (BNNs) have recently gained popularity due to their
ability to quantify model uncertainty. However, specifying a prior for BNNs
that captures relevant domain knowledge is often extremely challenging. In this
work, we propose a framework for integrating general forms of domain knowledge
(i.e., any knowledge that can be represented by a loss function) into a BNN
prior through variational inference, while enabling computationally efficient
posterior inference and sampling. Specifically, our approach results in a prior
over neural network weights that assigns high probability mass to models that
better align with our domain knowledge, leading to posterior samples that also
exhibit this behavior. We show that BNNs using our proposed domain knowledge
priors outperform those with standard priors (e.g., isotropic Gaussian,
Gaussian process), successfully incorporating diverse types of prior
information such as fairness, physics rules, and healthcare knowledge and
achieving better predictive performance. We also present techniques for
transferring the learned priors across different model architectures,
demonstrating their broad utility across various settings.
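
The abstract gives the general recipe (any domain knowledge expressible as a loss function can shape the prior) but not the exact objective, so the following is a minimal PyTorch sketch of that idea under stated assumptions: a factorized Gaussian over the weights of a small network is fit, via the reparameterization trick, to place more mass on weight vectors with low knowledge loss, with a KL term toward an isotropic Gaussian keeping the prior broad. The monotonicity knowledge loss, the KL regularizer, and all hyperparameters below are illustrative assumptions, not the authors' method.

```python
# Hedged sketch (PyTorch >= 2.0): fit a factorized Gaussian "knowledge prior" over
# the weights of a small MLP so that sampled networks tend to have low
# domain-knowledge loss. The knowledge loss used here (outputs should be
# non-decreasing in the first input feature) and the KL regularizer toward an
# isotropic Gaussian are illustrative assumptions, not the paper's exact objective.
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_mlp():
    return nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))


def knowledge_loss(model, x):
    """Penalize negative gradients of the output w.r.t. feature 0 (monotonicity)."""
    x = x.clone().requires_grad_(True)
    y = model(x).sum()
    grad = torch.autograd.grad(y, x, create_graph=True)[0][:, 0]
    return F.relu(-grad).mean()


# Variational parameters of the prior: one mean and one log-std per network weight.
template = make_mlp()
shapes = [p.shape for p in template.parameters()]
numels = [p.numel() for p in template.parameters()]
mu = nn.Parameter(torch.zeros(sum(numels)))
log_sigma = nn.Parameter(torch.full((sum(numels),), -2.0))
opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)

x_unlabeled = torch.randn(256, 2)  # unlabeled inputs on which knowledge is checked

for step in range(500):
    # Reparameterized sample of a full weight vector from the current prior.
    w = mu + log_sigma.exp() * torch.randn_like(mu)
    # Build a network that uses the sampled weights (functional forward pass).
    chunks = torch.split(w, numels)
    params = {name: c.view(s) for (name, _), c, s in
              zip(template.named_parameters(), chunks, shapes)}

    def f(xb):
        return torch.func.functional_call(template, params, (xb,))

    # Expected knowledge loss + KL(N(mu, sigma^2) || N(0, I)) to keep the prior broad.
    kl = 0.5 * (mu.pow(2) + (2 * log_sigma).exp() - 2 * log_sigma - 1).sum()
    loss = knowledge_loss(f, x_unlabeled) + 1e-4 * kl
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the paper's setting, the learned prior (here mu and log_sigma) would then serve as the prior for posterior inference on labeled data; the snippet above only fits the prior to the knowledge term.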
Related papers
- Bridged-GNN: Knowledge Bridge Learning for Effective Knowledge Transfer [65.42096702428347]
Graph Neural Networks (GNNs) aggregate information from neighboring nodes.
Knowledge Bridge Learning (KBL) learns a knowledge-enhanced posterior distribution for target domains.
Bridged-GNN includes an Adaptive Knowledge Retrieval module to build Bridged-Graph and a Graph Knowledge Transfer module.
arXiv Detail & Related papers (2023-08-18T12:14:51Z)
- Sparsifying Bayesian neural networks with latent binary variables and normalizing flows [10.865434331546126]
We consider two extensions of the latent binary Bayesian neural network (LBBNN) method.
Firstly, by using the local reparametrization trick (LRT) to sample the hidden units directly, we get a more computationally efficient algorithm.
More importantly, by using normalizing flows on the variational posterior distribution of the LBBNN parameters, the network learns a more flexible variational posterior distribution than the mean field Gaussian.
arXiv Detail & Related papers (2023-05-05T09:40:28Z)
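
As a brief aside on the local reparametrization trick mentioned in the entry above, here is a minimal PyTorch sketch of a mean-field Gaussian linear layer that samples pre-activations directly; it is a generic illustration only, not the LBBNN implementation (which additionally uses latent binary inclusion variables and, in the extension above, normalizing flows).

```python
# Hedged sketch of the local reparametrization trick (LRT): sample the
# pre-activations directly instead of sampling a full weight matrix per forward
# pass, which lowers gradient variance and cost.
import torch
import torch.nn as nn


class LRTLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        self.weight_logvar = nn.Parameter(torch.full((out_features, in_features), -6.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # Pre-activation mean and variance under elementwise q(W) = N(mu, sigma^2).
        act_mean = x @ self.weight_mu.t() + self.bias
        act_var = (x ** 2) @ self.weight_logvar.exp().t()
        # One Gaussian sample per pre-activation.
        return act_mean + act_var.sqrt() * torch.randn_like(act_mean)


layer = LRTLinear(8, 4)
out = layer(torch.randn(32, 8))  # shape (32, 4)
```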
- Incorporating Unlabelled Data into Bayesian Neural Networks [48.25555899636015]
We introduce Self-Supervised Bayesian Neural Networks, which use unlabelled data to learn models with suitable prior predictive distributions.
We show that the prior predictive distributions of self-supervised BNNs capture problem semantics better than conventional BNN priors.
Our approach offers improved predictive performance over conventional BNNs, especially in low-budget regimes.
arXiv Detail & Related papers (2023-04-04T12:51:35Z)
- Posterior Regularized Bayesian Neural Network Incorporating Soft and Hard Knowledge Constraints [12.050265348673078]
We propose a novel Posterior-Regularized Bayesian Neural Network (PR-BNN) model by incorporating different types of knowledge constraints.
Experiments in simulation and two case studies, on aviation landing prediction and solar energy output prediction, demonstrate the effect of the knowledge constraints and the performance improvement of the proposed model.
arXiv Detail & Related papers (2022-10-16T18:58:50Z)
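
To make the soft/hard constraint distinction in the entry above concrete, here is a minimal sketch of training with knowledge constraints; for brevity a deterministic network stands in for samples from a BNN's variational posterior, and the specific constraints, network, and data are invented for illustration rather than taken from PR-BNN.

```python
# Hedged sketch: a soft constraint enters the objective as a weighted penalty on
# predictions, while a hard constraint is enforced by construction on the output.
# In a BNN, the same penalty would be averaged over posterior weight samples.
import torch
import torch.nn as nn
import torch.nn.functional as F

net = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.randn(128, 3)
y = torch.rand(128, 1)  # toy targets known to lie in [0, 1]


def soft_constraint(pred, xb):
    # Illustrative soft knowledge: predictions should not exceed 1 + sum of inputs.
    return F.relu(pred - (1.0 + xb.sum(dim=1, keepdim=True))).mean()


for step in range(200):
    raw = net(x)
    pred = torch.sigmoid(raw)  # hard constraint: outputs must stay in [0, 1]
    data_loss = F.mse_loss(pred, y)
    loss = data_loss + 10.0 * soft_constraint(pred, x)  # weight of the soft term
    opt.zero_grad()
    loss.backward()
    opt.step()
```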
- Look beyond labels: Incorporating functional summary information in Bayesian neural networks [11.874130244353253]
We present a simple approach to incorporate summary information about the predicted probability.
The available summary information is incorporated as augmented data and modeled with a Dirichlet process.
We show how the method can inform the model about task difficulty or class imbalance.
arXiv Detail & Related papers (2022-07-04T07:06:45Z)
- Knowledge Enhanced Neural Networks for relational domains [83.9217787335878]
We focus on a specific method, KENN, a Neural-Symbolic architecture that injects prior logical knowledge into a neural network.
In this paper, we propose an extension of KENN for relational data.
arXiv Detail & Related papers (2022-05-31T13:00:34Z)
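
As a rough illustration of injecting logical knowledge into a network's outputs, in the spirit of KENN's knowledge-enhancement layers but deliberately simplified (negated literals and KENN's exact boost functions are omitted, and the clause and model below are invented for illustration):

```python
# Hedged sketch: for a disjunctive clause over a subset of output literals, add a
# learnable, softmax-weighted boost to their pre-activations so the most promising
# literal is pushed up, raising the clause's (fuzzy) truth value.
import torch
import torch.nn as nn


class ClauseEnhancer(nn.Module):
    def __init__(self, literal_idx):
        super().__init__()
        self.literal_idx = literal_idx  # indices of the literals in the clause
        self.clause_weight = nn.Parameter(torch.tensor(1.0))  # learnable confidence

    def forward(self, z):
        # z: pre-activations (logits) for all literals, shape (batch, num_literals)
        sub = z[:, self.literal_idx]
        boost = torch.zeros_like(z)
        boost[:, self.literal_idx] = self.clause_weight.clamp(min=0) * torch.softmax(sub, dim=1)
        return z + boost


base = nn.Linear(10, 4)                        # predicts logits for 4 literals
enhancer = ClauseEnhancer(literal_idx=[0, 2])  # clause: literal0 OR literal2
x = torch.randn(16, 10)
probs = torch.sigmoid(enhancer(base(x)))       # enhanced literal probabilities
```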
- Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer [77.78479877473899]
We design a spatial-temporal-fusion BNN for efficiently scaling BNNs to large models.
Compared to vanilla BNNs, our approach can greatly reduce the training time and the number of parameters, which helps scale BNNs efficiently.
arXiv Detail & Related papers (2021-12-12T17:13:14Z)
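
One generic way to cut a BNN's parameter count and training cost, loosely related to the variational Bayesian feature layer named in the entry above: keep a deterministic backbone and place a mean-field variational distribution on a single layer. This is a standard last-layer variational design sketched under that assumption, not the paper's architecture.

```python
# Hedged sketch: deterministic feature extractor + variational Bayesian head, so
# only the final layer carries variational parameters.
import torch
import torch.nn as nn


class LastLayerBNN(nn.Module):
    def __init__(self, in_dim, feat_dim, out_dim):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # Variational parameters only for the head: far fewer than a full BNN.
        self.w_mu = nn.Parameter(torch.zeros(out_dim, feat_dim))
        self.w_logvar = nn.Parameter(torch.full((out_dim, feat_dim), -6.0))
        self.b = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x):
        h = self.backbone(x)
        w = self.w_mu + (0.5 * self.w_logvar).exp() * torch.randn_like(self.w_mu)
        return h @ w.t() + self.b


model = LastLayerBNN(in_dim=20, feat_dim=64, out_dim=3)
logits = model(torch.randn(8, 20))  # a fresh weight sample per forward pass
```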
- BNNpriors: A library for Bayesian neural network inference with different prior distributions [32.944046414823916]
BNNpriors enables state-of-the-art Markov Chain Monte Carlo inference on Bayesian neural networks.
It follows a modular approach that eases the design and implementation of new custom priors.
It has facilitated foundational discoveries on the nature of the cold posterior effect in Bayesian neural networks.
arXiv Detail & Related papers (2021-05-14T17:11:04Z)
- Neural Networks Enhancement with Logical Knowledge [83.9217787335878]
We propose an extension of KENN for relational data.
The results show that KENN is capable of improving the performance of the underlying neural network even in the presence of relational data.
arXiv Detail & Related papers (2020-09-13T21:12:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.