An Empirical Analysis of the Advantages of Finite- v.s. Infinite-Width
Bayesian Neural Networks
- URL: http://arxiv.org/abs/2211.09184v1
- Date: Wed, 16 Nov 2022 20:07:55 GMT
- Title: An Empirical Analysis of the Advantages of Finite- v.s. Infinite-Width
Bayesian Neural Networks
- Authors: Jiayu Yao, Yaniv Yacoby, Beau Coker, Weiwei Pan, Finale Doshi-Velez
- Abstract summary: We empirically compare finite- and infinite-width BNNs, and provide quantitative and qualitative explanations for their performance difference.
We find that when the model is mis-specified, increasing width can hurt BNN performance.
In these cases, we provide evidence that finite-width BNNs generalize better partly due to properties of their frequency spectrum that allow them to adapt under model mismatch.
- Score: 25.135652514472238
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Comparing Bayesian neural networks (BNNs) with different widths is
challenging because, as the width increases, multiple model properties change
simultaneously, and inference in the finite-width case is intractable. In this
work, we empirically compare finite- and infinite-width BNNs, and provide
quantitative and qualitative explanations for their performance difference. We
find that when the model is mis-specified, increasing width can hurt BNN
performance. In these cases, we provide evidence that finite-width BNNs
generalize better partly due to properties of their frequency spectrum
that allow them to adapt under model mismatch.
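
As a purely illustrative sketch (not code from the paper), the snippet below shows one concrete way to contrast a finite-width BNN prior with its infinite-width NNGP limit: for a single-hidden-layer ReLU network, the prior predictive variance at an input matches the analytic NNGP kernel at any width, while the excess kurtosis of the finite-width prior is positive and shrinks toward the Gaussian (infinite-width) limit as the width grows. The prior scales sigma_w and sigma_b, the test input, and the widths are assumed values chosen only for illustration.

```python
# Illustrative sketch (assumed setup, not taken from the paper): prior predictive
# of a single-hidden-layer ReLU BNN vs. its infinite-width NNGP limit.
import numpy as np

sigma_w, sigma_b = 1.0, 0.1          # assumed prior scales
x = np.array([1.0, -0.5])            # assumed test input, d = 2
d = x.shape[0]

def nngp_var(x):
    """Analytic infinite-width (NNGP) prior variance at x for one ReLU hidden layer."""
    k1 = sigma_w**2 / d * (x @ x) + sigma_b**2      # first-layer kernel k1(x, x)
    # E[relu(u)^2] = k1 / 2 for u ~ N(0, k1) (arc-cosine kernel of order 1 at theta = 0)
    return sigma_w**2 * k1 / 2.0 + sigma_b**2

def finite_width_prior_samples(x, width, n_samples=50_000, seed=0):
    """Draw prior predictive samples f(x) from a finite-width, one-hidden-layer BNN."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, sigma_w / np.sqrt(d), size=(n_samples, width, d))
    b1 = rng.normal(0.0, sigma_b, size=(n_samples, width))
    W2 = rng.normal(0.0, sigma_w / np.sqrt(width), size=(n_samples, width))
    b2 = rng.normal(0.0, sigma_b, size=n_samples)
    hidden = np.maximum(W1 @ x + b1, 0.0)           # ReLU hidden activations
    return np.einsum('sw,sw->s', W2, hidden) + b2   # network outputs f(x)

print(f"NNGP prior variance: {nngp_var(x):.4f}")
for width in (2, 8, 64):
    f = finite_width_prior_samples(x, width)
    # Excess kurtosis measures departure from the Gaussian NNGP limit; it is
    # positive at small width and decays roughly as 1/width.
    excess_kurtosis = np.mean((f - f.mean())**4) / np.var(f)**2 - 3.0
    print(f"width {width:4d}: variance {np.var(f):.4f}, excess kurtosis {excess_kurtosis:+.3f}")
```

Under these assumptions, all widths reproduce the NNGP variance (up to Monte Carlo noise), but only the wider networks have near-zero excess kurtosis; the small-width prior is visibly non-Gaussian, which is one of the finite-width properties the paper's comparison is concerned with.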