On uncertainty-penalized Bayesian information criterion
- URL: http://arxiv.org/abs/2404.16881v1
- Date: Tue, 23 Apr 2024 13:59:11 GMT
- Title: On uncertainty-penalized Bayesian information criterion
- Authors: Pongpisit Thanasutives, Ken-ichi Fukui
- Abstract summary: We show that using the uncertainty-penalized Bayesian information criterion (UBIC) is equivalent to applying the conventional BIC to a set of overparameterized models.
The result indicates that the UBIC and the BIC share the same asymptotic property.
- Score: 1.1049608786515839
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The uncertainty-penalized Bayesian information criterion (UBIC) has been proposed as a new model-selection criterion for data-driven partial differential equation (PDE) discovery. In this paper, we show that using the UBIC is equivalent to applying the conventional BIC to a set of overparameterized models derived from the potential regression models of different complexity measures. The result indicates that the UBIC and the BIC share the same asymptotic property.
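The conventional BIC that the abstract's equivalence result refers to can be illustrated with a minimal sketch. This is not code from the paper: it assumes a Gaussian noise model (so the BIC reduces to a residual-sum-of-squares form up to an additive constant) and uses synthetic polynomial regression candidates as stand-in "models of different complexity measures".

```python
import numpy as np

def bic(y, y_hat, k):
    """Gaussian BIC: n*log(RSS/n) + k*log(n), up to an additive constant."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)  # true model is degree 1

# Score candidate polynomial models of increasing complexity.
scores = {}
for degree in range(0, 6):
    X = np.vander(x, degree + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    scores[degree] = bic(y, X @ coef, k=degree + 1)

# BIC's complexity penalty k*log(n) should favor the parsimonious degree-1 fit.
best = min(scores, key=scores.get)
print("selected degree:", best)
```

The UBIC adds an uncertainty-derived penalty on top of this score; the paper's claim is that doing so is equivalent to evaluating the plain BIC above on suitably overparameterized counterparts of the candidates.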
Related papers
- Adaptation of uncertainty-penalized Bayesian information criterion for parametric partial differential equation discovery [1.1049608786515839]
We introduce an extension of the uncertainty-penalized Bayesian information criterion (UBIC) to solve parametric PDE discovery problems efficiently.
UBIC uses quantified PDE uncertainty over different temporal or spatial points to prevent overfitting in model selection.
We show that our extended UBIC can identify the true number of terms and their varying coefficients accurately, even in the presence of noise.
arXiv Detail & Related papers (2024-08-15T12:10:50Z) - Generalized Criterion for Identifiability of Additive Noise Models Using Majorization [7.448620208767376]
We introduce a novel identifiability criterion for directed acyclic graph (DAG) models.
We demonstrate that this criterion extends and generalizes existing identifiability criteria.
We present a new algorithm for learning a topological ordering of variables.
arXiv Detail & Related papers (2024-04-08T02:18:57Z) - Bayesian Model Selection via Mean-Field Variational Approximation [10.433170683584994]
We study the non-asymptotic properties of mean-field (MF) inference under the Bayesian framework.
We show a Bernstein–von Mises (BvM) theorem for the variational distribution from MF under possible model misspecification.
arXiv Detail & Related papers (2023-12-17T04:48:25Z) - Selective Nonparametric Regression via Testing [54.20569354303575]
We develop an abstention procedure via testing the hypothesis on the value of the conditional variance at a given point.
Unlike existing methods, the proposed one accounts not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor.
arXiv Detail & Related papers (2023-09-28T13:04:11Z) - Adaptive Uncertainty-Guided Model Selection for Data-Driven PDE Discovery [3.065513003860786]
We propose a new parameter-adaptive uncertainty-penalized Bayesian information criterion (UBIC) to prioritize the parsimonious partial differential equation (PDE).
We numerically affirm the successful application of the UBIC in identifying the true governing PDE.
We reveal an interesting effect of denoising the observed data on improving the trade-off between the BIC score and model complexity.
arXiv Detail & Related papers (2023-08-20T14:36:45Z) - Gibbs-Based Information Criteria and the Over-Parameterized Regime [20.22034560278484]
Double-descent refers to the unexpected drop in test loss of a learning algorithm beyond an interpolating threshold.
We update these analyses using the information risk minimization framework and provide Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) for models learned by the Gibbs algorithm.
arXiv Detail & Related papers (2023-06-08T22:54:48Z) - Non-Linear Spectral Dimensionality Reduction Under Uncertainty [107.01839211235583]
We propose a new dimensionality reduction framework, called NGEU, which leverages uncertainty information and directly extends several traditional approaches.
We show that the proposed NGEU formulation exhibits a global closed-form solution, and we analyze, based on the Rademacher complexity, how the underlying uncertainties theoretically affect the generalization ability of the framework.
arXiv Detail & Related papers (2022-02-09T19:01:33Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Identification of Probability weighted ARX models with arbitrary domains [75.91002178647165]
PieceWise Affine models guarantee universal approximation, local linearity, and equivalence to other classes of hybrid systems.
In this work, we focus on the identification of PieceWise Auto Regressive with eXogenous input models with arbitrary regions (NPWARX).
The architecture is conceived following the Mixture of Expert concept, developed within the machine learning field.
arXiv Detail & Related papers (2020-09-29T12:50:33Z) - Accounting for Unobserved Confounding in Domain Generalization [107.0464488046289]
This paper investigates the problem of learning robust, generalizable prediction models from a combination of datasets.
Part of the challenge of learning robust models lies in the influence of unobserved confounders.
We demonstrate the empirical performance of our approach on healthcare data from different modalities.
arXiv Detail & Related papers (2020-07-21T08:18:06Z) - Repulsive Mixture Models of Exponential Family PCA for Clustering [127.90219303669006]
The mixture extension of exponential family principal component analysis (EPCA) was designed to encode much more structural information about the data distribution than the traditional EPCA.
The traditional mixture of local EPCAs has the problem of model redundancy, i.e., overlaps among mixing components, which may cause ambiguity for data clustering.
In this paper, a repulsiveness-encouraging prior is introduced among mixing components and a diversified EPCA mixture (DEPCAM) model is developed in the Bayesian framework.
arXiv Detail & Related papers (2020-04-07T04:07:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.