Statistical aspects of nuclear mass models
- URL: http://arxiv.org/abs/2002.04151v3
- Date: Thu, 7 May 2020 00:39:02 GMT
- Title: Statistical aspects of nuclear mass models
- Authors: Vojtech Kejzlar, Léo Neufcourt, Witold Nazarewicz, Paul-Gerhard Reinhard
- Abstract summary: We study the information content of nuclear masses from the perspective of global models of nuclear binding energies.
We employ a number of statistical methods and diagnostic tools, including Bayesian calibration, Bayesian model averaging, chi-square correlation analysis, principal component analysis, and empirical coverage probability.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the information content of nuclear masses from the perspective of
global models of nuclear binding energies. To this end, we employ a number of
statistical methods and diagnostic tools, including Bayesian calibration,
Bayesian model averaging, chi-square correlation analysis, principal component
analysis, and empirical coverage probability. Using a Bayesian framework, we
investigate the structure of the 4-parameter Liquid Drop Model by considering
discrepant mass domains for calibration. We then use the chi-square correlation
framework to analyze the 14-parameter Skyrme energy density functional
calibrated using homogeneous and heterogeneous datasets. We show that a quite
dramatic parameter reduction can be achieved in both cases. The advantage of
Bayesian model averaging for improving uncertainty quantification is
demonstrated. The statistical approaches used are pedagogically described; in
this context this work can serve as a guide for future applications.
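The 4-parameter Liquid Drop Model discussed above can be calibrated to masses by ordinary least squares, since the binding energy is linear in the parameters. A minimal sketch with synthetic data (the parameter values, the nuclide grid, and the noise level are illustrative assumptions, not the paper's calibration):

```python
import numpy as np

# 4-parameter Liquid Drop Model: volume, surface, symmetry, and Coulomb
# terms. Sign conventions vary between references; this is one common choice.
def ldm_design(Z, N):
    A = Z + N
    return np.column_stack([
        A,                           # volume
        -A ** (2.0 / 3.0),           # surface
        -(N - Z) ** 2 / A,           # symmetry
        -Z ** 2 / A ** (1.0 / 3.0),  # Coulomb
    ])

rng = np.random.default_rng(0)
Z, N = np.meshgrid(np.arange(20, 90), np.arange(20, 130))
Z, N = Z.ravel().astype(float), N.ravel().astype(float)

theta_true = np.array([15.5, 17.0, 23.0, 0.71])  # MeV, illustrative values
X = ldm_design(Z, N)
B = X @ theta_true + rng.normal(0.0, 0.5, size=len(Z))  # noisy synthetic "data"

# Least-squares calibration recovers the parameters from the noisy masses.
theta_fit, *_ = np.linalg.lstsq(X, B, rcond=None)
rms = np.sqrt(np.mean((X @ theta_fit - B) ** 2))
print(theta_fit, rms)
```

A Bayesian calibration as in the paper would replace the point estimate with a posterior over the same design matrix; the least-squares fit corresponds to its mode under flat priors and Gaussian noise.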
Related papers
- Model-free Methods for Event History Analysis and Efficient Adjustment (PhD Thesis) [55.2480439325792]
This thesis is a series of independent contributions to statistics unified by a model-free perspective.
The first chapter elaborates on how a model-free perspective can be used to formulate flexible methods that leverage prediction techniques from machine learning.
The second chapter studies the concept of local independence, which describes whether the evolution of one process is directly influenced by another.
arXiv Detail & Related papers (2025-02-11T19:24:09Z)
- Adaptive Nonparametric Perturbations of Parametric Bayesian Models [33.85958872117418]
We study nonparametrically perturbed parametric (NPP) Bayesian models, in which a parametric Bayesian model is relaxed via a distortion of its likelihood.
We show that NPP models can offer the robustness of nonparametric models while retaining the data efficiency of parametric models.
We demonstrate our method by estimating causal effects of gene expression from single cell RNA sequencing data.
arXiv Detail & Related papers (2024-12-14T05:06:38Z)
- Local Bayesian Dirichlet mixing of imperfect models [0.0]
We study the ability of Bayesian model averaging and mixing techniques to mine nuclear masses.
We show that the global and local mixtures of models reach excellent performance on both prediction accuracy and uncertainty quantification.
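Bayesian model averaging, as used in both this entry and the main paper, combines model predictions with posterior model weights; the mixture variance adds the between-model spread to the within-model uncertainty. A minimal sketch (the weights, means, and variances below are made up for illustration):

```python
import numpy as np

# Hypothetical posterior model weights (summing to 1) and each model's
# predictive mean and variance for one quantity, e.g. a separation energy.
w = np.array([0.5, 0.3, 0.2])
mu = np.array([8.1, 8.4, 7.9])       # per-model predictive means (MeV)
var = np.array([0.04, 0.09, 0.02])   # per-model predictive variances

# BMA mixture mean, and variance = within-model + between-model spread.
mu_bma = np.sum(w * mu)
var_bma = np.sum(w * (var + mu ** 2)) - mu_bma ** 2
print(mu_bma, var_bma)
```

The between-model term is what typically widens BMA uncertainty bands relative to any single model, which is the mechanism behind the improved uncertainty quantification reported above.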
arXiv Detail & Related papers (2023-11-02T21:02:40Z)
- On the Properties and Estimation of Pointwise Mutual Information Profiles [49.877314063833296]
The pointwise mutual information profile, or simply profile, is the distribution of pointwise mutual information for a given pair of random variables.
We introduce a novel family of distributions, Bend and Mix Models, for which the profile can be accurately estimated using Monte Carlo methods.
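For a bivariate Gaussian the pointwise mutual information is available in closed form, so its profile can be sampled directly by Monte Carlo, and its mean recovers the mutual information. A minimal sketch (the correlation value and sample size are arbitrary choices, and this is not the Bend and Mix family from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.8, 200_000

# Sample from a standard bivariate normal with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)

# Pointwise mutual information: log p(x, y) - log p(x) - log p(y).
pmi = (-0.5 * np.log(1.0 - rho ** 2)
       - (x ** 2 - 2.0 * rho * x * y + y ** 2) / (2.0 * (1.0 - rho ** 2))
       + 0.5 * x ** 2 + 0.5 * y ** 2)

# The mutual information is the mean of the PMI profile.
mi_true = -0.5 * np.log(1.0 - rho ** 2)
print(pmi.mean(), mi_true)
```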
arXiv Detail & Related papers (2023-10-16T10:02:24Z)
- Confidence and Dispersity Speak: Characterising Prediction Matrix for Unsupervised Accuracy Estimation [51.809741427975105]
This work aims to assess how well a model performs under distribution shifts without using labels.
We use the nuclear norm, which has been shown to be effective in characterizing both prediction confidence and dispersity.
We show that the nuclear norm is more accurate and robust than existing methods for estimating accuracy under distribution shift.
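The nuclear norm of a softmax prediction matrix (the sum of its singular values) is larger when predictions are both confident and dispersed across classes; a toy illustration of that contrast (matrix shapes are arbitrary choices, and the paper's normalization is omitted):

```python
import numpy as np

n, k = 100, 10

# Confident, balanced predictions: each row is one-hot, classes evenly used.
confident = np.eye(k)[np.arange(n) % k]

# Maximally uncertain predictions: every row is uniform over the classes.
uniform = np.full((n, k), 1.0 / k)

# Nuclear norm = sum of singular values of the prediction matrix.
nuc_conf = np.linalg.norm(confident, ord='nuc')
nuc_unif = np.linalg.norm(uniform, ord='nuc')
print(nuc_conf, nuc_unif)
```

The uniform matrix has rank one, so its nuclear norm collapses to a single small singular value, while confident balanced predictions spread mass over all k singular values.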
arXiv Detail & Related papers (2023-02-02T13:30:48Z)
- A Bayesian Framework on Asymmetric Mixture of Factor Analyser [0.0]
This paper introduces an MFA model with a rich and flexible class of skew normal (unrestricted) generalized hyperbolic (called SUNGH) distributions.
The SUNGH family provides considerable flexibility to model skewness in different directions as well as allowing for heavy tailed data.
Considering factor analysis models, the SUNGH family also allows for skewness and heavy tails for both the error component and factor scores.
arXiv Detail & Related papers (2022-11-01T20:19:52Z)
- Bayesian model calibration for block copolymer self-assembly: Likelihood-free inference and expected information gain computation via measure transport [6.496038875667294]
We consider the calibration of models describing the phenomenon of block copolymer (BCP) self-assembly.
We tackle this challenging Bayesian inference problem using a likelihood-free approach based on measure transport.
We present a numerical case study based on the Ohta--Kawasaki model for diblock copolymer thin film self-assembly.
arXiv Detail & Related papers (2022-06-22T19:38:52Z)
- Evaluating Sensitivity to the Stick-Breaking Prior in Bayesian Nonparametrics [85.31247588089686]
We show that variational Bayesian methods can yield sensitivities with respect to parametric and nonparametric aspects of Bayesian models.
We provide both theoretical and empirical support for our variational approach to Bayesian sensitivity analysis.
arXiv Detail & Related papers (2021-07-08T03:40:18Z)
- How Faithful is your Synthetic Data? Sample-level Metrics for Evaluating and Auditing Generative Models [95.8037674226622]
We introduce a 3-dimensional evaluation metric that characterizes the fidelity, diversity and generalization performance of any generative model in a domain-agnostic fashion.
Our metric unifies statistical divergence measures with precision-recall analysis, enabling sample- and distribution-level diagnoses of model fidelity and diversity.
arXiv Detail & Related papers (2021-02-17T18:25:30Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Referenced Thermodynamic Integration for Bayesian Model Selection: Application to COVID-19 Model Selection [1.9599274203282302]
We show how to compute the ratio of two models' normalising constants, known as the Bayes factor.
In this paper we apply a variation of the TI method, referred to as referenced TI, which computes a single model's normalising constant in an efficient way.
The approach is shown to be useful in practice when applied to a real problem - to perform model selection for a semi-mechanistic hierarchical Bayesian model of COVID-19 transmission in South Korea.
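Thermodynamic integration estimates a single model's log normalising constant as an integral of the expected log-likelihood over power posteriors, log Z = ∫₀¹ E_{p_β}[log L] dβ. A minimal sketch for a conjugate Gaussian model, where both the power posterior and the exact evidence are analytic so the identity can be checked directly (all numbers are illustrative, and this is not the referenced-TI variant of the paper):

```python
import numpy as np

# Model: theta ~ N(mu0, tau0^2), one observation y | theta ~ N(theta, sigma^2).
mu0, tau0, sigma, y = 0.0, 2.0, 1.0, 1.5

# Power posterior p_beta(theta) ∝ N(y; theta, sigma^2)^beta * N(theta; mu0, tau0^2)
# is Gaussian, with closed-form mean m and variance v for each beta.
betas = np.linspace(0.0, 1.0, 201)
v = 1.0 / (1.0 / tau0 ** 2 + betas / sigma ** 2)
m = (mu0 / tau0 ** 2 + betas * y / sigma ** 2) * v

# E_{p_beta}[log likelihood] at each beta.
e_loglik = (-0.5 * np.log(2 * np.pi * sigma ** 2)
            - ((y - m) ** 2 + v) / (2 * sigma ** 2))

# Trapezoidal integration over beta gives the TI estimate of log Z.
log_z_ti = np.sum(0.5 * (e_loglik[1:] + e_loglik[:-1]) * np.diff(betas))

# Exact evidence for comparison: y ~ N(mu0, sigma^2 + tau0^2).
s2 = sigma ** 2 + tau0 ** 2
log_z_exact = -0.5 * np.log(2 * np.pi * s2) - (y - mu0) ** 2 / (2 * s2)
print(log_z_ti, log_z_exact)
```

In realistic models the expectations at each beta come from MCMC samples of the power posterior rather than closed forms; the Bayes factor between two models is then the difference of their estimated log Z values.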
arXiv Detail & Related papers (2020-09-08T16:32:06Z)
- Bayesian Sparse Factor Analysis with Kernelized Observations [67.60224656603823]
Multi-view problems can be addressed with latent variable models.
High-dimensionality and non-linearity are traditionally handled by kernel methods.
We propose merging both approaches into a single model.
arXiv Detail & Related papers (2020-06-01T14:25:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.