Bayesian Discrete Conditional Transformation Models
- URL: http://arxiv.org/abs/2205.08594v1
- Date: Tue, 17 May 2022 19:26:43 GMT
- Title: Bayesian Discrete Conditional Transformation Models
- Authors: Manuel Carlan and Thomas Kneib
- Abstract summary: We propose a novel Bayesian model framework for discrete ordinal and count data based on conditional transformations of the responses.
For count responses, the resulting transformation model is a Bayesian fully parametric yet distribution-free approach.
Inference is conducted by a generic modular Markov chain Monte Carlo algorithm.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel Bayesian model framework for discrete ordinal and count
data based on conditional transformations of the responses. The conditional
transformation function is estimated from the data in conjunction with an a
priori chosen reference distribution. For count responses, the resulting
transformation model is novel in the sense that it is a Bayesian fully
parametric yet distribution-free approach that can additionally account for
excess zeros with additive transformation function specifications. For ordinal
categoric responses, our cumulative link transformation model allows the
inclusion of linear and nonlinear covariate effects that can additionally be
made category-specific, resulting in (non-)proportional odds or hazards models
and more, depending on the choice of the reference distribution. Inference is
conducted by a generic modular Markov chain Monte Carlo algorithm where
multivariate Gaussian priors enforce specific properties such as smoothness on
the functional effects. To illustrate the versatility of Bayesian discrete
conditional transformation models, applications to counts of patent citations
in the presence of excess zeros and on treating forest health categories in a
discrete partial proportional odds model are presented.
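The core construction in the abstract can be sketched compactly: for a count response, the model specifies P(Y ≤ y | x) = F_Z(h(y, x)), where F_Z is the a-priori chosen reference CDF and h is a monotone transformation estimated from the data; the PMF follows by differencing consecutive CDF values. Below is a minimal, hedged illustration (not the authors' implementation): the Bernstein polynomial basis, the logistic reference distribution, the maximum count, and the coefficient values are all illustrative assumptions.

```python
import math
import numpy as np

def bernstein_basis(y, max_count, degree):
    """Bernstein polynomial basis evaluated at counts rescaled to [0, 1]."""
    t = np.atleast_1d(y) / max_count
    k = np.arange(degree + 1)
    binom = np.array([math.comb(degree, j) for j in range(degree + 1)])
    return binom * t[:, None] ** k * (1.0 - t[:, None]) ** (degree - k)

def count_cdf(y, theta, max_count, degree=5):
    """P(Y <= y) = F_Z(h(y)) with a standard-logistic reference F_Z."""
    h = bernstein_basis(np.minimum(y, max_count), max_count, degree) @ theta
    return 1.0 / (1.0 + np.exp(-h))  # logistic reference CDF

def count_pmf(y, theta, max_count, degree=5):
    """PMF by differencing: P(Y = y) = F_Z(h(y)) - F_Z(h(y - 1))."""
    y = np.atleast_1d(y)
    upper = count_cdf(y, theta, max_count, degree)
    lower = np.where(y > 0, count_cdf(y - 1, theta, max_count, degree), 0.0)
    return upper - lower

# Increasing coefficients make the Bernstein polynomial, and hence the
# implied CDF, monotone -- the key constraint on the transformation h.
theta = np.array([-3.0, -1.5, 0.0, 1.5, 3.0, 4.5])
probs = count_pmf(np.arange(21), theta, max_count=20)
```

In the full Bayesian model the coefficients would carry a multivariate Gaussian smoothness prior and be sampled by MCMC; here they are fixed only to show how the reference distribution and the monotone transformation combine into a valid discrete distribution.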
Related papers
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs)
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- A probabilistic, data-driven closure model for RANS simulations with aleatoric, model uncertainty [1.8416014644193066]
We propose a data-driven closure model for Reynolds-averaged Navier-Stokes (RANS) simulations that incorporates aleatoric, model uncertainty.
A fully Bayesian formulation is proposed, combined with a sparsity-inducing prior in order to identify regions in the problem domain where the parametric closure is insufficient.
arXiv Detail & Related papers (2023-07-05T16:53:31Z)
- Approximate Message Passing for the Matrix Tensor Product Model [8.206394018475708]
We propose and analyze an approximate message passing (AMP) algorithm for the matrix tensor product model.
Building upon a convergence theorem for non-separable functions, we prove a state evolution for non-separable functions.
We leverage this state evolution result to provide necessary and sufficient conditions for recovery of the signal of interest.
arXiv Detail & Related papers (2023-06-27T16:03:56Z)
- Monte Carlo inference for semiparametric Bayesian regression [5.488491124945426]
This paper introduces a simple, general, and efficient strategy for joint posterior inference of an unknown transformation and all regression model parameters.
It delivers (1) joint posterior consistency under general conditions, including multiple model misspecifications, and (2) efficient Monte Carlo (not Markov chain Monte Carlo) inference for the transformation and all parameters for important special cases.
arXiv Detail & Related papers (2023-06-08T18:42:42Z)
- Multielement polynomial chaos Kriging-based metamodelling for Bayesian inference of non-smooth systems [0.0]
This paper presents a surrogate modelling technique based on domain partitioning for Bayesian parameter inference of highly nonlinear engineering models.
The developed surrogate model combines, in a piecewise function, an array of local Polynomial Chaos based Kriging metamodels constructed on a finite set of non-overlapping subdomains of the input space.
The efficiency and accuracy of the proposed approach are validated through two case studies, including an analytical benchmark and a numerical case study.
arXiv Detail & Related papers (2022-12-05T13:22:39Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Transformation Models for Flexible Posteriors in Variational Bayes [0.0]
In neural networks, variational inference is widely used to approximate difficult-to-compute posteriors.
Transformation models are flexible enough to fit any distribution.
TM-VI allows complex posteriors in models with a single parameter to be approximated accurately.
arXiv Detail & Related papers (2021-06-01T14:43:47Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Slice Sampling for General Completely Random Measures [74.24975039689893]
We present a novel Markov chain Monte Carlo algorithm for posterior inference that adaptively sets the truncation level using auxiliary slice variables.
The efficacy of the proposed algorithm is evaluated on several popular nonparametric models.
arXiv Detail & Related papers (2020-06-24T17:53:53Z)
- Feature Transformation Ensemble Model with Batch Spectral Regularization for Cross-Domain Few-Shot Classification [66.91839845347604]
We propose an ensemble prediction model by performing diverse feature transformations after a feature extraction network.
We use a batch spectral regularization term to suppress the singular values of the feature matrix during pre-training to improve the generalization ability of the model.
The proposed model can then be fine-tuned in the target domain to address few-shot classification.
arXiv Detail & Related papers (2020-05-18T05:31:04Z)
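The batch spectral regularization idea in the last entry can be sketched in a few lines: penalize the singular values of a mini-batch feature matrix so that no single spectral direction dominates during pre-training. This is a hedged sketch, not the paper's implementation; the feature dimensions, batch size, and regularization weight below are illustrative assumptions, and the penalty shown (sum of squared singular values) is one common form of such a term.

```python
import numpy as np

def batch_spectral_penalty(features, weight=1e-3):
    """Sum of squared singular values of the mini-batch feature matrix,
    scaled by a regularization weight; added to the training loss."""
    singular_values = np.linalg.svd(features, compute_uv=False)
    return weight * np.sum(singular_values ** 2)

# Hypothetical mini-batch of 32 feature vectors of dimension 64.
rng = np.random.default_rng(0)
feats = rng.standard_normal((32, 64))
penalty = batch_spectral_penalty(feats)
```

Note that the sum of squared singular values equals the squared Frobenius norm of the feature matrix, which makes this particular variant cheap to compute even without an explicit SVD.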
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.