Hybrid Bernstein Normalizing Flows for Flexible Multivariate Density Regression with Interpretable Marginals
- URL: http://arxiv.org/abs/2505.14164v2
- Date: Thu, 12 Jun 2025 14:10:40 GMT
- Title: Hybrid Bernstein Normalizing Flows for Flexible Multivariate Density Regression with Interpretable Marginals
- Authors: Marcel Arpogaus, Thomas Kneib, Thomas Nagler, David Rügamer
- Abstract summary: Density regression models allow a comprehensive understanding of data by modeling the complete conditional probability distribution. In this paper, we combine MCTM with state-of-the-art and autoregressive NF to leverage the transparency of MCTM for modeling interpretable feature effects. We demonstrate our method's versatility in various numerical experiments and compare it with MCTM and other NF models on both simulated and real-world data.
- Score: 3.669506968635671
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Density regression models allow a comprehensive understanding of data by modeling the complete conditional probability distribution. While flexible estimation approaches such as normalizing flows (NF) work particularly well in multiple dimensions, interpreting the input-output relationship of such models is often difficult, due to the black-box character of deep learning models. In contrast, existing statistical methods for multivariate outcomes such as multivariate conditional transformation models (MCTM) are restricted in flexibility and are often not expressive enough to represent complex multivariate probability distributions. In this paper, we combine MCTM with state-of-the-art and autoregressive NF to leverage the transparency of MCTM for modeling interpretable feature effects on the marginal distributions in the first step and the flexibility of neural-network-based NF techniques to account for complex and non-linear relationships in the joint data distribution. We demonstrate our method's versatility in various numerical experiments and compare it with MCTM and other NF models on both simulated and real-world data.
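To make the two-step construction concrete, here is a minimal sketch assuming toy bivariate data on [0, 1]^2: an interpretable, monotone Bernstein-polynomial transformation handles each marginal (the MCTM part), and a simple affine autoregressive step, standing in for the neural conditioner, models the dependence. All function names and parameter values are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.special import comb

def bernstein(y, theta):
    """Monotone transformation h(y) = sum_k theta_k * B_{k,M}(y) for y in [0, 1].

    Monotonicity is guaranteed when the coefficients theta are non-decreasing.
    """
    M = len(theta) - 1
    k = np.arange(M + 1)
    basis = comb(M, k) * y[:, None] ** k * (1.0 - y[:, None]) ** (M - k)
    return basis @ theta

rng = np.random.default_rng(0)
y = rng.uniform(0.01, 0.99, size=(1000, 2))        # toy bivariate observations
theta = np.cumsum(rng.uniform(0.1, 1.0, size=9))   # non-decreasing coefficients

# Step 1 (MCTM-style): transform each marginal with its own interpretable map.
z = np.column_stack([bernstein(y[:, j], theta) for j in range(2)])

# Step 2 (autoregressive flow, affine for brevity): z_2 is shifted and scaled
# conditionally on z_1; a neural network would produce these parameters in practice.
shift, log_scale = 0.5 * z[:, 0], 0.1 * z[:, 0]
z2 = (z[:, 1] - shift) * np.exp(-log_scale)
log_det_step2 = -log_scale                         # Jacobian term for the log-likelihood
```

Because the first step acts on each coordinate separately, the marginal feature effects remain readable, while the second step adds the flexibility needed for non-linear dependence in the joint distribution.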
Related papers
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging problem with interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z) - Beyond DAGs: A Latent Partial Causal Model for Multimodal Learning [80.44084021062105]
We propose a novel latent partial causal model for multimodal data, featuring two latent coupled variables, connected by an undirected edge, to represent the transfer of knowledge across modalities.
Under specific statistical assumptions, we establish an identifiability result, demonstrating that representations learned by multimodal contrastive learning correspond to the latent coupled variables up to a trivial transformation.
Experiments show that a pre-trained CLIP model embodies disentangled representations, enabling few-shot learning and improving domain generalization across diverse real-world datasets.
arXiv Detail & Related papers (2024-02-09T07:18:06Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - FiMReSt: Finite Mixture of Multivariate Regulated Skew-t Kernels -- A Flexible Probabilistic Model for Multi-Clustered Data with Asymmetrically-Scattered Non-Gaussian Kernels [0.0]
We propose a regularized iterative optimization process to train the mixture model, enhancing the generalizability and power for modeling the skews.
The resulting mixture model is named Finite Mixture of Multivariate Regulated Skew-t (FiMReSt).
To validate the performance, we have conducted a comprehensive experiment on several real-world datasets and a synthetic dataset.
arXiv Detail & Related papers (2023-05-15T23:53:59Z) - Normalizing Flow with Variational Latent Representation [20.038183566389794]
We propose a new framework based on variational latent representation to improve the practical performance of Normalizing Flow (NF).
The idea is to replace the standard normal latent variable with a more general latent representation, jointly learned via Variational Bayes.
The resulting method is significantly more powerful than the standard normalizing flow approach for generating data distributions with multiple modes.
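A hedged sketch of this idea, assuming a one-dimensional affine flow and a two-component Gaussian mixture as the learned latent (base) distribution; the parameters below are placeholders, not the paper's model:

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def log_prob(y, a, b, weights, means, scales):
    """log p(y) for y = a * z + b with z drawn from a 1D Gaussian mixture."""
    z = (y - b) / a                                       # invert the flow
    comp = np.log(weights) + norm.logpdf(z[:, None], means, scales)
    return logsumexp(comp, axis=1) - np.log(np.abs(a))    # change of variables

y = np.random.default_rng(1).normal(size=500)
ll = log_prob(y, a=2.0, b=0.3,
              weights=np.array([0.5, 0.5]),
              means=np.array([-1.0, 1.0]),
              scales=np.array([0.7, 0.7]))
```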
arXiv Detail & Related papers (2022-11-21T16:51:49Z) - Learning Multivariate CDFs and Copulas using Tensor Factorization [39.24470798045442]
Learning the multivariate distribution of data is a core challenge in statistics and machine learning.
In this work, we aim to learn multivariate cumulative distribution functions (CDFs), as they can handle mixed random variables.
We show that any grid sampled version of a joint CDF of mixed random variables admits a universal representation as a naive Bayes model.
We demonstrate the superior performance of the proposed model in several synthetic and real datasets and applications including regression, sampling and data imputation.
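As a rough illustration of the naive Bayes representation (not the paper's estimator), a grid-sampled bivariate CDF can be written as a mixture over a latent component of products of univariate CDFs; the Gaussian components below are arbitrary toy choices:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
R = 3                                      # number of latent components (CP rank)
p_h = np.array([0.5, 0.3, 0.2])            # mixture weights p(h)
grid = np.linspace(-3.0, 3.0, 25)          # common evaluation grid for both variables

# Component-wise univariate CDFs sampled on the grid, shape (25, R).
F1 = norm.cdf(grid[:, None], loc=rng.normal(size=R))
F2 = norm.cdf(grid[:, None], loc=rng.normal(size=R))

# Joint CDF on the grid: F(x1, x2) = sum_h p(h) * F1(x1 | h) * F2(x2 | h).
F_joint = np.einsum('r,ir,jr->ij', p_h, F1, F2)
print(F_joint[-1, -1])                     # approaches 1 at the upper corner of the grid
```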
arXiv Detail & Related papers (2022-10-13T16:18:46Z) - Mixture of experts models for multilevel data: modelling framework and approximation theory [0.0]
We study a class of mixed MoE (MMoE) models for multilevel data.
The MMoE has the potential to accurately capture almost all characteristics inherent in multilevel data.
A nested version of the MMoE universally approximates a broad range of dependence structures of the random effects among different factor levels.
arXiv Detail & Related papers (2022-09-30T03:29:32Z) - Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma Distributions [91.63716984911278]
We introduce a novel Mixture of Normal-Inverse Gamma distributions (MoNIG) algorithm, which efficiently estimates uncertainty in principle for adaptive integration of different modalities and produces a trustworthy regression result.
Experimental results on both synthetic and different real-world data demonstrate the effectiveness and trustworthiness of our method on various multimodal regression tasks.
arXiv Detail & Related papers (2021-11-11T14:28:12Z) - Marginalizable Density Models [14.50261153230204]
We present a novel deep network architecture which provides closed form expressions for the probabilities, marginals and conditionals of any subset of the variables.
The model also allows for parallelized sampling with only a logarithmic dependence of the time complexity on the number of variables.
arXiv Detail & Related papers (2021-06-08T23:54:48Z) - Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the mapping from base density to output space is conditioned on an input x, in order to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
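A minimal sketch of the conditional-flow idea: an affine map whose parameters depend on x, with simple linear conditioners standing in for the neural networks used in practice (all names and values are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

def cond_log_prob(y, x, w_mu=0.7, w_logs=0.2):
    """log p(y | x) with y = mu(x) + exp(log_sigma(x)) * z and z ~ N(0, 1)."""
    mu, log_sigma = w_mu * x, w_logs * x      # toy linear conditioners
    z = (y - mu) * np.exp(-log_sigma)         # invert the conditional transform
    return norm.logpdf(z) - log_sigma         # base log-density plus log |dz/dy|

x = np.linspace(-2.0, 2.0, 5)
y = 0.8 * x + 0.1
print(cond_log_prob(y, x))
```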
This list is automatically generated from the titles and abstracts of the papers in this site.