Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma
Distributions
- URL: http://arxiv.org/abs/2111.08456v1
- Date: Thu, 11 Nov 2021 14:28:12 GMT
- Title: Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma
Distributions
- Authors: Huan Ma, Zongbo Han, Changqing Zhang, Huazhu Fu, Joey Tianyi Zhou,
Qinghua Hu
- Abstract summary: We introduce a novel Mixture of Normal-Inverse Gamma distributions (MoNIG) algorithm, which estimates uncertainty in a principled and efficient manner for the adaptive integration of different modalities and produces trustworthy regression results.
Experimental results on both synthetic and different real-world data demonstrate the effectiveness and trustworthiness of our method on various multimodal regression tasks.
- Score: 91.63716984911278
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multimodal regression is a fundamental task that integrates information
from different sources to improve the performance of follow-up applications.
However, existing methods mainly focus on improving performance and often
ignore the confidence of predictions in diverse situations. In this study, we
focus on trustworthy multimodal regression, which is critical in cost-sensitive
domains. To this end, we introduce a novel Mixture of Normal-Inverse Gamma
distributions (MoNIG) algorithm, which estimates uncertainty in a principled
and efficient manner for the adaptive integration of different modalities and
produces trustworthy regression results. Our model is dynamically aware of the
uncertainty of each modality and is robust to corrupted modalities.
Furthermore, the proposed MoNIG explicitly represents both modality-specific
and global epistemic and aleatoric uncertainties. Experimental results on both
synthetic and real-world data demonstrate the effectiveness and trustworthiness
of our method on various multimodal regression tasks (e.g., temperature
prediction for superconductivity, relative location prediction for CT slices,
and multimodal sentiment analysis).
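To make the uncertainty decomposition concrete, below is a minimal Python sketch (not the authors' implementation) of how per-modality Normal-Inverse Gamma outputs yield aleatoric and epistemic uncertainty, using the standard evidential-regression parameterization NIG(gamma, nu, alpha, beta), where aleatoric uncertainty is beta/(alpha-1) and epistemic uncertainty is beta/(nu*(alpha-1)). The `fuse_nig` helper is a simple evidence-weighted illustration assumed for exposition; the paper defines its own NIG mixture/summation operator.

```python
# A minimal sketch (not the paper's implementation) of how per-modality
# Normal-Inverse Gamma (NIG) outputs yield aleatoric and epistemic
# uncertainty, using the standard evidential-regression parameterization
# NIG(gamma, nu, alpha, beta). The `fuse_nig` rule below is a simple
# evidence-weighted illustration, not MoNIG's exact summation operator.
from dataclasses import dataclass
from typing import List

@dataclass
class NIG:
    gamma: float  # predicted mean
    nu: float     # virtual observations supporting the mean (nu > 0)
    alpha: float  # shape of the inverse-gamma component (alpha > 1)
    beta: float   # scale of the inverse-gamma component (beta > 0)

    @property
    def aleatoric(self) -> float:
        # E[sigma^2] = beta / (alpha - 1): noise inherent in the data
        return self.beta / (self.alpha - 1)

    @property
    def epistemic(self) -> float:
        # Var[mu] = beta / (nu * (alpha - 1)): uncertainty of the model itself
        return self.beta / (self.nu * (self.alpha - 1))

def fuse_nig(modalities: List[NIG]) -> NIG:
    """Illustrative evidence-weighted fusion of per-modality NIG heads.

    Modalities with more evidence (larger nu) contribute more to the fused
    mean; a corrupted modality producing little evidence is automatically
    down-weighted.
    """
    total_nu = sum(m.nu for m in modalities)
    gamma = sum(m.nu * m.gamma for m in modalities) / total_nu
    # Pool the inverse-gamma evidence (illustrative choice).
    alpha = sum(m.alpha for m in modalities) - (len(modalities) - 1)
    beta = sum(m.beta for m in modalities)
    return NIG(gamma=gamma, nu=total_nu, alpha=alpha, beta=beta)

# Example: a confident modality and a noisy (e.g., corrupted) one.
clean = NIG(gamma=2.0, nu=10.0, alpha=5.0, beta=1.0)
noisy = NIG(gamma=5.0, nu=0.5, alpha=2.0, beta=4.0)
fused = fuse_nig([clean, noisy])
print(f"fused mean={fused.gamma:.2f}, "
      f"aleatoric={fused.aleatoric:.2f}, epistemic={fused.epistemic:.3f}")
```

Because the epistemic term shrinks as the evidence nu grows, a modality with little supporting evidence contributes little to the fused prediction, which is the intuition behind uncertainty-aware (trustworthy) fusion.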
Related papers
- Calibrated Multivariate Regression with Localized PIT Mappings [4.277516034244117]
This paper introduces a novel post-hoc recalibration approach that addresses multivariate calibration for potentially misspecified models.
We present two versions of our approach: one uses K-nearest neighbors, and the other uses normalizing flows.
We demonstrate the effectiveness of our approach on two real data applications: recalibrating a deep neural network's currency exchange rate forecast and improving a regression model for childhood malnutrition in India.
arXiv Detail & Related papers (2024-09-17T02:41:03Z)
- Multivariate Stochastic Dominance via Optimal Transport and Applications to Models Benchmarking [21.23500484100963]
We introduce a statistic that assesses almost stochastic dominance under the framework of Optimal Transport with a smooth cost.
We also propose a hypothesis testing framework as well as an efficient implementation using the Sinkhorn algorithm.
We showcase our method in comparing and benchmarking Large Language Models that are evaluated on multiple metrics.
arXiv Detail & Related papers (2024-06-10T16:14:50Z)
- Confidence-aware multi-modality learning for eye disease screening [58.861421804458395]
We propose a novel multi-modality evidential fusion pipeline for eye disease screening.
It provides a measure of confidence for each modality and elegantly integrates the multi-modality information.
Experimental results on both public and internal datasets demonstrate that our model excels in robustness.
arXiv Detail & Related papers (2024-05-28T13:27:30Z)
- Towards Understanding Variants of Invariant Risk Minimization through the Lens of Calibration [0.6906005491572401]
We show that Information Bottleneck-based IRM achieves consistent calibration across different environments.
Our empirical evidence indicates that models exhibiting consistent calibration across environments are also well-calibrated.
arXiv Detail & Related papers (2024-01-31T02:08:43Z)
- The Risk of Federated Learning to Skew Fine-Tuning Features and Underperform Out-of-Distribution Robustness [50.52507648690234]
Federated learning has the risk of skewing fine-tuning features and compromising the robustness of the model.
We introduce three robustness indicators and conduct experiments across diverse robust datasets.
Our approach markedly enhances the robustness across diverse scenarios, encompassing various parameter-efficient fine-tuning methods.
arXiv Detail & Related papers (2024-01-25T09:18:51Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting nonstationary processes or processes with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate the underlying tessellation and approximate the multiple-hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- Integrating Large Pre-trained Models into Multimodal Named Entity Recognition with Evidential Fusion [31.234455370113075]
We propose incorporating uncertainty estimation into the MNER task, producing trustworthy predictions.
Our proposed algorithm models the distribution of each modality as a Normal-inverse Gamma distribution, and fuses them into a unified distribution.
Experiments on two datasets demonstrate that our proposed method outperforms the baselines and achieves new state-of-the-art performance.
arXiv Detail & Related papers (2023-06-29T14:50:23Z)
- Trusted Multi-View Classification with Dynamic Evidential Fusion [73.35990456162745]
We propose a novel multi-view classification algorithm, termed trusted multi-view classification (TMC).
TMC provides a new paradigm for multi-view learning by dynamically integrating different views at an evidence level (see the combination-rule sketch after this list).
Both theoretical and experimental results validate the effectiveness of the proposed model in terms of accuracy, robustness, and trustworthiness.
arXiv Detail & Related papers (2022-04-25T03:48:49Z)
- A Unified Framework for Multi-distribution Density Ratio Estimation [101.67420298343512]
Binary density ratio estimation (DRE) provides the foundation for many state-of-the-art machine learning algorithms.
We develop a general framework from the perspective of Bregman divergence minimization.
We show that our framework leads to methods that strictly generalize their counterparts in binary DRE.
arXiv Detail & Related papers (2021-12-07T01:23:20Z)
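As referenced in the Trusted Multi-View Classification entry above, the sketch below illustrates evidence-level fusion of two views with a reduced Dempster-Shafer combination rule (per-class belief masses plus an overall uncertainty mass per view). The helpers `opinion_from_evidence` and `combine_opinions` are hypothetical names, and the sketch captures the general technique under stated assumptions rather than TMC's exact implementation.

```python
# A minimal sketch of evidence-level fusion of two views via a reduced
# Dempster-Shafer combination rule, in the spirit of the Trusted
# Multi-View Classification entry above. The helpers are hypothetical;
# the exact rule used by TMC may differ in detail.
import numpy as np

def opinion_from_evidence(evidence: np.ndarray):
    """Turn non-negative per-class evidence into belief masses b and
    an overall uncertainty mass u, with b.sum() + u == 1."""
    num_classes = evidence.shape[0]
    alpha = evidence + 1.0            # Dirichlet parameters
    strength = alpha.sum()
    belief = evidence / strength      # per-class belief masses
    uncertainty = num_classes / strength
    return belief, uncertainty

def combine_opinions(b1, u1, b2, u2):
    """Reduced Dempster's rule: agreeing views reinforce each other,
    while conflicting mass is measured and rescaled away."""
    # Conflict: total mass the two views assign to different classes.
    conflict = np.sum(np.outer(b1, b2)) - np.sum(b1 * b2)
    scale = 1.0 - conflict
    b = (b1 * b2 + b1 * u2 + b2 * u1) / scale
    u = (u1 * u2) / scale
    return b, u

# Example: one confident view and one nearly uninformative (noisy) view.
b1, u1 = opinion_from_evidence(np.array([9.0, 1.0, 0.0]))
b2, u2 = opinion_from_evidence(np.array([0.2, 0.3, 0.1]))
b, u = combine_opinions(b1, u1, b2, u2)
print("fused beliefs:", np.round(b, 3), "fused uncertainty:", round(float(u), 3))
```

Agreeing views reinforce each other and shrink the fused uncertainty, while a view that places most of its mass on uncertainty leaves the fused belief largely unchanged, which is how a corrupted view is discounted.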