Beyond Unimodal: Generalising Neural Processes for Multimodal
Uncertainty Estimation
- URL: http://arxiv.org/abs/2304.01518v2
- Date: Mon, 23 Oct 2023 02:06:05 GMT
- Title: Beyond Unimodal: Generalising Neural Processes for Multimodal
Uncertainty Estimation
- Authors: Myong Chol Jung, He Zhao, Joanna Dipnall, Lan Du
- Abstract summary: Uncertainty estimation is an important research area to make deep neural networks (DNNs) more trustworthy.
We propose Multimodal Neural Processes (MNPs) by generalising NPs for multimodal uncertainty estimation.
Based on the framework of NPs, MNPs consist of several novel and principled mechanisms tailored to the characteristics of multimodal data.
- Score: 8.208132494639763
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Uncertainty estimation is an important research area to make deep neural
networks (DNNs) more trustworthy. While extensive research on uncertainty
estimation has been conducted with unimodal data, uncertainty estimation for
multimodal data remains a challenge. Neural processes (NPs) have been
demonstrated to be an effective uncertainty estimation method for unimodal data
by providing the reliability of Gaussian processes with efficient and powerful
DNNs. While NPs hold significant potential for multimodal uncertainty
estimation, the adaptation of NPs for multimodal data has not been carefully
studied. To bridge this gap, we propose Multimodal Neural Processes (MNPs) by
generalising NPs for multimodal uncertainty estimation. Based on the framework
of NPs, MNPs consist of several novel and principled mechanisms tailored to the
characteristics of multimodal data. In extensive empirical evaluation, our
method achieves state-of-the-art multimodal uncertainty estimation performance,
showing its appealing robustness against noisy samples and reliability in
out-of-distribution detection with faster computation time compared to the
current state-of-the-art multimodal uncertainty estimation method.
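The NP machinery that MNPs build on can be sketched in a few lines: a conditional-NP-style model encodes context pairs into a pooled representation and decodes a predictive mean and variance for each target. The toy numpy model below uses untrained random weights and is purely illustrative of the data flow, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(d_in, d_hidden, d_out):
    # Small two-layer MLP with fixed random (untrained) weights.
    return (rng.normal(size=(d_in, d_hidden)) * 0.5, np.zeros(d_hidden),
            rng.normal(size=(d_hidden, d_out)) * 0.5, np.zeros(d_out))

def mlp(params, x):
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

d_x, d_y, d_r = 1, 1, 8
encoder = init_mlp(d_x + d_y, 16, d_r)      # encodes one (x, y) context pair
decoder = init_mlp(d_r + d_x, 16, 2 * d_y)  # maps (r, x_target) -> (mu, log_sigma)

def cnp_predict(x_ctx, y_ctx, x_tgt):
    # Encode every context pair, then mean-pool into one representation r.
    r = mlp(encoder, np.concatenate([x_ctx, y_ctx], axis=-1)).mean(axis=0)
    r_rep = np.tile(r, (len(x_tgt), 1))
    out = mlp(decoder, np.concatenate([r_rep, x_tgt], axis=-1))
    mu, log_sigma = out[:, :d_y], out[:, d_y:]
    return mu, 0.01 + np.exp(log_sigma)     # enforce a positive predictive std

x_ctx = rng.uniform(-1, 1, size=(5, 1))
y_ctx = np.sin(3 * x_ctx)
x_tgt = np.linspace(-1, 1, 20)[:, None]
mu, sigma = cnp_predict(x_ctx, y_ctx, x_tgt)
```

The key NP property visible here is that predictions condition on an arbitrary-size context set through a permutation-invariant pooling step.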
Related papers
- One step closer to unbiased aleatoric uncertainty estimation [71.55174353766289]
We propose a new estimation method by actively de-noising the observed data.
By conducting a broad range of experiments, we demonstrate that our proposed approach provides a much closer approximation to the actual data uncertainty than the standard method.
arXiv Detail & Related papers (2023-12-16T14:59:11Z)
- Uncertainty in Natural Language Processing: Sources, Quantification, and Applications [56.130945359053776]
We provide a comprehensive review of uncertainty-relevant works in the NLP field.
We first categorize the sources of uncertainty in natural language into three types, including input, system, and output.
We discuss the challenges of uncertainty estimation in NLP and discuss potential future directions.
arXiv Detail & Related papers (2023-06-05T06:46:53Z)
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce the Fisher Information Matrix (FIM) to measure the informativeness of the evidence carried by each sample; this allows the objective's loss terms to be dynamically reweighted so that the network focuses its representation learning on uncertain classes.
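As an illustration of the FIM idea (not the paper's exact objective), the Fisher information of a Dirichlet distribution over class probabilities has a closed form, and its log-determinant can serve as a per-sample informativeness score:

```python
import numpy as np
from scipy.special import polygamma

def dirichlet_fim(alpha):
    # Fisher information matrix of Dirichlet(alpha):
    # I_ij = psi'(alpha_i) * delta_ij - psi'(alpha_0), with alpha_0 = sum(alpha).
    trigamma = lambda a: polygamma(1, a)
    return np.diag(trigamma(alpha)) - trigamma(alpha.sum())

def informativeness(alpha):
    # Log-determinant of the FIM as an illustrative informativeness score;
    # the actual I-EDL reweighting scheme is more elaborate.
    sign, logdet = np.linalg.slogdet(dirichlet_fim(alpha))
    return logdet

confident = np.array([10.0, 1.0, 1.0])   # evidence concentrated on one class
uncertain = np.array([1.1, 1.1, 1.1])    # nearly flat evidence
```

Flat (uncertain) evidence yields a larger Fisher information than concentrated evidence, which is what makes it usable as a reweighting signal.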
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
- Uncertainty-aware Multi-modal Learning via Cross-modal Random Network Prediction [22.786774541083652]
We propose a new Uncertainty-aware Multi-modal Learner that estimates uncertainty by measuring feature density via Cross-modal Random Network Prediction (CRNP).
CRNP is designed to require little adaptation to translate between different prediction tasks, while having a stable training process.
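The random-network-prediction idea can be sketched as follows: a predictor trained to imitate a fixed random target network has low error on feature regions that are densely covered by training data and high error elsewhere, so prediction error doubles as an uncertainty score. The linear predictor and Gaussian feature distributions below are hypothetical simplifications, not the paper's cross-modal setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random "target" network: never trained, only imitated.
W_target = rng.normal(size=(4, 8))
target = lambda z: np.tanh(z @ W_target)

# Fit a linear predictor on in-distribution features via least squares.
z_train = rng.normal(size=(500, 4))
W_pred, *_ = np.linalg.lstsq(z_train, target(z_train), rcond=None)

def novelty(z):
    # Mean squared prediction error as a feature-density / uncertainty proxy.
    return np.mean((z @ W_pred - target(z)) ** 2, axis=-1)

z_in = rng.normal(size=(100, 4))          # features like the training ones
z_out = rng.normal(size=(100, 4)) + 6.0   # shifted, "unseen" features
```

Shifted features fall outside the region the predictor was fit on, so their imitation error, and hence the novelty score, is larger.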
arXiv Detail & Related papers (2022-07-22T03:00:10Z)
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
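The Nadaraya-Watson estimate of the conditional label distribution is easy to write down: class probabilities at a query point are kernel-weighted label frequencies over the training set. The Gaussian kernel, bandwidth, and toy data below are illustrative choices, not the paper's configuration.

```python
import numpy as np

def nw_class_probs(x, X_train, y_train, n_classes, h=1.0):
    # Nadaraya-Watson estimate of p(c | x) with a Gaussian kernel of bandwidth h.
    w = np.exp(-np.sum((X_train - x) ** 2, axis=1) / (2 * h ** 2))
    probs = np.array([w[y_train == c].sum() for c in range(n_classes)])
    return probs / probs.sum()

rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(loc=-2, size=(50, 2)),   # class 0 cluster
                     rng.normal(loc=+2, size=(50, 2))])  # class 1 cluster
y_train = np.array([0] * 50 + [1] * 50)

p_easy = nw_class_probs(np.array([-2.0, -2.0]), X_train, y_train, 2)
p_hard = nw_class_probs(np.array([0.0, 0.0]), X_train, y_train, 2)
unc = lambda p: 1.0 - p.max()   # a simple confidence-based uncertainty score
```

A point deep inside one cluster gets a near-one-hot estimate, while a point on the class boundary gets a flatter one and therefore higher uncertainty.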
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma Distributions [91.63716984911278]
We introduce a novel Mixture of Normal-Inverse Gamma distributions (MoNIG) algorithm, which efficiently estimates uncertainty in principle for adaptive integration of different modalities and produces a trustworthy regression result.
Experimental results on both synthetic and different real-world data demonstrate the effectiveness and trustworthiness of our method on various multimodal regression tasks.
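A sketch of Normal-Inverse-Gamma fusion in this spirit: each modality outputs NIG parameters (mu, nu, alpha, beta), the per-modality evidence is combined, and the fused parameters yield both aleatoric and epistemic uncertainty. The fusion rule and the branch outputs below are illustrative; consult the paper for the exact MoNIG operator.

```python
import numpy as np

def nig_fuse(p, q):
    # Combine two NIG "opinions" (mu, nu, alpha, beta); a summation-style
    # rule used here for illustration, not the paper's verbatim operator.
    (m1, v1, a1, b1), (m2, v2, a2, b2) = p, q
    v = v1 + v2
    m = (m1 * v1 + m2 * v2) / v
    a = a1 + a2 + 0.5
    b = b1 + b2 + 0.5 * (v1 * (m1 - m) ** 2 + v2 * (m2 - m) ** 2)
    return m, v, a, b

def uncertainties(m, v, a, b):
    aleatoric = b / (a - 1)          # expected observation-noise variance
    epistemic = b / (v * (a - 1))    # variance of the predicted mean
    return aleatoric, epistemic

image_branch = (1.0, 2.0, 3.0, 1.5)  # hypothetical per-modality NIG outputs
text_branch = (1.2, 1.0, 2.5, 1.0)
m, v, a, b = nig_fuse(image_branch, text_branch)
al, ep = uncertainties(m, v, a, b)
```

Fusing modalities pools their virtual evidence (nu grows), so the fused epistemic uncertainty drops below that of either branch alone.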
arXiv Detail & Related papers (2021-11-11T14:28:12Z)
- Interval Deep Learning for Uncertainty Quantification in Safety Applications [0.0]
Current deep neural networks (DNNs) do not have an implicit mechanism to quantify and propagate significant input data uncertainty.
We present a DNN optimized with gradient-based methods that is capable of quantifying input and parameter uncertainty by means of interval analysis.
We show that the Deep Interval Neural Network (DINN) can produce accurate bounded estimates from uncertain input data.
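Interval propagation through a network is exact for affine layers and monotone activations: an input box [lo, hi] maps through x @ W + b in center/radius form, and ReLU is applied to the bounds directly. The two-layer toy network below is a hypothetical sketch of that bound-propagation step.

```python
import numpy as np

def interval_linear(lo, hi, W, b):
    # Propagate the box [lo, hi] through x @ W + b using
    # center/radius interval arithmetic (tight for affine maps).
    c, r = (lo + hi) / 2, (hi - lo) / 2
    c_out = c @ W + b
    r_out = r @ np.abs(W)
    return c_out - r_out, c_out + r_out

def interval_relu(lo, hi):
    # ReLU is monotone, so it can be applied to the bounds directly.
    return np.maximum(lo, 0), np.maximum(hi, 0)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), rng.normal(size=4)
W2, b2 = rng.normal(size=(4, 1)), rng.normal(size=1)

x = np.array([0.5, -0.2, 0.1])
eps = 0.1                                   # input uncertainty radius
lo, hi = interval_linear(x - eps, x + eps, W1, b1)
lo, hi = interval_relu(lo, hi)
lo, hi = interval_linear(lo, hi, W2, b2)

# Exact forward pass for the nominal input; it must lie inside [lo, hi].
y = np.maximum(x @ W1 + b1, 0) @ W2 + b2
```

Soundness is the point: the output of every input in the box, including the nominal one, is guaranteed to fall inside the propagated bounds.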
arXiv Detail & Related papers (2021-05-13T17:21:33Z)
- Multivariate Density Estimation with Deep Neural Mixture Models [0.0]
Deep neural networks (DNNs) have seldom been applied to density estimation.
This paper extends our previous work on Neural Mixture Models (NMMs).
A maximum-likelihood (ML) algorithm for estimating Deep NMMs (DNMMs) is presented.
The class of probability density functions that can be modeled to any degree of precision via DNMMs is formally defined.
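The density-evaluation building block that a DNN parameterises in such models is a plain mixture log-likelihood. The 1-D Gaussian mixture with fixed parameters below is an illustrative stand-in for the neural parameterisation.

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    # Log-density of a 1-D Gaussian mixture, computed stably with
    # log-sum-exp over the mixture components.
    comp = (-0.5 * ((x[:, None] - means) / stds) ** 2
            - np.log(stds) - 0.5 * np.log(2 * np.pi))
    return np.logaddexp.reduce(np.log(weights) + comp, axis=1)

weights = np.array([0.3, 0.7])   # illustrative mixture parameters; in a
means = np.array([-2.0, 1.0])    # DNMM these would be produced by a DNN
stds = np.array([0.5, 1.0])

x = np.array([-2.0, 1.0, 10.0])  # two points at the modes, one far away
logp = gmm_logpdf(x, weights, means, stds)
```

Maximum-likelihood training of such a model amounts to maximising the mean of `logp` over the data, whether the parameters are free variables or DNN outputs.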
arXiv Detail & Related papers (2020-12-06T23:03:48Z)
- Multi-Loss Sub-Ensembles for Accurate Classification with Uncertainty Estimation [1.2891210250935146]
We propose an efficient method for uncertainty estimation in deep neural networks (DNNs) achieving high accuracy.
We keep our inference time relatively low by leveraging the advantage proposed by the Deep-Sub-Ensembles method.
Our results show improved accuracy on the classification task and competitive results on several uncertainty measures.
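Ensemble-style uncertainty of the kind sub-ensembles approximate can be sketched by averaging member predictions and measuring the entropy of the mean: disagreement between members inflates the entropy. The logits below are hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_uncertainty(logits):
    # logits: (n_members, n_classes) for a single input.
    probs = softmax(logits)
    mean_p = probs.mean(axis=0)
    entropy = -(mean_p * np.log(mean_p)).sum()   # total predictive uncertainty
    return mean_p, entropy

# Three members agreeing on class 0 vs. three members each picking a
# different class (hypothetical logits).
agree = np.array([[4.0, 0.0, 0.0], [3.5, 0.1, 0.0], [4.2, 0.0, 0.1]])
disagree = np.array([[4.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 4.0]])
_, h_agree = ensemble_uncertainty(agree)
_, h_disagree = ensemble_uncertainty(disagree)
```

Sub-ensemble methods trade a little of this diversity for inference speed by sharing early layers and ensembling only the later ones.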
arXiv Detail & Related papers (2020-10-05T10:59:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.