Meta-learning for Out-of-Distribution Detection via Density Estimation
in Latent Space
- URL: http://arxiv.org/abs/2206.09543v1
- Date: Mon, 20 Jun 2022 02:44:42 GMT
- Title: Meta-learning for Out-of-Distribution Detection via Density Estimation
in Latent Space
- Authors: Tomoharu Iwata, Atsutoshi Kumagai
- Abstract summary: We propose a simple yet effective meta-learning method to detect OoD instances with only a small amount of in-distribution data in a target task.
A neural network shared among all tasks is used to flexibly map instances in the original space to the latent space.
In experiments using six datasets, we demonstrate that the proposed method achieves better performance than existing meta-learning and OoD detection methods.
- Score: 40.58524521473793
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many neural network-based out-of-distribution (OoD) detection methods have
been proposed. However, they require a large amount of training data for each target
task. We propose a simple yet effective meta-learning method to detect OoD instances
with only a small amount of in-distribution data in a target task. With the proposed
method, the OoD
detection is performed by density estimation in a latent space. A neural
network shared among all tasks is used to flexibly map instances in the
original space to the latent space. The neural network is meta-learned such
that the expected OoD detection performance is improved by using various tasks
that are different from the target tasks. This meta-learning procedure enables
us to obtain appropriate representations in the latent space for OoD detection.
For density estimation, we use a Gaussian mixture model (GMM) with full
covariance for each class. We can adapt the GMM parameters to in-distribution
data in each task in a closed form by maximizing the likelihood. Since the
closed form solution is differentiable, we can meta-learn the neural network
efficiently with a stochastic gradient descent method by incorporating the
solution into the meta-learning objective function. In experiments using six
datasets, we demonstrate that the proposed method achieves better performance
than existing meta-learning and OoD detection methods.
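The density-estimation step the abstract describes can be sketched as below. This is a minimal illustration, not the paper's implementation: the meta-learned mapping network and episodic training loop are omitted (latent vectors `z` are assumed given), the per-class GMM is simplified to a single full-covariance Gaussian per class, and all names are illustrative. It shows why the closed-form maximum-likelihood fit is attractive: the adapted parameters are simple functions of the support set.

```python
import numpy as np

def fit_class_gaussians(z, y):
    """Closed-form ML fit of a full-covariance Gaussian per class,
    adapted to the in-distribution support set's latent vectors z."""
    params = {}
    for c in np.unique(y):
        zc = z[y == c]
        mu = zc.mean(axis=0)
        diff = zc - mu
        # Small ridge keeps the covariance invertible for tiny support sets.
        cov = diff.T @ diff / len(zc) + 1e-6 * np.eye(z.shape[1])
        params[c] = (mu, cov)
    return params

def ood_score(z_test, params):
    """Negative max class log-density: higher means more likely OoD."""
    scores = []
    for mu, cov in params.values():
        d = z_test - mu
        _, logdet = np.linalg.slogdet(cov)
        ll = -0.5 * (d @ np.linalg.solve(cov, d) + logdet
                     + len(mu) * np.log(2 * np.pi))
        scores.append(ll)
    return -max(scores)
```

Because every operation here (means, covariances, solves, log-determinants) is differentiable in `z`, gradients can flow back through this adaptation into the shared network, which is what makes the meta-learning objective trainable with SGD.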
Related papers
- Task-Distributionally Robust Data-Free Meta-Learning [99.56612787882334]
Data-Free Meta-Learning (DFML) aims to efficiently learn new tasks by leveraging multiple pre-trained models without requiring their original training data.
For the first time, we reveal two major challenges hindering their practical deployment: Task-Distribution Shift (TDS) and Task-Distribution Corruption (TDC).
arXiv Detail & Related papers (2023-11-23T15:46:54Z)
- Meta Learning Low Rank Covariance Factors for Energy-Based Deterministic Uncertainty [58.144520501201995]
Bi-Lipschitz regularization of neural network layers preserve relative distances between data instances in the feature spaces of each layer.
With the use of an attentive set encoder, we propose to meta learn either diagonal or diagonal plus low-rank factors to efficiently construct task specific covariance matrices.
We also propose an inference procedure which utilizes scaled energy to achieve a final predictive distribution.
arXiv Detail & Related papers (2021-10-12T22:04:19Z)
- Energy-Efficient and Federated Meta-Learning via Projected Stochastic Gradient Ascent [79.58680275615752]
We propose an energy-efficient federated meta-learning framework.
We assume each task is owned by a separate agent, so only a limited number of tasks is available to train the meta-model.
arXiv Detail & Related papers (2021-05-31T08:15:44Z)
- Meta-learning One-class Classifiers with Eigenvalue Solvers for Supervised Anomaly Detection [55.888835686183995]
We propose a neural network-based meta-learning method for supervised anomaly detection.
We experimentally demonstrate that the proposed method achieves better performance than existing anomaly detection and few-shot learning methods.
arXiv Detail & Related papers (2021-03-01T01:43:04Z)
- Entropy Maximization and Meta Classification for Out-Of-Distribution Detection in Semantic Segmentation [7.305019142196585]
"Out-of-distribution" (OoD) samples are crucial for many applications such as automated driving.
A natural baseline approach to OoD detection is to threshold on the pixel-wise softmax entropy.
We present a two-step procedure that significantly improves that approach.
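The baseline this blurb mentions, thresholding the pixel-wise softmax entropy, can be sketched as follows. Shapes and names are illustrative assumptions, and the paper's two-step improvement is not reproduced here.

```python
import numpy as np

def softmax_entropy(logits):
    """Pixel-wise entropy of softmax probabilities.
    logits: array of shape (num_classes, H, W); returns an (H, W) entropy map."""
    z = logits - logits.max(axis=0, keepdims=True)  # numerical stability
    p = np.exp(z)
    p /= p.sum(axis=0, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=0)

def ood_mask(logits, threshold):
    """Flag pixels whose softmax entropy exceeds the threshold as OoD."""
    return softmax_entropy(logits) > threshold
```

Uniform logits at a pixel yield the maximum entropy log(num_classes), while a confidently classified pixel yields entropy near zero, which is what makes a single threshold a workable baseline.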
arXiv Detail & Related papers (2020-12-09T11:01:06Z)
- MetaGater: Fast Learning of Conditional Channel Gated Networks via Federated Meta-Learning [46.79356071007187]
We propose a holistic approach to jointly train the backbone network and the channel gating.
We develop a federated meta-learning approach to jointly learn good meta-initializations for both backbone networks and gating modules.
arXiv Detail & Related papers (2020-11-25T04:26:23Z)
- Task-agnostic Out-of-Distribution Detection Using Kernel Density Estimation [10.238403787504756]
We propose a task-agnostic method to perform out-of-distribution (OOD) detection in deep neural networks (DNNs).
We estimate the probability density functions (pdfs) of intermediate features of a pre-trained DNN by performing kernel density estimation (KDE) on the training dataset.
At test time, we evaluate the pdfs on a test sample and produce a confidence score that indicates whether the sample is OOD.
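The KDE idea behind this blurb can be sketched as a Gaussian kernel density estimate over training features, evaluated at a test feature vector. This is a rough illustration under assumptions: features from the pre-trained DNN are treated as plain vectors, the bandwidth value and function name are invented here, and the paper's per-layer combination of scores is not shown.

```python
import numpy as np

def kde_logpdf(features_train, x, bandwidth=0.5):
    """Gaussian-kernel density estimate of the training-feature pdf,
    evaluated at test feature vector x via a stable log-mean-exp."""
    n, d = features_train.shape
    diff = features_train - x                       # (n, d)
    sq = (diff ** 2).sum(axis=1) / bandwidth ** 2   # squared scaled distances
    log_kernels = (-0.5 * sq - d * np.log(bandwidth)
                   - 0.5 * d * np.log(2 * np.pi))
    m = log_kernels.max()
    return m + np.log(np.exp(log_kernels - m).mean())
```

A low log-density relative to the training data then serves as the low-confidence signal: features far from everything seen in training score much lower than in-distribution features.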
arXiv Detail & Related papers (2020-06-18T17:46:06Z)
- Diversity inducing Information Bottleneck in Model Ensembles [73.80615604822435]
In this paper, we target the problem of generating effective ensembles of neural networks by encouraging diversity in prediction.
We explicitly optimize a diversity inducing adversarial loss for learning latent variables and thereby obtain diversity in the output predictions necessary for modeling multi-modal data.
Compared to the most competitive baselines, we show significant improvements in classification accuracy, under a shift in the data distribution.
arXiv Detail & Related papers (2020-03-10T03:10:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.