Uncertainty-Aware Few-Shot Image Classification
- URL: http://arxiv.org/abs/2010.04525v2
- Date: Thu, 3 Jun 2021 13:48:34 GMT
- Title: Uncertainty-Aware Few-Shot Image Classification
- Authors: Zhizheng Zhang, Cuiling Lan, Wenjun Zeng, Zhibo Chen, Shih-Fu Chang
- Abstract summary: Few-shot image classification learns to recognize new categories from limited labelled data.
We propose an Uncertainty-Aware Few-Shot framework for image classification.
- Score: 118.72423376789062
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Few-shot image classification learns to recognize new categories from limited
labelled data. Metric learning based approaches have been widely investigated,
where a query sample is classified by finding the nearest prototype from the
support set based on their feature similarities. A neural network has
different degrees of uncertainty in the similarities it computes for
different pairs. Understanding and modeling this uncertainty can promote
better exploitation of the limited samples during few-shot optimization. In
this work, we propose an Uncertainty-Aware Few-Shot framework for image
classification that models the uncertainty of the similarities of
query-support pairs and performs uncertainty-aware optimization. In
particular, we exploit this uncertainty by converting observed similarities
into probabilistic representations and incorporating them into the loss for
more effective optimization. In order to
jointly consider the similarities between a query and the prototypes in a
support set, a graph-based model is utilized to estimate the uncertainty of the
pairs. Extensive experiments show that our proposed method brings significant
improvements on top of a strong baseline and achieves state-of-the-art
performance.
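As a rough, self-contained sketch of the core idea (not the authors' exact formulation), the snippet below treats each observed query-prototype similarity as the mean of a Gaussian with a predicted variance and applies the classification loss to sampled similarities; the variance head, the reparameterized sampling, and the regularization weight are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def uncertainty_aware_loss(similarities, log_vars, target, reg_weight=0.1):
    """Hedged sketch: treat each query-prototype similarity as the mean of a
    Gaussian whose log-variance is predicted by an auxiliary head, sample a
    probabilistic similarity, and apply the classification loss to the sample.

    similarities: (num_query, num_way) similarities to the class prototypes
    log_vars:     (num_query, num_way) predicted log-variance per pair
    target:       (num_query,) ground-truth class indices
    """
    # Reparameterized sample: the loss sees the uncertainty, not just the point estimate.
    std = torch.exp(0.5 * log_vars)
    probabilistic_sim = similarities + std * torch.randn_like(std)

    # Standard episodic classification loss on the sampled similarities.
    ce = F.cross_entropy(probabilistic_sim, target)

    # Mild penalty on large predicted variances keeps the sampled similarities
    # close to the observed ones.
    return ce + reg_weight * log_vars.mean()

# Toy usage: a 5-way episode with 15 query samples.
sims = torch.randn(15, 5)
log_vars = torch.zeros(15, 5, requires_grad=True)
target = torch.randint(0, 5, (15,))
loss = uncertainty_aware_loss(sims, log_vars, target)
loss.backward()
```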
Related papers
- Bayesian Hierarchical Models for Counterfactual Estimation [12.159830463756341]
We propose a probabilistic paradigm to estimate a diverse set of counterfactuals.
We treat the perturbations as random variables endowed with prior distribution functions.
A gradient-based sampler with superior convergence characteristics efficiently computes the posterior samples.
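A minimal sketch of that recipe under simplifying assumptions: the perturbation is given a Gaussian prior and posterior samples are drawn with unadjusted Langevin dynamics as the gradient-based sampler; the binary classifier, prior scale, and step size are illustrative, not the paper's hierarchical model.

```python
import torch
import torch.nn.functional as F

def sample_counterfactual(x, model, target=1, n_steps=200, step=1e-2, prior_scale=1.0):
    """Hedged sketch: treat the perturbation delta as a random variable with a
    Gaussian prior and draw posterior samples with unadjusted Langevin dynamics,
    a simple gradient-based sampler (not necessarily the paper's sampler)."""
    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(n_steps):
        # Unnormalized log-posterior: log p(target | x + delta) + log p(delta).
        logit = model(x + delta)
        log_lik = -F.binary_cross_entropy_with_logits(
            logit, torch.full_like(logit, float(target)))
        log_prior = -(delta ** 2).sum() / (2 * prior_scale ** 2)
        grad, = torch.autograd.grad(log_lik + log_prior, delta)
        with torch.no_grad():
            # Langevin step: gradient ascent on the log-posterior plus noise.
            delta += step * grad + (2 * step) ** 0.5 * torch.randn_like(delta)
    return (x + delta).detach()

# Toy usage: a linear "classifier" and a single feature vector.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
x = torch.randn(1, 4)
counterfactual = sample_counterfactual(x, model, target=1)
```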
arXiv Detail & Related papers (2023-01-21T00:21:11Z)
- Composed Image Retrieval with Text Feedback via Multi-grained Uncertainty Regularization [73.04187954213471]
We introduce a unified learning approach that simultaneously models coarse- and fine-grained retrieval.
The proposed method achieves gains of +4.03%, +3.38%, and +2.40% in Recall@50 over a strong baseline.
arXiv Detail & Related papers (2022-11-14T14:25:40Z)
- Uncertainty-based Network for Few-shot Image Classification [17.912365063048263]
We propose an Uncertainty-Based Network, which models the uncertainty of classification results with the help of mutual information.
We show that the Uncertainty-Based Network achieves classification accuracy comparable to state-of-the-art methods.
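A common way to obtain a mutual-information uncertainty estimate, shown here only as an illustration (the paper's construction may differ), is Monte-Carlo dropout: mutual information is the entropy of the averaged prediction minus the average entropy of the individual predictions.

```python
import torch

def mutual_information(probs):
    """probs: (num_samples, batch, num_classes) class probabilities from
    several stochastic forward passes (e.g. with dropout kept active).
    Returns the per-example mutual information: the entropy of the mean
    prediction minus the mean of the per-pass entropies."""
    eps = 1e-12
    mean_p = probs.mean(dim=0)                                      # (batch, C)
    entropy_of_mean = -(mean_p * (mean_p + eps).log()).sum(-1)      # total uncertainty
    mean_entropy = -(probs * (probs + eps).log()).sum(-1).mean(0)   # expected per-pass entropy
    return entropy_of_mean - mean_entropy                           # epistemic part

# Toy usage: 8 stochastic passes over a batch of 4 queries, 5 classes.
probs = torch.softmax(torch.randn(8, 4, 5), dim=-1)
print(mutual_information(probs))
```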
arXiv Detail & Related papers (2022-05-17T07:49:32Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning, showing that random augmentations lead to stochastic encoders.
Our method represents each node by a distribution in the latent space, in contrast to existing techniques that embed each node as a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
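A minimal sketch of the "node as a distribution" idea, with a plain MLP standing in for the graph encoder and arbitrary dimensions; only the mean/log-variance heads and the reparameterized sampling are the point of the example.

```python
import torch
import torch.nn as nn

class StochasticNodeEncoder(nn.Module):
    """Hedged sketch: embed every node as a diagonal Gaussian in latent space
    instead of a single deterministic vector. A real implementation would use a
    graph neural network over node features and edges; a plain MLP stands in
    here to keep the example self-contained."""

    def __init__(self, in_dim, hidden_dim, latent_dim):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.mu_head = nn.Linear(hidden_dim, latent_dim)
        self.logvar_head = nn.Linear(hidden_dim, latent_dim)

    def forward(self, node_features):
        h = self.backbone(node_features)
        mu, logvar = self.mu_head(h), self.logvar_head(h)
        # Reparameterization trick: one sample per node, differentiable w.r.t. mu/logvar.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return z, mu, logvar

# Toy usage: 10 nodes with 16-dimensional features.
encoder = StochasticNodeEncoder(16, 32, 8)
z, mu, logvar = encoder(torch.randn(10, 16))
```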
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present the Enhanced Probabilistic Dense Correspondence Network, PDC-Net+, capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
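As a simplified stand-in for probabilistic correspondence estimation (PDC-Net+ itself uses a richer parameterization), the sketch below scores a dense flow prediction with a per-pixel Gaussian negative log-likelihood, so a predicted confidence (variance) map attenuates the regression error.

```python
import torch

def gaussian_flow_nll(pred_flow, log_var, gt_flow):
    """Hedged sketch: per-pixel Gaussian negative log-likelihood for a dense
    flow/correspondence field; only illustrates predicting a confidence map
    alongside the correspondences, not the paper's exact model.

    pred_flow, gt_flow: (B, 2, H, W) predicted / ground-truth displacements
    log_var:            (B, 1, H, W) predicted log-variance per pixel
    """
    sq_err = ((pred_flow - gt_flow) ** 2).sum(dim=1, keepdim=True)  # (B, 1, H, W)
    # Errors are down-weighted where the network predicts high variance,
    # at the cost of the log-variance term.
    nll = 0.5 * (sq_err * torch.exp(-log_var) + 2 * log_var)
    return nll.mean()

# Toy usage.
pred = torch.randn(2, 2, 8, 8, requires_grad=True)
log_var = torch.zeros(2, 1, 8, 8, requires_grad=True)
gt = torch.randn(2, 2, 8, 8)
loss = gaussian_flow_nll(pred, log_var, gt)
loss.backward()
```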
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- Path Integrals for the Attribution of Model Uncertainties [0.18899300124593643]
We present a novel algorithm that relies on in-distribution curves connecting a feature vector to some counterfactual counterpart.
We validate our approach on benchmark image data sets with varying resolution, and show that it significantly simplifies interpretability.
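A minimal sketch of attributing a scalar uncertainty by integrating its gradient along a curve from a counterfactual input to the input itself; the straight-line path and the toy entropy-based uncertainty are simplifying assumptions (the paper advocates in-distribution curves instead).

```python
import torch

def path_attribution(uncertainty_fn, x, x_counterfactual, n_steps=50):
    """Hedged sketch: attribute a scalar model uncertainty to input features by
    integrating its gradient along a path from the counterfactual input to x.
    A straight-line path is used for simplicity.

    uncertainty_fn: maps an input tensor to a scalar uncertainty
    x, x_counterfactual: tensors of identical shape
    """
    total = torch.zeros_like(x)
    for i in range(n_steps):
        alpha = (i + 0.5) / n_steps  # midpoint rule along the path
        point = (x_counterfactual + alpha * (x - x_counterfactual)).requires_grad_(True)
        grad, = torch.autograd.grad(uncertainty_fn(point), point)
        total += grad
    # Riemann-sum approximation of the path integral of the gradient.
    return (x - x_counterfactual) * total / n_steps

# Toy usage: "uncertainty" = predictive entropy of a small classifier.
torch.manual_seed(0)
net = torch.nn.Linear(4, 3)
def entropy(inp):
    p = torch.softmax(net(inp), dim=-1)
    return -(p * p.clamp_min(1e-12).log()).sum()
x, x_cf = torch.randn(1, 4), torch.randn(1, 4)
print(path_attribution(entropy, x, x_cf))
```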
arXiv Detail & Related papers (2021-07-19T11:07:34Z)
- Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
arXiv Detail & Related papers (2020-09-01T19:12:11Z)
- Achieving Equalized Odds by Resampling Sensitive Attributes [13.114114427206678]
We present a flexible framework for learning predictive models that approximately satisfy the equalized odds notion of fairness.
This differentiable functional is used as a penalty driving the model parameters towards equalized odds.
We develop a formal hypothesis test to detect whether a prediction rule violates this property, the first such test in the literature.
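A rough sketch of what a differentiable equalized-odds penalty can look like: within each true label, group-conditional mean scores are pulled toward the label-conditional mean. This first-moment matching is an illustrative stand-in, not the paper's resampling-based functional.

```python
import torch
import torch.nn.functional as F

def equalized_odds_penalty(scores, labels, groups):
    """Hedged sketch of a differentiable penalty pushing toward equalized odds:
    within each true label, the mean predicted score should not depend on the
    sensitive attribute. First-moment matching only, for illustration.

    scores: (N,) model scores in [0, 1]
    labels: (N,) binary ground-truth labels
    groups: (N,) binary sensitive attribute
    """
    penalty = scores.new_zeros(())
    for y in (0, 1):
        in_y = labels == y
        for g in (0, 1):
            in_yg = in_y & (groups == g)
            if in_yg.any():
                # Gap between group-conditional and label-conditional mean scores.
                penalty = penalty + (scores[in_yg].mean() - scores[in_y].mean()) ** 2
    return penalty

# Toy usage: penalized training objective for a vector of scores.
torch.manual_seed(0)
logits = torch.randn(100, requires_grad=True)
scores = torch.sigmoid(logits)
labels = torch.randint(0, 2, (100,))
groups = torch.randint(0, 2, (100,))
loss = F.binary_cross_entropy(scores, labels.float()) + equalized_odds_penalty(scores, labels, groups)
loss.backward()
```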
arXiv Detail & Related papers (2020-06-08T00:18:34Z)
- Efficient Ensemble Model Generation for Uncertainty Estimation with Bayesian Approximation in Segmentation [74.06904875527556]
We propose a generic and efficient segmentation framework to construct ensemble segmentation models.
In the proposed method, ensemble models can be generated efficiently using a layer selection method.
We also devise a new pixel-wise uncertainty loss, which improves the predictive performance.
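As a generic stand-in for a pixel-wise uncertainty loss (not the paper's exact loss), the sketch below corrupts the segmentation logits with predicted per-pixel noise and averages the cross-entropy over the noise samples.

```python
import torch
import torch.nn.functional as F

def pixelwise_uncertainty_loss(logits, log_var, target, n_samples=10):
    """Hedged sketch of a pixel-wise uncertainty loss for segmentation
    (a generic heteroscedastic-classification formulation): corrupt the logits
    with predicted per-pixel noise, then average the cross-entropy over samples.

    logits:  (B, C, H, W) per-pixel class scores
    log_var: (B, 1, H, W) predicted per-pixel log-variance
    target:  (B, H, W) integer class labels
    """
    std = torch.exp(0.5 * log_var)
    losses = []
    for _ in range(n_samples):
        noisy_logits = logits + std * torch.randn_like(logits)
        losses.append(F.cross_entropy(noisy_logits, target))
    return torch.stack(losses).mean()

# Toy usage: batch of 2, 3 classes, 16x16 masks.
logits = torch.randn(2, 3, 16, 16, requires_grad=True)
log_var = torch.zeros(2, 1, 16, 16, requires_grad=True)
target = torch.randint(0, 3, (2, 16, 16))
loss = pixelwise_uncertainty_loss(logits, log_var, target)
loss.backward()
```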
arXiv Detail & Related papers (2020-05-21T16:08:38Z)