Logarithm-transform aided Gaussian Sampling for Few-Shot Learning
- URL: http://arxiv.org/abs/2309.16337v1
- Date: Thu, 28 Sep 2023 10:50:32 GMT
- Title: Logarithm-transform aided Gaussian Sampling for Few-Shot Learning
- Authors: Vaibhav Ganatra
- Abstract summary: I propose a novel Gaussian transform that outperforms existing methods at transforming experimental data into Gaussian-like distributions.
I then use this transformation for few-shot image classification and show significant gains in performance while sampling less data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Few-shot image classification has recently witnessed the rise of
representation learning being utilised for models to adapt to new classes using
only a few training examples. Therefore, the properties of the representations,
such as their underlying probability distributions, assume vital importance.
Representations sampled from Gaussian distributions have been used in recent
works [19] to train classifiers for few-shot classification. These methods
rely on transforming the experimental data so that its distribution
approximates a Gaussian. In this paper, I propose a novel Gaussian transform
that outperforms existing methods at transforming experimental data into
Gaussian-like distributions. I then use this transformation for few-shot
image classification and show significant gains in performance while
sampling less data.
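The abstract does not give the exact form of the transform, so the sketch below is only an assumed illustration of the idea: apply a simple log1p transform to non-negative, right-skewed features (such as post-ReLU embeddings) and check how much closer to Gaussian they become.

```python
# Assumed illustration: np.log1p stands in for the paper's (unspecified)
# logarithm transform; the toy gamma features stand in for real embeddings.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
features = rng.gamma(shape=2.0, scale=1.0, size=(1000, 64))  # right-skewed

transformed = np.log1p(features)  # log transform compresses the heavy tail

# Skewness closer to 0 indicates a more Gaussian-like distribution.
print("skewness before:", stats.skew(features.ravel()))
print("skewness after: ", stats.skew(transformed.ravel()))
```

In the few-shot pipeline described above, such a transform would presumably be applied to backbone features before fitting class Gaussians and sampling additional training points.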
Related papers
- Conditional Distribution Modelling for Few-Shot Image Synthesis with Diffusion Models [29.821909424996015]
Few-shot image synthesis entails generating diverse and realistic images of novel categories using only a few example images.
We propose Conditional Distribution Modelling (CDM) -- a framework which effectively utilizes Diffusion models for few-shot image generation.
arXiv Detail & Related papers (2024-04-25T12:11:28Z)
- Simple and effective data augmentation for compositional generalization [64.00420578048855]
We show that data augmentation methods that sample MRs (meaning representations) and backtranslate them can be effective for compositional generalization.
Remarkably, sampling from a uniform distribution performs almost as well as sampling from the test distribution.
arXiv Detail & Related papers (2024-01-18T09:13:59Z)
- Trajectory-aware Principal Manifold Framework for Data Augmentation and Image Generation [5.31812036803692]
Many existing methods generate new samples from a parametric distribution, such as a Gaussian, with little attention to generating samples along the data manifold in either the input or feature space.
We propose a novel trajectory-aware principal manifold framework to restore the manifold backbone and generate samples along a specific trajectory.
We show that the novel framework is able to extract a more compact manifold representation, improve classification accuracy, and generate smooth transformations among few samples.
arXiv Detail & Related papers (2023-07-30T07:31:38Z)
- Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z)
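As a rough, hypothetical sketch of the mixture idea in the entry above: fit a Gaussian mixture to the features and whiten each sample with its most responsible component's statistics. The actual method operates inside batch-normalization layers with running estimates; everything below is an assumption for illustration.

```python
# Hypothetical sketch: per-component whitening with a fitted Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
feats = np.vstack([rng.normal(0.0, 1.0, (900, 16)),   # toy head-class features
                   rng.normal(3.0, 0.5, (100, 16))])  # toy tail-class features

gmm = GaussianMixture(n_components=2, covariance_type="diag",
                      random_state=0).fit(feats)
comp = gmm.predict(feats)                   # hard component assignment
mu = gmm.means_[comp]                       # per-sample component mean
sigma = np.sqrt(gmm.covariances_[comp])     # per-sample component std
normalized = (feats - mu) / sigma           # whiten against own component
```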
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
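The two-step recipe in the entry above can be sketched with a kernel density estimate standing in for the paper's PSD model (the substitution is an assumption made purely for illustration):

```python
# Step 1: fit a density model; step 2: sample from it. KernelDensity is an
# assumed stand-in for the paper's PSD models.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.5, 500),
                       rng.normal(2.0, 0.5, 500)])[:, None]  # bimodal toy data

kde = KernelDensity(bandwidth=0.3).fit(data)   # step 1: model the density
samples = kde.sample(1000, random_state=0)     # step 2: sample from the model
```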
- Free Lunch for Few-shot Learning: Distribution Calibration [10.474018806591397]
We show that a simple logistic regression classifier trained on features sampled from our calibrated distribution can surpass state-of-the-art accuracy on two datasets.
arXiv Detail & Related papers (2021-01-16T07:58:40Z)
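A minimal sketch of the calibration idea in the entry above, assuming base-class statistics are available: center a borrowed base-class covariance on the few-shot mean, sample synthetic features, and train a logistic regression. The actual method also applies a Tukey transform and averages over the nearest base classes; those details are omitted.

```python
# Minimal sketch of distribution calibration for one 5-shot class; the
# base-class statistics and dimensions here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
d = 32
base_mean, base_cov = np.zeros(d), np.eye(d)   # assumed base-class statistics
support = rng.multivariate_normal(np.ones(d), base_cov, size=5)  # 5 shots

calib_mean = support.mean(axis=0)              # calibrate mean from the shots
extra = rng.multivariate_normal(calib_mean, base_cov, size=200)

# Train a simple classifier on sampled features (novel class vs. one base class).
X = np.vstack([extra, rng.multivariate_normal(base_mean, base_cov, size=200)])
y = np.array([1] * 200 + [0] * 200)
clf = LogisticRegression(max_iter=1000).fit(X, y)
```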
- Embedding Propagation: Smoother Manifold for Few-Shot Classification [131.81692677836202]
We propose to use embedding propagation as an unsupervised non-parametric regularizer for manifold smoothing in few-shot classification.
We empirically show that embedding propagation yields a smoother embedding manifold.
We show that embedding propagation consistently improves the accuracy of the models in multiple semi-supervised learning scenarios by up to 16 percentage points.
arXiv Detail & Related papers (2020-03-09T13:51:09Z)
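One way to picture the entry above is Laplacian smoothing over a similarity graph; the sketch below is an assumed simplification, not the paper's exact propagator:

```python
# Assumed simplification: smooth embeddings by solving
# (I + alpha * L) Z_smooth = Z over an RBF similarity graph.
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(50, 8))                   # toy embeddings

sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
A = np.exp(-sq / sq.mean())                    # RBF similarity weights
np.fill_diagonal(A, 0.0)                       # no self-loops
L = np.diag(A.sum(axis=1)) - A                 # unnormalized graph Laplacian

alpha = 0.5                                    # smoothing strength
Z_smooth = np.linalg.solve(np.eye(len(Z)) + alpha * L, Z)
```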
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because these flows are universal approximators of continuous distributions, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
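The building block such flows iterate (alternating with learned rotations) is marginal Gaussianization; a minimal, non-learned version maps each dimension through its empirical CDF and then through the inverse standard-normal CDF:

```python
# One marginal-Gaussianization step; a full flow would learn these maps and
# interleave them with rotations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.exponential(size=(1000, 4))            # strongly non-Gaussian data

ranks = stats.rankdata(x, axis=0)              # empirical CDF per column
u = ranks / (x.shape[0] + 1)                   # map ranks into (0, 1)
z = stats.norm.ppf(u)                          # inverse Gaussian CDF
```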
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
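The decoupled sampling in the entry above builds on Matheron's rule: a posterior draw equals a prior draw plus a data-driven correction. The toy sketch below uses exact kernels throughout; the paper's speedup comes from approximating the prior term with random features.

```python
# Toy sketch of Matheron's rule for GP posterior sampling (exact kernels,
# no random-feature acceleration).
import numpy as np

def rbf(a, b, ls=0.5):
    """Squared-exponential kernel between two 1-D location arrays."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
X = np.array([-1.0, 0.0, 1.0]); y = np.sin(X)    # three observations
Xs = np.linspace(-2.0, 2.0, 100)                 # test locations
noise = 1e-4

# One joint prior draw over training and test locations.
Xall = np.concatenate([X, Xs])
K = rbf(Xall, Xall) + 1e-6 * np.eye(len(Xall))   # jitter for stability
f = np.linalg.cholesky(K) @ rng.normal(size=len(Xall))
f_train, f_test = f[:len(X)], f[len(X):]

# Matheron correction: condition the prior draw on the data.
eps = np.sqrt(noise) * rng.normal(size=len(X))   # simulated observation noise
Kxx = rbf(X, X) + noise * np.eye(len(X))
post_sample = f_test + rbf(Xs, X) @ np.linalg.solve(Kxx, y - f_train - eps)
```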
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.