Deep Metric Learning-Based Semi-Supervised Regression With Alternate
Learning
- URL: http://arxiv.org/abs/2202.11388v1
- Date: Wed, 23 Feb 2022 10:04:15 GMT
- Title: Deep Metric Learning-Based Semi-Supervised Regression With Alternate
Learning
- Authors: Adina Zell, Gencer Sumbul, Begüm Demir
- Abstract summary: This paper introduces a novel deep metric learning-based semi-supervised regression (DML-S2R) method for parameter estimation problems.
The proposed DML-S2R method aims to mitigate the problem of an insufficient number of labeled samples without collecting any additional samples with target values.
The experimental results show that DML-S2R outperforms state-of-the-art semi-supervised regression methods.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces a novel deep metric learning-based semi-supervised
regression (DML-S2R) method for parameter estimation problems. The proposed
DML-S2R method aims to mitigate the problem of an insufficient number of labeled
samples without collecting any additional samples with target values. To this
end, the proposed DML-S2R method is made up of two main steps: i) pairwise
similarity modeling with scarce labeled data; and ii) triplet-based metric
learning with abundant unlabeled data. The first step aims to model pairwise
sample similarities by using a small number of labeled samples. This is
achieved by estimating the target value differences of labeled samples with a
Siamese neural network (SNN). The second step aims to learn a triplet-based
metric space (in which similar samples are close to each other and dissimilar
samples are far apart from each other) when the number of labeled samples is
insufficient. This is achieved by employing the SNN of the first step for
triplet-based deep metric learning that exploits not only labeled samples but
also unlabeled samples. For the end-to-end training of DML-S2R, we investigate
an alternate learning strategy for the two steps. Through this strategy, the
information encoded in each step guides the learning of the other step. The
experimental results show that DML-S2R outperforms state-of-the-art
semi-supervised regression methods. The code of the proposed
method is publicly available at https://git.tu-berlin.de/rsim/DML-S2R.
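The two alternating steps described in the abstract can be sketched with a toy model. This is an illustrative sketch only, not the authors' implementation: a shared linear map stands in for the Siamese neural network (SNN) backbone, and `v` is a hypothetical regression head that maps embedding differences to target-value differences.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the shared SNN backbone: a linear embedding f(x) = W x.
W = rng.normal(size=(8, 16)) * 0.1

def embed(x):
    # Both Siamese branches share the same weights W.
    return x @ W.T

def pairwise_difference_loss(x_a, x_b, y_a, y_b, v):
    """Step i): regress the target-value difference of a labeled pair
    from the difference of the two shared-weight embeddings.
    `v` is a hypothetical linear regression head."""
    d = embed(x_a) - embed(x_b)
    pred_diff = d @ v
    return np.mean((pred_diff - (y_a - y_b)) ** 2)

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Step ii): standard triplet margin loss on the embedding space,
    pulling similar samples together and pushing dissimilar ones apart.
    Triplets can be mined from both labeled and unlabeled samples."""
    ap = np.linalg.norm(embed(anchor) - embed(positive), axis=1)
    an = np.linalg.norm(embed(anchor) - embed(negative), axis=1)
    return np.mean(np.maximum(ap - an + margin, 0.0))

# Toy data: a few labeled samples (scarce) and random triplets.
x_lab = rng.normal(size=(4, 16))
y_lab = rng.normal(size=4)
v = rng.normal(size=8)

step1 = pairwise_difference_loss(x_lab, x_lab[::-1], y_lab, y_lab[::-1], v)
anc, pos, neg = (rng.normal(size=(3, 16)) for _ in range(3))
step2 = triplet_loss(anc, pos, neg)
```

In an alternate-learning loop, one would minimize `step1` on the scarce labeled pairs and `step2` on triplets drawn from the abundant unlabeled pool in turn; because the backbone is shared, each objective shapes the representation used by the other.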
Related papers
- A semi-supervised learning using over-parameterized regression [0.0]
Semi-supervised learning (SSL) is an important theme in machine learning.
In this paper, we consider a method of incorporating information on unlabeled samples into kernel functions.
arXiv Detail & Related papers (2024-09-06T03:05:35Z)
- Decoupled Prototype Learning for Reliable Test-Time Adaptation [50.779896759106784]
Test-time adaptation (TTA) is a task that continually adapts a pre-trained source model to the target domain during inference.
One popular approach fine-tunes the model with a cross-entropy loss according to estimated pseudo-labels.
This study reveals that minimizing the classification error of each sample makes the cross-entropy loss vulnerable to label noise.
We propose a novel Decoupled Prototype Learning (DPL) method that features prototype-centric loss computation.
arXiv Detail & Related papers (2024-01-15T03:33:39Z)
- Generative Semi-supervised Learning with Meta-Optimized Synthetic Samples [5.384630221560811]
Semi-supervised learning (SSL) is a promising approach for training deep classification models using labeled and unlabeled datasets.
In this paper, we investigate the research question: Can we train SSL models without real unlabeled datasets?
We propose an SSL method using synthetic datasets generated from generative foundation models trained on datasets containing millions of samples in diverse domains.
arXiv Detail & Related papers (2023-09-28T03:47:26Z)
- Label-Noise Learning with Intrinsically Long-Tailed Data [65.41318436799993]
We propose a learning framework for label-noise learning with intrinsically long-tailed data.
Specifically, we propose two-stage bi-dimensional sample selection (TABASCO) to better separate clean samples from noisy samples.
arXiv Detail & Related papers (2022-08-21T07:47:05Z)
- Adaptive neighborhood Metric learning [184.95321334661898]
We propose a novel distance metric learning algorithm, named adaptive neighborhood metric learning (ANML).
ANML can be used to learn both the linear and deep embeddings.
The log-exp mean function proposed in our method gives a new perspective on deep metric learning methods.
arXiv Detail & Related papers (2022-01-20T17:26:37Z)
- Exploring Adversarial Robustness of Deep Metric Learning [25.12224002984514]
DML uses deep neural architectures to learn semantic embeddings of the input.
We tackle the primary challenge of the metric losses being dependent on the samples in a mini-batch.
Using experiments on three commonly-used DML datasets, we demonstrate 5- to 76-fold increases in adversarial accuracy.
arXiv Detail & Related papers (2021-02-14T23:18:12Z)
- Attentional-Biased Stochastic Gradient Descent [74.49926199036481]
We present a provable method (named ABSGD) for addressing the data imbalance or label noise problem in deep learning.
Our method is a simple modification to momentum SGD where we assign an individual importance weight to each sample in the mini-batch.
ABSGD is flexible enough to combine with other robust losses without any additional cost.
arXiv Detail & Related papers (2020-12-13T03:41:52Z)
- Deep Metric Learning Meets Deep Clustering: An Novel Unsupervised Approach for Feature Embedding [32.8693763689033]
Unsupervised Deep Distance Metric Learning (UDML) aims to learn sample similarities in the embedding space from an unlabeled dataset.
Traditional UDML methods usually use the triplet loss or pairwise loss which requires the mining of positive and negative samples.
This is, however, challenging in an unsupervised setting as the label information is not available.
We propose a new UDML method that overcomes that challenge.
arXiv Detail & Related papers (2020-09-09T04:02:04Z)
- Multi-Task Curriculum Framework for Open-Set Semi-Supervised Learning [54.85397562961903]
Semi-supervised learning (SSL) has been proposed to leverage unlabeled data for training powerful models when only limited labeled data is available.
We address a more complex novel scenario named open-set SSL, where out-of-distribution (OOD) samples are contained in unlabeled data.
Our method achieves state-of-the-art results by successfully eliminating the effect of OOD samples.
arXiv Detail & Related papers (2020-07-22T10:33:55Z)
- MetricUNet: Synergistic Image- and Voxel-Level Learning for Precise CT Prostate Segmentation via Online Sampling [66.01558025094333]
We propose a two-stage framework: the first stage quickly localizes the prostate region, and the second stage precisely segments the prostate.
We introduce a novel online metric learning module through voxel-wise sampling in the multi-task network.
Our method learns more representative voxel-level features than conventional training with cross-entropy or Dice loss.
arXiv Detail & Related papers (2020-05-15T10:37:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.