NP-Match: When Neural Processes meet Semi-Supervised Learning
- URL: http://arxiv.org/abs/2207.01066v1
- Date: Sun, 3 Jul 2022 15:24:31 GMT
- Title: NP-Match: When Neural Processes meet Semi-Supervised Learning
- Authors: Jianfeng Wang, Thomas Lukasiewicz, Daniela Massiceti, Xiaolin Hu,
Vladimir Pavlovic, Alexandros Neophytou
- Abstract summary: Semi-supervised learning (SSL) has been widely explored in recent years, and it is an effective way of leveraging unlabeled data to reduce the reliance on labeled data.
In this work, we adjust neural processes (NPs) to the semi-supervised image classification task, resulting in a new method named NP-Match.
- Score: 133.009621275051
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Semi-supervised learning (SSL) has been widely explored in recent years, and
it is an effective way of leveraging unlabeled data to reduce the reliance on
labeled data. In this work, we adjust neural processes (NPs) to the
semi-supervised image classification task, resulting in a new method named
NP-Match. NP-Match is suited to this task for two reasons. Firstly, NP-Match
implicitly compares data points when making predictions, and as a result, the
prediction of each unlabeled data point is affected by the labeled data points
that are similar to it, which improves the quality of pseudo-labels. Secondly,
NP-Match is able to estimate uncertainty that can be used as a tool for
selecting unlabeled samples with reliable pseudo-labels. Compared with
uncertainty-based SSL methods implemented with Monte Carlo (MC) dropout,
NP-Match estimates uncertainty with far less computational overhead, saving
time in both the training and testing phases. We conducted extensive
experiments on four public datasets, on which NP-Match outperforms or matches
state-of-the-art (SOTA) results, demonstrating the effectiveness of NP-Match
and its potential for SSL.
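The uncertainty-based selection described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the variance-based uncertainty measure, the threshold `tau`, and the function name are assumptions, and in NP-Match the probability samples would come from draws of the NP latent variable.

```python
import numpy as np

def select_pseudo_labels(prob_samples, tau=0.1):
    """Pick confident pseudo-labels from sampled predictions.

    prob_samples: array of shape (S, N, C) -- S class-probability vectors
    per unlabeled point, e.g. from S draws of an NP latent variable.
    Returns indices of accepted points and their hard pseudo-labels.
    """
    mean_probs = prob_samples.mean(axis=0)             # (N, C) averaged prediction
    pseudo = mean_probs.argmax(axis=1)                 # (N,) hard pseudo-labels
    var = prob_samples.var(axis=0)                     # (N, C) per-class variance
    # Uncertainty: variance of the predicted class's probability across samples
    uncertainty = var[np.arange(len(pseudo)), pseudo]  # (N,)
    keep = np.where(uncertainty < tau)[0]              # accept only low-uncertainty points
    return keep, pseudo[keep]
```

Points whose sampled predictions disagree get high variance and are excluded from the pseudo-labeled training set.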
Related papers
- SCOMatch: Alleviating Overtrusting in Open-set Semi-supervised Learning [25.508200663171625]
Open-set semi-supervised learning (OSSL) leverages unlabeled data that may contain out-of-distribution (OOD) samples.
Prior OSSL methods tend to overtrust the labeled in-distribution (ID) data.
We propose SCOMatch, a novel OSSL method that treats OOD samples as an additional class, forming a new SSL process.
arXiv Detail & Related papers (2024-09-26T03:47:34Z)
- A Channel-ensemble Approach: Unbiased and Low-variance Pseudo-labels is Critical for Semi-supervised Classification [61.473485511491795]
Semi-supervised learning (SSL) is a practical challenge in computer vision.
Pseudo-label (PL) methods, e.g., FixMatch and FreeMatch, achieve state-of-the-art (SOTA) performance in SSL.
We propose a lightweight channel-based ensemble method to consolidate multiple inferior PLs into one that is theoretically guaranteed to be unbiased and low-variance.
arXiv Detail & Related papers (2024-03-27T09:49:37Z)
- KD-FixMatch: Knowledge Distillation Siamese Neural Networks [13.678635878305247]
KD-FixMatch is a novel SSL algorithm that addresses the limitations of FixMatch by incorporating knowledge distillation.
The algorithm combines sequential and simultaneous training of Siamese neural networks (SNNs) to enhance performance and reduce degradation.
Our results indicate that KD-FixMatch has a better training starting point that leads to improved model performance compared to FixMatch.
arXiv Detail & Related papers (2023-09-11T21:11:48Z)
- NP-SemiSeg: When Neural Processes meet Semi-Supervised Semantic Segmentation [87.50830107535533]
Semi-supervised semantic segmentation involves assigning pixel-wise labels to unlabeled images at training time.
Current approaches to semi-supervised semantic segmentation work by predicting pseudo-labels for each pixel from a class-wise probability distribution output by a model.
In this work, we move one step forward by adapting NPs to semi-supervised semantic segmentation, resulting in a new model called NP-SemiSeg.
arXiv Detail & Related papers (2023-08-05T12:42:15Z)
- NP-Match: Towards a New Probabilistic Model for Semi-Supervised Learning [86.60013228560452]
Semi-supervised learning (SSL) has been widely explored in recent years, and it is an effective way of leveraging unlabeled data.
In this work, we adjust neural processes (NPs) to the semi-supervised image classification task, resulting in a new method named NP-Match.
NP-Match implicitly compares data points when making predictions, and as a result, the prediction of each unlabeled data point is affected by the labeled data points.
arXiv Detail & Related papers (2023-01-31T11:44:45Z)
- Dash: Semi-Supervised Learning with Dynamic Thresholding [72.74339790209531]
We propose Dash, a semi-supervised learning (SSL) approach that uses unlabeled examples to train models.
Dash adaptively selects unlabeled data via dynamic thresholding.
arXiv Detail & Related papers (2021-09-01T23:52:29Z)
- Matching Distributions via Optimal Transport for Semi-Supervised Learning [31.533832244923843]
Semi-supervised learning (SSL) approaches have been an influential framework for leveraging unlabeled data.
We propose a new approach that adopts an Optimal Transport (OT) technique as a similarity metric between discrete empirical probability measures.
We evaluated our method against state-of-the-art SSL algorithms on standard datasets to demonstrate its effectiveness.
arXiv Detail & Related papers (2020-12-04T11:15:14Z)
- Bootstrapping Neural Processes [114.97111530885093]
Neural Processes (NPs) implicitly define a broad class of processes with neural networks.
However, NPs still rely on the assumption that uncertainty in processes is modeled by a single latent variable.
We propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap.
arXiv Detail & Related papers (2020-08-07T02:23:34Z)
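The bootstrap idea behind BNP can be illustrated in miniature with a toy regressor: resampling the context set with replacement and refitting yields an ensemble whose spread serves as an uncertainty estimate, with no single latent variable. This is a generic sketch under assumed names, not the BNP architecture.

```python
import numpy as np

def bootstrap_predict(ctx_x, ctx_y, query_x, n_boot=50, rng=None):
    """Bootstrap a simple 1-NN regressor over a 1-D context set.

    Each replicate resamples (ctx_x, ctx_y) with replacement and predicts
    query_x by nearest neighbour; the ensemble mean and standard deviation
    play the roles of prediction and uncertainty.
    """
    rng = np.random.default_rng(rng)
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(ctx_x), size=len(ctx_x))  # resample with replacement
        bx, by = ctx_x[idx], ctx_y[idx]
        nearest = np.abs(query_x[:, None] - bx[None, :]).argmin(axis=1)
        preds.append(by[nearest])
    preds = np.stack(preds)                  # (n_boot, n_query)
    return preds.mean(axis=0), preds.std(axis=0)
```

Queries far from the context, or covered by conflicting context points, produce a larger ensemble spread, which is the functional uncertainty the bootstrap is meant to capture.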
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.