SleepPriorCL: Contrastive Representation Learning with Prior
Knowledge-based Positive Mining and Adaptive Temperature for Sleep Staging
- URL: http://arxiv.org/abs/2110.09966v1
- Date: Fri, 15 Oct 2021 06:54:29 GMT
- Title: SleepPriorCL: Contrastive Representation Learning with Prior
Knowledge-based Positive Mining and Adaptive Temperature for Sleep Staging
- Authors: Hongjun Zhang, Jing Wang, Qinfeng Xiao, Jiaoxue Deng, Youfang Lin
- Abstract summary: Self-supervised learning (SSL) based on contrasting semantically similar (positive) and dissimilar (negative) pairs of samples has achieved promising success.
Existing SSL methods suffer from the problem that many semantically similar positives remain undiscovered and are even treated as negatives.
In this paper, we propose a novel SSL approach named SleepPriorCL to alleviate the above problem.
- Score: 9.102084407643199
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The objective of this paper is to learn semantic representations for sleep
stage classification from raw physiological time series. Although supervised
methods have gained remarkable performance, they are limited in clinical
situations due to the requirement of fully labeled data. Self-supervised
learning (SSL) based on contrasting semantically similar (positive) and
dissimilar (negative) pairs of samples has achieved promising success.
However, existing SSL methods suffer from the problem that many semantically
similar positives remain undiscovered and are even treated as negatives. In this paper, we
propose a novel SSL approach named SleepPriorCL to alleviate the above problem.
Our approach advances existing SSL methods in two ways: 1) by
incorporating prior domain knowledge into the training regime of SSL, more
semantically similar positives are discovered without accessing ground-truth
labels; 2) via investigating the influence of the temperature in contrastive
loss, an adaptive temperature mechanism for each sample according to prior
domain knowledge is further proposed, leading to better performance. Extensive
experiments demonstrate that our method achieves state-of-the-art performance
and consistently outperforms baselines.
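The adaptive-temperature idea from the abstract can be illustrated with a minimal InfoNCE-style loss that accepts a per-sample temperature. This is a sketch only: the function name, argument layout, and positive-index convention are illustrative, and the paper's prior-knowledge rule for choosing each sample's temperature is not reproduced here.

```python
import math

def info_nce_adaptive_tau(anchor, candidates, positive_idx, tau):
    """InfoNCE-style contrastive loss with a per-sample temperature.

    anchor:       embedding of the anchor sample (list of floats)
    candidates:   list of candidate embeddings (positives and negatives)
    positive_idx: indices of candidates treated as positives for this anchor
    tau:          temperature assigned to this anchor
    """
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    # Temperature-scaled similarities; a smaller tau sharpens the softmax,
    # which up-weights hard negatives for this particular anchor.
    logits = [cos(anchor, c) / tau for c in candidates]

    # Log-softmax with max subtraction for numerical stability.
    m = max(logits)
    z = sum(math.exp(l - m) for l in logits)
    log_probs = [l - m - math.log(z) for l in logits]

    # Average negative log-probability over the anchor's positive set.
    return -sum(log_probs[i] for i in positive_idx) / len(positive_idx)
```

Lowering tau for a given anchor concentrates the softmax on the most similar candidates; the paper's contribution, as described in the abstract, is to assign this temperature per sample from prior domain knowledge instead of fixing it globally.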
Related papers
- ItTakesTwo: Leveraging Peer Representations for Semi-supervised LiDAR Semantic Segmentation [24.743048965822297]
This paper introduces a novel semi-supervised LiDAR semantic segmentation framework called ItTakesTwo (IT2)
IT2 is designed to ensure consistent predictions from peer LiDAR representations, thereby improving the effectiveness of perturbations in consistency learning.
Results on public benchmarks show that our approach achieves remarkable improvements over the previous state-of-the-art (SOTA) methods in the field.
arXiv Detail & Related papers (2024-07-09T18:26:53Z)
- A Channel-ensemble Approach: Unbiased and Low-variance Pseudo-labels is Critical for Semi-supervised Classification [61.473485511491795]
Semi-supervised learning (SSL) addresses a practical challenge in computer vision.
Pseudo-label (PL) methods, e.g., FixMatch and FreeMatch, achieve state-of-the-art (SOTA) performance in SSL.
We propose a lightweight channel-based ensemble method to consolidate multiple inferior PLs into the theoretically guaranteed unbiased and low-variance one.
arXiv Detail & Related papers (2024-03-27T09:49:37Z)
- SPLAL: Similarity-based pseudo-labeling with alignment loss for semi-supervised medical image classification [11.435826510575879]
Semi-supervised learning (SSL) methods can mitigate challenges by leveraging both labeled and unlabeled data.
SSL methods for medical image classification need to address two key challenges: (1) estimating reliable pseudo-labels for the images in the unlabeled dataset and (2) reducing biases caused by class imbalance.
In this paper, we propose a novel SSL approach, SPLAL, that effectively addresses these challenges.
arXiv Detail & Related papers (2023-07-10T14:53:24Z)
- Benchmarking Self-Supervised Learning on Diverse Pathology Datasets [10.868779327544688]
Self-supervised learning has been shown to be an effective method for utilizing unlabeled data.
We execute the largest-scale study of SSL pre-training on pathology image data.
For the first time, we apply SSL to the challenging task of nuclei instance segmentation.
arXiv Detail & Related papers (2022-12-09T06:38:34Z)
- MaxMatch: Semi-Supervised Learning with Worst-Case Consistency [149.03760479533855]
We propose a worst-case consistency regularization technique for semi-supervised learning (SSL).
We present a generalization bound for SSL consisting of the empirical loss terms observed on labeled and unlabeled training data separately.
Motivated by this bound, we derive an SSL objective that minimizes the largest inconsistency between an original unlabeled sample and its multiple augmented variants.
arXiv Detail & Related papers (2022-09-26T12:04:49Z)
- Hierarchical Semi-Supervised Contrastive Learning for Contamination-Resistant Anomaly Detection [81.07346419422605]
Anomaly detection aims at identifying deviant samples from the normal data distribution.
Contrastive learning has provided a successful way to learn sample representations that enable effective discrimination of anomalies.
We propose a novel hierarchical semi-supervised contrastive learning framework for contamination-resistant anomaly detection.
arXiv Detail & Related papers (2022-07-24T18:49:26Z)
- Collaborative Intelligence Orchestration: Inconsistency-Based Fusion of Semi-Supervised Learning and Active Learning [60.26659373318915]
Active learning (AL) and semi-supervised learning (SSL) are two effective, but often isolated, means to alleviate the data-hungry problem.
We propose an innovative inconsistency-based virtual adversarial algorithm to further investigate the potential superiority of combining SSL and AL.
Two real-world case studies visualize the practical industrial value of applying and deploying the proposed data sampling algorithm.
arXiv Detail & Related papers (2022-06-07T13:28:43Z)
- Trash to Treasure: Harvesting OOD Data with Cross-Modal Matching for Open-Set Semi-Supervised Learning [101.28281124670647]
Open-set semi-supervised learning (open-set SSL) investigates a challenging but practical scenario where out-of-distribution (OOD) samples are contained in the unlabeled data.
We propose a novel training mechanism that could effectively exploit the presence of OOD data for enhanced feature learning.
Our approach substantially lifts the performance on open-set SSL and outperforms the state-of-the-art by a large margin.
arXiv Detail & Related papers (2021-08-12T09:14:44Z)
- A Realistic Evaluation of Semi-Supervised Learning for Fine-Grained Classification [38.68079253627819]
Our benchmark consists of two fine-grained classification datasets obtained by sampling classes from the Aves and Fungi taxonomy.
We find that recently proposed SSL methods provide significant benefits, and can effectively use out-of-class data to improve performance when deep networks are trained from scratch.
Our work suggests that semi-supervised learning with experts on realistic datasets may require different strategies than those currently prevalent in the literature.
arXiv Detail & Related papers (2021-04-01T17:59:41Z)
- On Data-Augmentation and Consistency-Based Semi-Supervised Learning [77.57285768500225]
Recently proposed consistency-based Semi-Supervised Learning (SSL) methods have advanced the state of the art in several SSL tasks.
Despite these advances, the understanding of these methods is still relatively limited.
arXiv Detail & Related papers (2021-01-18T10:12:31Z)
- Semi-supervised learning objectives as log-likelihoods in a generative model of data curation [32.45282187405337]
We formulate SSL objectives as a log-likelihood in a generative model of data curation.
We give a proof-of-principle for Bayesian SSL on toy data.
arXiv Detail & Related papers (2020-08-13T13:50:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.