Rank Supervised Contrastive Learning for Time Series Classification
- URL: http://arxiv.org/abs/2401.18057v1
- Date: Wed, 31 Jan 2024 18:29:10 GMT
- Title: Rank Supervised Contrastive Learning for Time Series Classification
- Authors: Qianying Ren, Dongsheng Luo, Dongjin Song
- Abstract summary: We present Rank Supervised Contrastive Learning (RankSCL) to perform time series classification.
RankSCL augments raw data in a targeted way in the embedding space.
A novel rank loss is developed to assign different weights to different levels of positive samples.
- Score: 19.446437832981545
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, various contrastive learning techniques have been developed to
categorize time series data and have exhibited promising performance. A general
paradigm is to utilize appropriate augmentations and construct feasible
positive samples such that the encoder can yield robust and discriminative
representations by mapping similar data points closer together in the feature
space while pushing dissimilar data points farther apart. Despite its efficacy,
the fine-grained relative similarity (e.g., rank) information of positive
samples is largely ignored, especially when labeled samples are limited. To
this end, we present Rank Supervised Contrastive Learning (RankSCL) to perform
time series classification. Different from conventional contrastive learning
frameworks, RankSCL augments raw data in a targeted way in the embedding space
and adopts certain filtering rules to select more informative positive and
negative pairs of samples. Moreover, a novel rank loss is developed to assign
different weights to different levels of positive samples, enabling the encoder
to extract fine-grained information within the same class and to produce clear
boundaries among different classes. Thorough empirical studies on 128 UCR
datasets and 30 UEA datasets demonstrate that the proposed RankSCL achieves
state-of-the-art performance compared to existing baseline methods.
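The exact form of the rank loss is given in the paper; as a rough illustration of the idea, here is a minimal PyTorch sketch of a rank-weighted supervised contrastive loss, where positives that rank closer to the anchor in the embedding space receive larger weights. The function name and the linear weighting scheme are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def rank_supcon_loss(z, labels, temperature=0.1):
    """Rank-weighted supervised contrastive loss (illustrative sketch).

    Positives are ranked by similarity to the anchor; higher-ranked
    positives receive larger weights. The linear weighting below is an
    assumed scheme, not necessarily the RankSCL rank loss.
    """
    z = F.normalize(z, dim=1)                        # unit-norm embeddings
    sim = z @ z.t() / temperature                    # scaled pairwise similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye

    sim = sim.masked_fill(eye, float('-inf'))        # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    loss = z.new_zeros(())
    for i in range(n):
        pos = pos_mask[i].nonzero(as_tuple=True)[0]  # indices of positives
        if pos.numel() == 0:
            continue
        # Rank positives by similarity (most similar first) and assign
        # linearly decaying weights, so closer positives count more.
        order = torch.argsort(sim[i, pos], descending=True)
        k = pos.numel()
        w = torch.empty(k, device=z.device)
        w[order] = torch.linspace(1.0, 1.0 / k, k, device=z.device)
        w = w / w.sum()
        loss = loss - (w * log_prob[i, pos]).sum()
    return loss / n
```

In practice one would compute z = encoder(x) for a batch and call rank_supcon_loss(z, y) alongside the targeted augmentation and pair-filtering steps described above.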
Related papers
- Decoupled Contrastive Learning for Long-Tailed Recognition [58.255966442426484]
Supervised Contrastive Loss (SCL) is popular in visual representation learning.
In the scenario of long-tailed recognition, where the number of samples per class is imbalanced, treating the two types of positive samples (augmented views of the anchor and other same-class samples) equally leads to biased optimization of intra-category distances.
We propose patch-based self-distillation to transfer knowledge from head to tail classes and relieve the under-representation of tail classes.
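For reference, the standard supervised contrastive (SupCon) objective weights every positive of an anchor $i$ uniformly:

$$\mathcal{L}_{\mathrm{SCL}} = \sum_{i} \frac{-1}{|P(i)|} \sum_{p \in P(i)} \log \frac{\exp(z_i \cdot z_p / \tau)}{\sum_{a \neq i} \exp(z_i \cdot z_a / \tau)}$$

where $P(i)$ contains both augmented views of anchor $i$ and other samples of its class; the uniform $1/|P(i)|$ factor is the equal treatment of the two positive types that this work identifies as biasing intra-category distances.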
arXiv Detail & Related papers (2024-03-10T09:46:28Z)
- Supervised Stochastic Neighbor Embedding Using Contrastive Learning [4.560284382063488]
Clusters of samples belonging to the same class are pulled together in low-dimensional embedding space.
We extend the self-supervised contrastive approach to the fully-supervised setting, allowing us to effectively leverage label information.
arXiv Detail & Related papers (2023-09-15T00:26:21Z)
- Hodge-Aware Contrastive Learning [101.56637264703058]
Simplicial complexes prove effective in modeling data with multiway dependencies.
We develop a contrastive self-supervised learning approach for processing simplicial data.
arXiv Detail & Related papers (2023-09-14T00:40:07Z)
- Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001]
A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
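The selection criterion itself is not described in this summary, but typical candidate time-series augmentations that an adaptive selector could choose among look like the following NumPy sketch (function names and parameters are illustrative, not the InfoTS implementation):

```python
import numpy as np

# Common time-series augmentations (x has shape (timesteps, channels));
# each call produces one candidate positive view.

def jitter(x, sigma=0.03):
    """Add small Gaussian noise to every time step."""
    return x + np.random.normal(0.0, sigma, size=x.shape)

def scaling(x, sigma=0.1):
    """Rescale each channel by a random factor close to 1."""
    factor = np.random.normal(1.0, sigma, size=(1, x.shape[1]))
    return x * factor

def permutation(x, n_segments=4):
    """Split the series into segments and shuffle their order."""
    segments = np.array_split(x, n_segments, axis=0)
    order = np.random.permutation(len(segments))
    return np.concatenate([segments[i] for i in order], axis=0)
```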
arXiv Detail & Related papers (2023-03-21T15:02:50Z)
- Intra-class Adaptive Augmentation with Neighbor Correction for Deep Metric Learning [99.14132861655223]
We propose a novel intra-class adaptive augmentation (IAA) framework for deep metric learning.
We reasonably estimate intra-class variations for every class and generate adaptive synthetic samples to support hard samples mining.
Our method significantly outperforms state-of-the-art methods, improving retrieval performance by 3%-6%.
arXiv Detail & Related papers (2022-11-29T14:52:38Z)
- Few-shot Object Detection with Refined Contrastive Learning [4.520231308678286]
We propose a novel few-shot object detection (FSOD) method with Refined Contrastive Learning (FSRC).
A pre-determination component identifies the Resemblance Group, the subset of novel classes that are easily confused with one another.
Refined Contrastive Learning (RCL) is then performed specifically on this group of classes to increase the inter-class distances among them.
arXiv Detail & Related papers (2022-11-24T09:34:20Z)
- Hierarchical Semi-Supervised Contrastive Learning for Contamination-Resistant Anomaly Detection [81.07346419422605]
Anomaly detection aims at identifying deviant samples from the normal data distribution.
Contrastive learning has provided a successful way to learn sample representations that enable effective discrimination of anomalies.
We propose a novel hierarchical semi-supervised contrastive learning framework for contamination-resistant anomaly detection.
arXiv Detail & Related papers (2022-07-24T18:49:26Z)
- Tackling Online One-Class Incremental Learning by Removing Negative Contrasts [12.048166025000976]
Distinct from other continual learning settings, the learner is presented with new samples only once.
ER-AML achieved strong performance in this setting by applying an asymmetric loss based on contrastive learning to the incoming data and replayed data.
We adapt a recently proposed approach from self-supervised learning to the supervised setting, relaxing the constraint on contrasts.
arXiv Detail & Related papers (2022-03-24T19:17:29Z)
- No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data [78.69828864672978]
A central challenge in training classification models in real-world federated systems is learning with non-IID data.
We propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model.
Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks including CIFAR-10, CIFAR-100, and CINIC-10.
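As a rough sketch of the calibration idea (not the paper's federated implementation, whose per-class statistics are aggregated across clients), one can fit a Gaussian per class on feature vectors, sample virtual representations, and refit the classifier on them; the helper names here are hypothetical:

```python
import numpy as np

def calibrate_classifier(features, labels, fit_fn, n_virtual=1000):
    """CCVR-style calibration sketch: per-class Gaussian -> virtual
    features -> classifier refit. `fit_fn` is any classifier-fitting
    callable, e.g. sklearn's LogisticRegression().fit (hypothetical
    wiring; federated aggregation of the statistics is omitted).
    """
    xs, ys = [], []
    for c in np.unique(labels):
        feats = features[labels == c]          # features of class c
        mu = feats.mean(axis=0)                # class mean
        cov = np.cov(feats, rowvar=False)      # class covariance
        cov += 1e-4 * np.eye(feats.shape[1])   # regularize for sampling
        xs.append(np.random.multivariate_normal(mu, cov, size=n_virtual))
        ys.append(np.full(n_virtual, c))
    return fit_fn(np.concatenate(xs), np.concatenate(ys))
```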
arXiv Detail & Related papers (2021-06-09T12:02:29Z)