Conditional Contrastive Learning with Kernel
- URL: http://arxiv.org/abs/2202.05458v2
- Date: Mon, 14 Feb 2022 17:19:42 GMT
- Title: Conditional Contrastive Learning with Kernel
- Authors: Yao-Hung Hubert Tsai, Tianqin Li, Martin Q. Ma, Han Zhao, Kun Zhang,
Louis-Philippe Morency, Ruslan Salakhutdinov
- Abstract summary: This paper presents Conditional Contrastive Learning with Kernel (CCL-K), which converts existing conditional contrastive objectives into alternative forms that mitigate the insufficient-data problem.
Experiments on weakly supervised, fair, and hard-negative contrastive learning show that CCL-K outperforms state-of-the-art baselines.
- Score: 107.5989144369343
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conditional contrastive learning frameworks consider the conditional sampling
procedure that constructs positive or negative data pairs conditioned on
specific variables. Fair contrastive learning constructs negative pairs, for
example, from the same gender (conditioning on sensitive information), which in
turn removes undesirable information from the learned representations; weakly
supervised contrastive learning constructs positive pairs with similar
annotative attributes (conditioning on auxiliary information), which in turn
incorporates that auxiliary information into the representations. Although conditional contrastive
learning enables many applications, the conditional sampling procedure can be
challenging if we cannot obtain sufficient data pairs for some values of the
conditioning variable. This paper presents Conditional Contrastive Learning
with Kernel (CCL-K) that converts existing conditional contrastive objectives
into alternative forms that mitigate the insufficient data problem. Instead of
sampling data according to the value of the conditioning variable, CCL-K uses
the Kernel Conditional Embedding Operator, which samples from all available
data and weights each sampled point by the kernel similarity between the values
of the conditioning variable. We conduct experiments on weakly supervised,
fair, and hard-negative contrastive learning, showing that CCL-K outperforms
state-of-the-art baselines.
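As a concrete illustration of that weighting idea, the sketch below implements a kernel-weighted contrastive loss in PyTorch. It is a minimal, hypothetical rendering of the mechanism described in the abstract, not the paper's exact objective: the function name, the Gaussian (RBF) kernel choice, the bandwidth, and the way the kernel weights enter the softmax are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kernel_weighted_contrastive_loss(z1, z2, cond, temperature=0.1, bandwidth=1.0):
    """Illustrative kernel-weighted conditional contrastive loss (sketch).

    z1, z2 : (N, D) representations of two augmented views of the same batch.
    cond   : (N, C) conditioning-variable values (e.g. auxiliary attributes),
             one row per sample.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)

    # Pairwise view-to-view similarities, scaled by temperature.
    logits = z1 @ z2.t() / temperature                       # (N, N)

    # Gaussian (RBF) kernel similarity between conditioning values.
    # Instead of keeping only pairs that share the exact same conditioning
    # value, every pair contributes with a weight given by this kernel.
    sq_dists = torch.cdist(cond, cond) ** 2                  # (N, N)
    kernel_w = torch.exp(-sq_dists / (2 * bandwidth ** 2))   # (N, N)

    # Kernel-weighted softmax: pairs with similar conditioning values act
    # as soft positives relative to the full batch.
    exp_logits = torch.exp(logits)
    weighted_pos = (kernel_w * exp_logits).sum(dim=1)
    denom = exp_logits.sum(dim=1)
    return -torch.log(weighted_pos / denom).mean()
```

In the weakly supervised setting described above, `cond` would hold the annotative attributes, so samples with similar attributes act as soft positives even when no other sample shares exactly the same attribute value; in the fair setting, an analogous kernel weighting could instead be applied when forming negatives over the sensitive attribute.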
Related papers
- ReCaLL: Membership Inference via Relative Conditional Log-Likelihoods [56.073335779595475]
We propose ReCaLL (Relative Conditional Log-Likelihood), a novel membership inference attack (MIA).
ReCaLL examines the relative change in conditional log-likelihoods when prefixing target data points with non-member context.
We conduct comprehensive experiments and show that ReCaLL achieves state-of-the-art performance on the WikiMIA dataset.
arXiv Detail & Related papers (2024-06-23T00:23:13Z) - Don't drop your samples! Coherence-aware training benefits Conditional diffusion [17.349357521783062]
Coherence-Aware Diffusion (CAD) is a novel method that integrates coherence in conditional information into diffusion models.
We show that CAD is theoretically sound and empirically effective on various conditional generation tasks.
arXiv Detail & Related papers (2024-05-30T17:57:26Z) - Rank Supervised Contrastive Learning for Time Series Classification [17.302643963704643]
We present Rank Supervised Contrastive Learning (RankSCL) to perform time series classification.
RankSCL augments raw data in a targeted way in the embedding space.
A novel rank loss is developed to assign different weights for different levels of positive samples.
arXiv Detail & Related papers (2024-01-31T18:29:10Z) - Contrastive Learning with Negative Sampling Correction [52.990001829393506]
We propose a novel contrastive learning method named Positive-Unlabeled Contrastive Learning (PUCL).
PUCL treats the generated negative samples as unlabeled samples and uses information from positive samples to correct bias in contrastive loss.
PUCL can be applied to general contrastive learning problems and outperforms state-of-the-art methods on various image and graph classification tasks.
arXiv Detail & Related papers (2024-01-13T11:18:18Z) - Collapse by Conditioning: Training Class-conditional GANs with Limited
Data [109.30895503994687]
We propose a training strategy for conditional GANs (cGANs) that effectively prevents the observed mode-collapse by leveraging unconditional learning.
Our training strategy starts with an unconditional GAN and gradually injects conditional information into the generator and the objective function.
The proposed method for training cGANs with limited data results not only in stable training but also in high-quality generated images.
arXiv Detail & Related papers (2022-01-17T18:59:23Z) - Contrastive Attraction and Contrastive Repulsion for Representation
Learning [131.72147978462348]
Contrastive learning (CL) methods learn data representations in a self-supervised manner, where the encoder contrasts each positive sample against multiple negative samples.
Recent CL methods have achieved promising results when pretrained on large-scale datasets, such as ImageNet.
We propose a doubly CL strategy that separately compares positive and negative samples within their own groups, and then proceeds with a contrast between positive and negative groups.
arXiv Detail & Related papers (2021-05-08T17:25:08Z) - SimCSE: Simple Contrastive Learning of Sentence Embeddings [10.33373737281907]
This paper presents SimCSE, a contrastive learning framework for sentence embeddings.
We first describe an unsupervised approach, which takes an input sentence and predicts itself in a contrastive objective.
We then incorporate annotated pairs from NLI datasets into contrastive learning by using "entailment" pairs as positives and "contradiction" pairs as hard negatives.
arXiv Detail & Related papers (2021-04-18T11:27:08Z) - Contrastive Learning with Hard Negative Samples [80.12117639845678]
We develop a new family of unsupervised sampling methods for selecting hard negative samples.
A limiting case of this sampling results in a representation that tightly clusters each class, and pushes different classes as far apart as possible.
The proposed method improves downstream performance across multiple modalities, requires only a few additional lines of code to implement, and introduces no computational overhead.
arXiv Detail & Related papers (2020-10-09T14:18:53Z)