DLDL: Dynamic Label Dictionary Learning via Hypergraph Regularization
- URL: http://arxiv.org/abs/2010.12417v1
- Date: Fri, 23 Oct 2020 14:07:07 GMT
- Title: DLDL: Dynamic Label Dictionary Learning via Hypergraph Regularization
- Authors: Shuai Shao and Mengke Wang and Rui Xu and Yan-Jiang Wang and Bao-Di Liu
- Abstract summary: We propose a Dynamic Label Dictionary Learning (DLDL) algorithm to generate the soft label matrix for unlabeled data.
Specifically, we employ hypergraph manifold regularization to keep the relations among original data, transformed data, and soft labels consistent.
- Score: 17.34373273007931
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: For classification tasks, dictionary learning based methods have attracted
lots of attention in recent years. One popular way to achieve this purpose is
to introduce label information to generate a discriminative dictionary to
represent samples. However, compared with traditional dictionary learning, this
category of methods only achieves significant improvements in supervised
learning, and has little positive influence on semi-supervised or unsupervised
learning. To tackle this issue, we propose a Dynamic Label Dictionary Learning
(DLDL) algorithm to generate the soft label matrix for unlabeled data.
Specifically, we employ hypergraph manifold regularization to keep the
relations among original data, transformed data, and soft labels consistent. We
demonstrate the efficiency of the proposed DLDL approach on two remote sensing
datasets.
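As a rough illustration of the hypergraph manifold regularization described above, the sketch below builds a normalized hypergraph Laplacian and propagates known labels to soft labels for unlabeled samples in closed form. This is a minimal toy sketch, not the authors' DLDL implementation; the incidence matrix `H`, the uniform hyperedge weights, and the balance parameter `lam` are all assumptions.

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian (Zhou-style):
    L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}."""
    n, m = H.shape
    if w is None:
        w = np.ones(m)                 # uniform hyperedge weights (assumption)
    dv = H @ w                         # vertex degrees
    de = H.sum(axis=0)                 # hyperedge degrees
    Dv_isqrt = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv_isqrt @ H @ np.diag(w / de) @ H.T @ Dv_isqrt
    return np.eye(n) - Theta

def propagate_soft_labels(H, Y, lam=1.0):
    """Closed-form smoothing: F = (I + lam * L)^{-1} Y,
    trading fidelity to the given labels against hypergraph smoothness."""
    L = hypergraph_laplacian(H)
    n = H.shape[0]
    return np.linalg.solve(np.eye(n) + lam * L, Y)

# toy example: 4 samples, 2 hyperedges, 2 classes
H = np.array([[1., 0.],
              [1., 0.],
              [0., 1.],
              [1., 1.]])
Y = np.array([[1., 0.],   # labeled, class 0
              [0., 0.],   # unlabeled
              [0., 1.],   # labeled, class 1
              [0., 0.]])  # unlabeled
F = propagate_soft_labels(H, Y)   # soft label matrix for all samples
```

Each unlabeled row of `F` picks up mass from the classes of the labeled samples it shares hyperedges with, which is the role the soft label matrix plays in the abstract.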
Related papers
- Co-training for Low Resource Scientific Natural Language Inference [65.37685198688538]
We propose a novel co-training method that assigns weights based on the training dynamics of the classifiers to the distantly supervised labels.
By assigning importance weights instead of filtering out examples based on an arbitrary threshold on the predicted confidence, we maximize the usage of automatically labeled data.
The proposed method obtains an improvement of 1.5% in Macro F1 over the distant supervision baseline, and substantial improvements over several other strong SSL baselines.
arXiv Detail & Related papers (2024-06-20T18:35:47Z) - Description-Enhanced Label Embedding Contrastive Learning for Text
Classification [65.01077813330559]
The paper incorporates Self-Supervised Learning (SSL) into the model learning process and designs a novel self-supervised Relation of Relation (R2) classification task.
It proposes a Relation of Relation Learning Network (R2-Net) for text classification, in which text classification and R2 classification are treated as joint optimization targets.
It also draws on external knowledge from WordNet to obtain multi-aspect descriptions for label semantic learning.
arXiv Detail & Related papers (2023-06-15T02:19:34Z) - Exploring Structured Semantic Prior for Multi Label Recognition with
Incomplete Labels [60.675714333081466]
Multi-label recognition (MLR) with incomplete labels is very challenging.
Recent works strive to explore the image-to-label correspondence in the vision-language model, i.e., CLIP, to compensate for insufficient annotations.
We advocate remedying the deficiency of label supervision for the MLR with incomplete labels by deriving a structured semantic prior.
arXiv Detail & Related papers (2023-03-23T12:39:20Z) - SSDL: Self-Supervised Dictionary Learning [20.925371262076744]
We propose a Self-Supervised Dictionary Learning (SSDL) framework to address this challenge.
Specifically, we first design a $p$-Laplacian Attention Hypergraph Learning block as the pretext task to generate pseudo soft labels for DL.
Then, we adopt the pseudo labels to train a dictionary from a primary label-embedded DL method.
arXiv Detail & Related papers (2021-12-03T08:55:08Z) - Discriminative Dictionary Learning based on Statistical Methods [0.0]
Sparse Representation (SR) of signals or data has a well founded theory with rigorous mathematical error bounds and proofs.
Training dictionaries such that they represent each class of signals with minimal loss is called Dictionary Learning (DL).
MOD and K-SVD have been successfully used in reconstruction-based applications in image processing, such as image denoising and inpainting.
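To make the DL alternation concrete, here is a minimal sketch of the classic MOD recipe: sparse-code the signals with Orthogonal Matching Pursuit, then update the dictionary in closed form as D = X Cᵀ(C Cᵀ)⁻¹. The dimensions, sparsity level `k`, and iteration count are illustrative assumptions, not values from any of the listed papers.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit: greedily select k atoms for signal x."""
    residual = x.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # most correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    c = np.zeros(D.shape[1])
    c[support] = coef
    return c

def mod_update(X, C):
    """MOD dictionary update: D = X C^T (C C^T)^{-1}, atoms renormalized."""
    D = X @ C.T @ np.linalg.pinv(C @ C.T)
    return D / (np.linalg.norm(D, axis=0, keepdims=True) + 1e-12)

rng = np.random.default_rng(0)
X = rng.standard_normal((16, 50))          # 50 signals in R^16
D = rng.standard_normal((16, 24))          # 24 initial atoms
D /= np.linalg.norm(D, axis=0)
for _ in range(5):                         # alternate coding / dictionary update
    C = np.column_stack([omp(D, X[:, i], k=3) for i in range(X.shape[1])])
    D = mod_update(X, C)
err = np.linalg.norm(X - D @ C)            # reconstruction error
```

K-SVD differs only in the update step, refining one atom at a time via an SVD of the restricted residual instead of the single least-squares solve used here.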
arXiv Detail & Related papers (2021-11-17T10:45:10Z) - Deep Semantic Dictionary Learning for Multi-label Image Classification [3.3989824361632337]
We present an innovative path toward solving multi-label image classification by treating it as a dictionary learning task.
A novel end-to-end model named Deep Semantic Dictionary Learning (DSDL) is designed.
Our codes and models have been released.
arXiv Detail & Related papers (2020-12-23T06:22:47Z) - R$^2$-Net: Relation of Relation Learning Network for Sentence Semantic
Matching [58.72111690643359]
We propose a Relation of Relation Learning Network (R2-Net) for sentence semantic matching.
We first employ BERT to encode the input sentences from a global perspective.
Then a CNN-based encoder is designed to capture keywords and phrase information from a local perspective.
To fully leverage labels for better relation information extraction, we introduce a self-supervised relation of relation classification task.
arXiv Detail & Related papers (2020-12-16T13:11:30Z) - SLADE: A Self-Training Framework For Distance Metric Learning [75.54078592084217]
We present a self-training framework, SLADE, to improve retrieval performance by leveraging additional unlabeled data.
We first train a teacher model on the labeled data and use it to generate pseudo labels for the unlabeled data.
We then train a student model on both labels and pseudo labels to generate final feature embeddings.
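The teacher-student recipe summarized above can be sketched in a few lines. This is a simplified classification analog, not SLADE's embedding model: the data, the plain logistic-regression teacher/student, and the 0.9 confidence threshold are all assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# toy data: a small labeled set and a larger unlabeled pool
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_lab, y_lab = X[:60], y[:60]
X_unlab = X[60:]

# 1) train a teacher on the labeled data only
teacher = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)

# 2) pseudo-label the unlabeled pool, keeping only confident predictions
proba = teacher.predict_proba(X_unlab)
keep = proba.max(axis=1) > 0.9             # confidence threshold (assumption)
pseudo_y = proba.argmax(axis=1)[keep]

# 3) train a student on true labels plus confident pseudo labels
X_train = np.vstack([X_lab, X_unlab[keep]])
y_train = np.concatenate([y_lab, pseudo_y])
student = LogisticRegression(max_iter=1000).fit(X_train, y_train)
```

In SLADE itself the student learns feature embeddings for retrieval rather than class probabilities, but the teacher-pseudo-label-student flow is the same.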
arXiv Detail & Related papers (2020-11-20T08:26:10Z) - PseudoSeg: Designing Pseudo Labels for Semantic Segmentation [78.35515004654553]
We present a re-design of pseudo-labeling to generate structured pseudo labels for training with unlabeled or weakly-labeled data.
We demonstrate the effectiveness of the proposed pseudo-labeling strategy in both low-data and high-data regimes.
arXiv Detail & Related papers (2020-10-19T17:59:30Z) - Semi-supervised dictionary learning with graph regularization and active
points [0.19947949439280027]
We propose a new semi-supervised dictionary learning method based on two pillars.
On one hand, we enforce manifold structure preservation from the original data into sparse code space using Locally Linear Embedding.
On the other hand, we train a semi-supervised classifier in sparse code space.
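The first pillar, preserving manifold structure with Locally Linear Embedding, boils down to computing neighborhood reconstruction weights in the original space and penalizing codes that break them. The sketch below is a generic LLE-weight computation plus a preservation penalty; the neighborhood size `k`, the Gram regularizer, and the penalty form are assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def lle_weights(X, k=5, reg=1e-3):
    """Reconstruction weights W, each row summing to 1:
    x_i ~= sum_j W[i, j] x_j over the k nearest neighbors of x_i."""
    n = X.shape[0]
    _, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    W = np.zeros((n, n))
    for i in range(n):
        neigh = idx[i, 1:]                    # drop the point itself
        Z = X[neigh] - X[i]                   # neighbors centered at x_i
        G = Z @ Z.T
        G += reg * np.trace(G) * np.eye(k)    # regularize the Gram matrix
        w = np.linalg.solve(G, np.ones(k))
        W[i, neigh] = w / w.sum()             # enforce sum-to-one constraint
    return W

def manifold_penalty(C, W):
    """||C - W C||_F^2: how well the codes C preserve the local
    reconstruction weights learned from the original data."""
    return np.linalg.norm(C - W @ C) ** 2

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 8))   # toy data: 40 samples in R^8
W = lle_weights(X, k=5)
C = X.copy()                       # placeholder codes for illustration
penalty = manifold_penalty(C, W)
```

In a semi-supervised dictionary learner, a term like `manifold_penalty` would be added to the reconstruction and classifier losses so that the sparse codes inherit the neighborhood geometry of the raw data.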
arXiv Detail & Related papers (2020-09-13T09:24:51Z) - Learning efficient structured dictionary for image classification [11.45863364570225]
We present an efficient structured dictionary learning (ESDL) method which takes both the diversity and label information of training samples into account.
Experimental results on benchmark databases show that ESDL outperforms previous dictionary learning approaches.
More importantly, ESDL can be applied in a wide range of pattern classification tasks.
arXiv Detail & Related papers (2020-02-09T03:12:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.