Learning efficient structured dictionary for image classification
- URL: http://arxiv.org/abs/2002.03271v2
- Date: Fri, 8 May 2020 01:58:51 GMT
- Title: Learning efficient structured dictionary for image classification
- Authors: Zi-Qi Li, Jun Sun, Xiao-Jun Wu and He-Feng Yin
- Abstract summary: We present an efficient structured dictionary learning (ESDL) method which takes both the diversity and label information of training samples into account.
Experimental results on benchmark databases show that ESDL outperforms previous dictionary learning approaches.
More importantly, ESDL can be applied in a wide range of pattern classification tasks.
- Score: 11.45863364570225
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent years have witnessed the success of dictionary learning (DL) based
approaches in the domain of pattern classification. In this paper, we present
an efficient structured dictionary learning (ESDL) method which takes both the
diversity and label information of training samples into account. Specifically,
ESDL introduces alternative training samples into the process of dictionary
learning. To increase the discriminative capability of representation
coefficients for classification, an ideal regularization term is incorporated
into the objective function of ESDL. Moreover, in contrast with conventional DL
approaches which impose computationally expensive L1-norm constraint on the
coefficient matrix, ESDL employs L2-norm regularization term. Experimental
results on benchmark databases (including four face databases and one scene
dataset) demonstrate that ESDL outperforms previous DL approaches. More
importantly, ESDL can be applied in a wide range of pattern classification
tasks.
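The abstract's key computational point is that replacing the L1-norm constraint on the coefficient matrix with an L2-norm regularizer makes the coding step a least-squares problem with a closed-form solution. The following is a minimal sketch of such an update, assuming an objective of the form ||X - DA||_F^2 + alpha||Q - A||_F^2 + beta||A||_F^2, where Q is an ideal (label-derived) code matrix; the variable names and weights alpha, beta are illustrative, not taken from the paper.

```python
import numpy as np

def esdl_coefficients(X, D, Q, alpha=1.0, beta=1.0):
    """Closed-form coding step (sketch, not the paper's exact formulation):
    min_A ||X - D A||_F^2 + alpha ||Q - A||_F^2 + beta ||A||_F^2.

    Setting the gradient w.r.t. A to zero gives the linear system
    (D^T D + (alpha + beta) I) A = D^T X + alpha Q, solved directly below.
    """
    k = D.shape[1]  # number of dictionary atoms
    lhs = D.T @ D + (alpha + beta) * np.eye(k)
    rhs = D.T @ X + alpha * Q
    return np.linalg.solve(lhs, rhs)
```

Because the system matrix is positive definite for beta > 0, this step is both unique and cheap, which is where the claimed efficiency over iterative L1 solvers comes from.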
Related papers
- Pairwise Difference Learning for Classification [19.221081896134567]
Pairwise difference learning (PDL) has recently been introduced as a new meta-learning technique for regression.
We extend PDL toward the task of classification by solving a suitably defined (binary) classification problem on a paired version of the original training data.
We provide an easy-to-use and publicly available implementation of PDL in a Python package.
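The PDL summary above describes turning classification into a binary problem on a paired version of the training data. A minimal sketch of that pairing step, assuming each ordered pair (x_i, x_j) becomes one example with concatenated features and a "same class" label (the paper's actual pairing and feature scheme may differ):

```python
import numpy as np

def make_pairs(X, y):
    """Build the paired dataset for a PDL-style binary classifier (sketch).

    Each ordered pair of distinct training points yields one example:
    features are the concatenation [x_i, x_j], and the label is 1 if the
    two points share a class, 0 otherwise.
    """
    feats, labels = [], []
    n = len(X)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            feats.append(np.concatenate([X[i], X[j]]))
            labels.append(int(y[i] == y[j]))
    return np.array(feats), np.array(labels)
```

At prediction time, a classifier trained on these pairs can score a query against labeled anchors and aggregate the same-class votes per class; note the paired dataset grows quadratically in the number of training points.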
arXiv Detail & Related papers (2024-06-28T16:20:22Z) - Co-training for Low Resource Scientific Natural Language Inference [65.37685198688538]
We propose a novel co-training method that assigns weights based on the training dynamics of the classifiers to the distantly supervised labels.
By assigning importance weights instead of filtering out examples based on an arbitrary threshold on the predicted confidence, we maximize the usage of automatically labeled data.
The proposed method obtains an improvement of 1.5% in Macro F1 over the distant supervision baseline, and substantial improvements over several other strong SSL baselines.
arXiv Detail & Related papers (2024-06-20T18:35:47Z) - RAR: Retrieving And Ranking Augmented MLLMs for Visual Recognition [78.97487780589574]
Multimodal Large Language Models (MLLMs) excel at classifying fine-grained categories.
This paper introduces a Retrieving And Ranking augmented method for MLLMs.
Our proposed approach not only addresses the inherent limitations in fine-grained recognition but also preserves the model's comprehensive knowledge base.
arXiv Detail & Related papers (2024-03-20T17:59:55Z) - Deep Dictionary Learning with An Intra-class Constraint [23.679645826983503]
We propose a novel deep dictionary learning model with an intra-class constraint (DDLIC) for visual classification.
Specifically, we design the intra-class compactness constraint on the intermediate representation at different levels to encourage the intra-class representations to be closer to each other.
Unlike the traditional DDL methods, during the classification stage, our DDLIC performs a layer-wise greedy optimization in a similar way to the training stage.
arXiv Detail & Related papers (2022-07-14T11:54:58Z) - Supervised Dictionary Learning with Auxiliary Covariates [0.0]
Supervised dictionary learning (SDL) is a machine learning method that simultaneously seeks feature extraction and classification tasks.
We provide a novel framework that "lifts" SDL as a convex problem in a combined factor space.
We apply SDL to imbalanced document classification via supervised topic modeling and to pneumonia detection from chest X-ray images.
arXiv Detail & Related papers (2022-06-14T12:10:03Z) - Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution [124.99894592871385]
We present a large-scale comparative study of lexical substitution methods employing both old and most recent language models.
We show that already competitive results achieved by SOTA LMs/MLMs can be further substantially improved if information about the target word is injected properly.
arXiv Detail & Related papers (2022-06-07T16:16:19Z) - A Multi-level Supervised Contrastive Learning Framework for Low-Resource Natural Language Inference [54.678516076366506]
Natural Language Inference (NLI) is an increasingly important task in natural language understanding.
Here we propose a multi-level supervised contrastive learning framework named MultiSCL for low-resource natural language inference.
arXiv Detail & Related papers (2022-05-31T05:54:18Z) - Better Language Model with Hypernym Class Prediction [101.8517004687825]
Class-based language models (LMs) have been long devised to address context sparsity in $n$-gram LMs.
In this study, we revisit this approach in the context of neural LMs.
arXiv Detail & Related papers (2022-03-21T01:16:44Z) - Deep Semantic Dictionary Learning for Multi-label Image Classification [3.3989824361632337]
We present an innovative path towards the solution of the multi-label image classification which considers it as a dictionary learning task.
A novel end-to-end model named Deep Semantic Dictionary Learning (DSDL) is designed.
Our codes and models have been released.
arXiv Detail & Related papers (2020-12-23T06:22:47Z) - DLDL: Dynamic Label Dictionary Learning via Hypergraph Regularization [17.34373273007931]
We propose a Dynamic Label Dictionary Learning (DLDL) algorithm to generate the soft label matrix for unlabeled data.
Specifically, we employ hypergraph manifold regularization to keep the relations among original data, transformed data, and soft labels consistent.
arXiv Detail & Related papers (2020-10-23T14:07:07Z) - A Comparative Study of Lexical Substitution Approaches based on Neural Language Models [117.96628873753123]
We present a large-scale comparative study of popular neural language and masked language models.
We show that already competitive results achieved by SOTA LMs/MLMs can be further improved if information about the target word is injected properly.
arXiv Detail & Related papers (2020-05-29T18:43:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.