Graph based Label Enhancement for Multi-instance Multi-label learning
- URL: http://arxiv.org/abs/2304.10705v1
- Date: Fri, 21 Apr 2023 02:24:49 GMT
- Title: Graph based Label Enhancement for Multi-instance Multi-label learning
- Authors: Houcheng Su, Jintao Huang, Daixian Liu, Rui Yan, Jiao Li, Chi-man Vong
- Abstract summary: Multi-instance multi-label (MIML) learning is widely applied in numerous domains.
This paper proposes a novel MIML framework based on graph label enhancement, namely GLEMIML, to improve the classification performance of MIML.
- Score: 20.178466198202376
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multi-instance multi-label (MIML) learning is widely applied in numerous
domains, such as image classification, where one image contains multiple
instances correlated with multiple logical labels simultaneously. The related
labels in existing MIML are all assumed to be logical labels of equal
significance. However, in practical MIML applications, the significance of each
label for the multiple instances per bag (such as an image) differs
significantly. Ignoring label significance discards much of the semantic
information of the object, so MIML performs poorly and becomes inapplicable in
complex scenes. To this end, this paper proposes a novel MIML
framework based on graph label enhancement, namely GLEMIML, to improve the
classification performance of MIML by leveraging label significance. GLEMIML
first recognizes the correlations among instances by establishing the graph and
then migrates the implicit information mined from the feature space to the
label space via nonlinear mapping, thus recovering the label significance.
Finally, GLEMIML is trained on the enhanced data through matching and
interaction mechanisms. GLEMIML (AvgRank: 1.44) effectively improves the
performance of MIML by mining the label distribution mechanism and shows better
results than the SOTA method (AvgRank: 2.92) on multiple benchmark datasets.
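The label-enhancement step described above (build a graph over instances, then map implicit feature-space structure into real-valued label significance) can be illustrated with a minimal sketch. This is not the actual GLEMIML algorithm: the kNN similarity graph, the Gaussian bandwidth heuristic, and the label-propagation update with restart are all assumed stand-ins for the paper's graph construction and nonlinear mapping.

```python
import numpy as np

def label_enhancement(X, Y, k=2, alpha=0.5, iters=20):
    """Illustrative graph-based label enhancement (assumed sketch, not GLEMIML):
    propagate logical labels over a kNN similarity graph built from instance
    features, so the recovered real-valued labels reflect each label's
    relative significance.

    X: (n, d) instance feature matrix; Y: (n, c) logical (0/1) label matrix.
    Returns an (n, c) matrix whose rows are label distributions.
    """
    n = X.shape[0]
    # pairwise squared distances between instances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Gaussian similarity with a median-distance bandwidth (assumed heuristic)
    sigma2 = np.median(d2) + 1e-12
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)
    # keep only each instance's k most similar neighbours
    for i in range(n):
        drop = np.argsort(W[i])[:-k]
        W[i, drop] = 0.0
    W = np.maximum(W, W.T)  # symmetrize the graph
    # row-stochastic transition matrix for propagation
    P = W / (W.sum(1, keepdims=True) + 1e-12)
    # propagate labels with restart toward the original logical labels
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * (P @ F) + (1 - alpha) * Y
    # normalize each row into a label distribution (label significance)
    return F / (F.sum(1, keepdims=True) + 1e-12)
```

On two well-separated clusters, the recovered distribution keeps most mass on each bag's original label while reflecting graph structure, which is the qualitative behavior the abstract describes.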
Related papers
- Exploiting Conjugate Label Information for Multi-Instance Partial-Label Learning [61.00359941983515]
Multi-instance partial-label learning (MIPL) addresses scenarios where each training sample is represented as a multi-instance bag associated with a candidate label set containing one true label and several false positives.
ELIMIPL exploits the conjugate label information to improve the disambiguation performance.
arXiv Detail & Related papers (2024-08-26T15:49:31Z) - Scalable Label Distribution Learning for Multi-Label Classification [43.52928088881866]
Multi-label classification (MLC) refers to the problem of tagging a given instance with a set of relevant labels.
Most existing MLC methods are based on the assumption that the correlation of two labels in each label pair is symmetric.
Most existing methods design learning processes associated with the number of labels, which makes their computational complexity a bottleneck when scaling up to large-scale output space.
arXiv Detail & Related papers (2023-11-28T06:52:53Z) - Disambiguated Attention Embedding for Multi-Instance Partial-Label Learning [68.56193228008466]
In many real-world tasks, the concerned objects can be represented as a multi-instance bag associated with a candidate label set.
Existing MIPL approaches follow the instance-space paradigm by assigning augmented candidate label sets of bags to each instance and aggregating bag-level labels from instance-level labels.
We propose an intuitive algorithm named DEMIPL, i.e., Disambiguated attention Embedding for Multi-Instance Partial-Label learning.
arXiv Detail & Related papers (2023-05-26T13:25:17Z) - Complementary to Multiple Labels: A Correlation-Aware Correction Approach [65.59584909436259]
We show theoretically how the estimated transition matrix in multi-class CLL could be distorted in multi-labeled cases.
We propose a two-step method to estimate the transition matrix from candidate labels.
arXiv Detail & Related papers (2023-02-25T04:48:48Z) - Multi-Instance Partial-Label Learning: Towards Exploiting Dual Inexact Supervision [53.530957567507365]
In some real-world tasks, each training sample is associated with a candidate label set that contains one ground-truth label and some false positive labels.
In this paper, we formalize such problems as multi-instance partial-label learning (MIPL).
Existing multi-instance learning algorithms and partial-label learning algorithms are suboptimal for solving MIPL problems.
arXiv Detail & Related papers (2022-12-18T03:28:51Z) - Class-Incremental Lifelong Learning in Multi-Label Classification [3.711485819097916]
This paper studies Lifelong Multi-Label (LML) classification, which builds an online class-incremental classifier in a sequential multi-label classification data stream.
To solve the problem, the study proposes an Augmented Graph Convolutional Network (AGCN) with a built Augmented Correlation Matrix (ACM) across sequential partial-label tasks.
arXiv Detail & Related papers (2022-07-16T05:14:07Z) - Large Loss Matters in Weakly Supervised Multi-Label Classification [50.262533546999045]
We first regard unobserved labels as negative labels, casting the weakly supervised multi-label (WSML) task into noisy multi-label classification.
We propose novel methods for WSML which reject or correct the large-loss samples to prevent the model from memorizing the noisy labels.
Our methodology works well in practice, validating that treating large loss properly matters in weakly supervised multi-label classification.
arXiv Detail & Related papers (2022-06-08T08:30:24Z) - Structured Semantic Transfer for Multi-Label Recognition with Partial Labels [85.6967666661044]
We propose a structured semantic transfer (SST) framework that enables training multi-label recognition models with partial labels.
The framework consists of two complementary transfer modules that explore within-image and cross-image semantic correlations.
Experiments on the Microsoft COCO, Visual Genome and Pascal VOC datasets show that the proposed SST framework obtains superior performance over current state-of-the-art algorithms.
arXiv Detail & Related papers (2021-12-21T02:15:01Z) - MetaMIML: Meta Multi-Instance Multi-Label Learning [27.32606468640938]
We propose a network embedding and meta learning based approach to mine interdependent MIML objects of different types.
Experiments on benchmark datasets demonstrate that MetaMIML achieves a significantly better performance than state-of-the-art algorithms.
arXiv Detail & Related papers (2021-11-07T15:54:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.