Affinity-Based Hierarchical Learning of Dependent Concepts for Human
Activity Recognition
- URL: http://arxiv.org/abs/2104.04889v1
- Date: Sun, 11 Apr 2021 01:08:48 GMT
- Authors: Aomar Osmani, Massinissa Hamidi, Pegah Alizadeh
- Abstract summary: We show that organizing overlapping classes into hierarchies considerably improves classification performance.
This is particularly true in the case of activity recognition tasks featured in the SHL dataset.
We propose an approach based on transfer affinity among the classes to determine an optimal hierarchy for the learning process.
- Score: 6.187780920448871
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In multi-class classification tasks, like human activity recognition, it is
often assumed that the classes are separable. In real applications, this assumption
is too strong and generates inconsistencies. Moreover, the most commonly used
approach is to learn each class one-by-one against all the others. This computational
simplification introduces strong inductive biases into the learned
models. In fact, the natural connections among some classes, and not others,
deserve to be taken into account. In this paper, we show that the organization
of overlapping classes (multiple inheritances) into hierarchies considerably
improves classification performance. This is particularly true in the case of
activity recognition tasks featured in the SHL dataset. After theoretically
showing the exponential complexity of possible class hierarchies, we propose an
approach based on transfer affinity among the classes to determine an optimal
hierarchy for the learning process. Extensive experiments show improved
performance and a reduction in the number of examples needed to learn.
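To make the combinatorics concrete: the number of rooted binary hierarchies over n labeled classes is (2n-3)!! (double factorial), already 34,459,425 for n = 10, so exhaustive search over hierarchies is out of reach. The sketch below illustrates the general recipe the abstract describes, under stated assumptions: the paper's exact transfer-affinity measure is not reproduced here, so affinity(i, j) is proxied by how well a one-vs-rest detector trained for class i separates class j, and a hierarchy is then read off an agglomerative clustering of the affinity matrix.

```python
# A minimal sketch (assumptions flagged in comments): proxy "transfer
# affinity" with cross-class detector AUC, then cluster classes into a tree.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = load_digits(return_X_y=True)
classes = np.unique(y)
n = len(classes)

# One-vs-rest detector per class (trained and scored on the same data,
# for brevity only).
models = {c: LogisticRegression(max_iter=1000).fit(X, (y == c).astype(int))
          for c in classes}

# affinity[i, j]: how well class i's detector separates class j from the
# rest -- an assumed stand-in for the paper's transfer affinity.
affinity = np.zeros((n, n))
for i, ci in enumerate(classes):
    scores = models[ci].decision_function(X)
    for j, cj in enumerate(classes):
        affinity[i, j] = roc_auc_score((y == cj).astype(int), scores)

# Symmetrize, turn into a distance, and read a hierarchy off an
# agglomerative clustering of the classes.
dist = 1.0 - (affinity + affinity.T) / 2.0
np.fill_diagonal(dist, 0.0)
Z = linkage(dist[np.triu_indices(n, k=1)], method="average")
print(dendrogram(Z, no_plot=True)["ivl"])  # leaf order of the induced tree
```

Average linkage is an arbitrary choice here; the paper selects the hierarchy to serve the downstream learning process rather than a fixed linkage criterion.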
Related papers
- On Transfer in Classification: How Well do Subsets of Classes
Generalize? [6.38421840998693]
In classification, it is common to observe that models trained on a given set of classes can generalize to previously unseen ones.
This ability is often leveraged in the context of transfer learning where a pretrained model can be used to process new classes.
In this work, we are interested in laying the foundations of a theoretical framework for transferability between sets of classes.
arXiv Detail & Related papers (2024-03-06T09:25:22Z)
- TaxoKnow: Taxonomy as Prior Knowledge in the Loss Function of Multi-class Classification [1.130757825611188]
We introduce two methods to integrate the hierarchical taxonomy as an explicit regularizer into the loss function of learning algorithms.
By reasoning over the hierarchical taxonomy, a neural network regularizes its output distribution over the classes, allowing a minority class to be conditioned on its upper-level concepts.
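A minimal sketch of the general taxonomy-as-regularizer idea (not necessarily TaxoKnow's exact loss; the toy taxonomy, parent grouping, and weighting factor lam below are illustrative assumptions):

```python
# Aggregate fine-class probabilities into their parent concepts and add a
# parent-level cross-entropy term, so minority classes still receive
# gradient signal through the upper concept they belong to.
import torch
import torch.nn.functional as F

# Hypothetical taxonomy: 6 fine classes grouped under 2 parent concepts.
parent_of = torch.tensor([0, 0, 0, 1, 1, 1])        # fine index -> parent
M = F.one_hot(parent_of, num_classes=2).float()     # (6, 2) aggregation map

def hierarchical_loss(logits, fine_targets, lam=0.5):
    ce_fine = F.cross_entropy(logits, fine_targets)
    parent_probs = logits.softmax(dim=1) @ M        # sum probs per parent
    parent_targets = parent_of[fine_targets]
    ce_parent = F.nll_loss(parent_probs.clamp_min(1e-12).log(), parent_targets)
    return ce_fine + lam * ce_parent

logits = torch.randn(4, 6, requires_grad=True)
targets = torch.tensor([0, 2, 3, 5])
hierarchical_loss(logits, targets).backward()       # gradients reach logits
```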
arXiv Detail & Related papers (2023-05-24T08:08:56Z)
- Synergies between Disentanglement and Sparsity: Generalization and Identifiability in Multi-Task Learning [79.83792914684985]
We prove a new identifiability result that provides conditions under which maximally sparse base-predictors yield disentangled representations.
Motivated by this theoretical result, we propose a practical approach to learn disentangled representations based on a sparsity-promoting bi-level optimization problem.
arXiv Detail & Related papers (2022-11-26T21:02:09Z)
- Long-tail Recognition via Compositional Knowledge Transfer [60.03764547406601]
We introduce a novel strategy for long-tail recognition that addresses the tail classes' few-shot problem.
Our objective is to transfer knowledge acquired from information-rich common classes to semantically similar, and yet data-hungry, rare classes.
Experiments show that our approach can achieve significant performance boosts on rare classes while maintaining robust common class performance.
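One simple reading of this head-to-tail transfer, as a hedged sketch rather than the paper's exact method: enrich a rare class's few-shot prototype with a similarity-weighted mixture of prototypes from common classes. The prototype dimensionality, softmax weighting, and mixing factor alpha are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
head_protos = rng.normal(size=(5, 16))   # prototypes of data-rich classes
tail_proto = rng.normal(size=16)         # noisy prototype from a few shots

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Softmax over similarities decides how much each common class contributes.
sims = np.array([cosine(tail_proto, h) for h in head_protos])
weights = np.exp(sims) / np.exp(sims).sum()

alpha = 0.7  # trust placed in the few-shot estimate (illustrative)
enriched = alpha * tail_proto + (1 - alpha) * weights @ head_protos
print(enriched.shape)  # (16,) -> classifier weight for the rare class
```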
arXiv Detail & Related papers (2021-12-13T15:48:59Z)
- Open-Set Representation Learning through Combinatorial Embedding [62.05670732352456]
We are interested in identifying novel concepts in a dataset through representation learning based on the examples in both labeled and unlabeled classes.
We propose a learning approach, which naturally clusters examples in unseen classes using the compositional knowledge given by multiple supervised meta-classifiers on heterogeneous label spaces.
The proposed algorithm discovers novel concepts via a joint optimization that enhances the discriminativeness of unseen classes while learning representations of known classes that generalize to novel ones.
arXiv Detail & Related papers (2021-06-29T11:51:57Z)
- ECKPN: Explicit Class Knowledge Propagation Network for Transductive Few-shot Learning [53.09923823663554]
Class-level knowledge can be easily learned by humans from just a handful of samples.
We propose an Explicit Class Knowledge Propagation Network (ECKPN) to address this problem.
We conduct extensive experiments on four few-shot classification benchmarks, and the experimental results show that the proposed ECKPN significantly outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2021-06-16T02:29:43Z)
- Inducing a hierarchy for multi-class classification problems [11.58041597483471]
In applications where categorical labels follow a natural hierarchy, classification methods that exploit the label structure often outperform those that do not.
In this paper, we investigate a class of methods that induce a hierarchy that can similarly improve classification performance over flat classifiers.
We demonstrate the effectiveness of the class of methods both for discovering a latent hierarchy and for improving accuracy in principled simulation settings and three real data applications.
arXiv Detail & Related papers (2021-02-20T05:40:42Z)
- Pitfalls of Assessing Extracted Hierarchies for Multi-Class Classification [4.89253144446913]
We identify some common pitfalls that may lead practitioners to draw misleading conclusions about their methods.
We show how the hierarchy's quality can become irrelevant depending on the experimental setup.
Our results confirm that datasets with a high number of classes generally present complex structures in how these classes relate to each other.
arXiv Detail & Related papers (2021-01-26T21:50:57Z)
- Theoretical Insights Into Multiclass Classification: A High-dimensional Asymptotic View [82.80085730891126]
We provide the first modern, precise analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z)
- Learning and Evaluating Representations for Deep One-class Classification [59.095144932794646]
We present a two-stage framework for deep one-class classification.
We first learn self-supervised representations from one-class data, and then build one-class classifiers on learned representations.
In experiments, we demonstrate state-of-the-art performance on visual domain one-class classification benchmarks.
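A compressed sketch of the two-stage recipe: obtain a representation, then fit an off-the-shelf one-class classifier on it. The paper learns the first stage with self-supervision; here it is stubbed out with PCA for brevity, and the digits setup and OneClassSVM hyperparameters are illustrative assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.svm import OneClassSVM

X, y = load_digits(return_X_y=True)
inliers, outliers = X[y == 0], X[y != 0]   # digit 0 plays the "normal" class

# Stage 1 (stubbed): a fixed feature extractor; the paper instead learns
# this stage self-supervised from one-class data.
feats = PCA(n_components=16).fit(inliers)

# Stage 2: a one-class classifier on the learned representations.
clf = OneClassSVM(nu=0.1, gamma="scale").fit(feats.transform(inliers))

print("inlier  acc:", (clf.predict(feats.transform(inliers)) == 1).mean())
print("outlier acc:", (clf.predict(feats.transform(outliers)) == -1).mean())
```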
arXiv Detail & Related papers (2020-11-04T23:33:41Z)
- Evolutionary Simplicial Learning as a Generative and Compact Sparse Framework for Classification [0.0]
Simplicial learning is an adaptation of dictionary learning, where subspaces become clipped and acquire arbitrary offsets.
This paper proposes an evolutionary simplicial learning method as a generative and compact sparse framework for classification.
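To make the "clipped subspaces with offsets" intuition concrete, a minimal sketch of the coding step only (the evolutionary search is not sketched): represent a point as a convex combination of dictionary atoms, i.e. constrain the coefficients to a simplex. The projected-gradient solver and the toy dictionary are assumptions for illustration.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection onto the probability simplex (Duchi et al., 2008).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0)

def simplex_code(x, D, steps=2000, lr=0.02):
    # Projected gradient on min_w ||x - D @ w||^2 s.t. w on the simplex.
    w = np.full(D.shape[1], 1 / D.shape[1])      # start at the simplex center
    for _ in range(steps):
        grad = D.T @ (D @ w - x)
        w = project_simplex(w - lr * grad)
    return w

rng = np.random.default_rng(1)
D = rng.normal(size=(8, 4))                      # 4 atoms in 8 dimensions
x = D @ np.array([0.6, 0.4, 0.0, 0.0])           # a point inside the simplex
print(simplex_code(x, D))                        # approx. [0.6, 0.4, 0.0, 0.0]
```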
arXiv Detail & Related papers (2020-05-14T15:44:56Z)