Generalized Inter-class Loss for Gait Recognition
- URL: http://arxiv.org/abs/2210.06779v1
- Date: Thu, 13 Oct 2022 06:44:53 GMT
- Title: Generalized Inter-class Loss for Gait Recognition
- Authors: Weichen Yu, Hongyuan Yu, Yan Huang, Liang Wang
- Abstract summary: Gait recognition is a unique biometric technique that can be performed at a long distance non-cooperatively.
Previous gait works focus more on minimizing intra-class variance while ignoring the significance of constraining inter-class variance.
We propose a generalized inter-class loss which resolves the inter-class variance from both sample-level feature distribution and class-level feature distribution.
- Score: 11.15855312510806
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gait recognition is a unique biometric technique that can be performed at a
long distance non-cooperatively and has broad applications in public safety and
intelligent traffic systems. Previous gait works focus more on minimizing the
intra-class variance while ignoring the significance of constraining
inter-class variance. To this end, we propose a generalized inter-class loss
which resolves the inter-class variance from both sample-level feature
distribution and class-level feature distribution. Instead of equal penalty
strength on pair scores, the proposed loss optimizes sample-level inter-class
feature distribution by dynamically adjusting the pairwise weight. Further, in
class-level distribution, generalized inter-class loss adds a constraint on the
uniformity of inter-class feature distribution, which forces the feature
representations to approximate a hypersphere and keep maximal inter-class
variance. In addition, the proposed method automatically adjusts the margin
between classes, which enables the inter-class feature distribution to be more
flexible. The proposed method can be generalized to different gait recognition
networks and achieves significant improvements. We conduct a series of
experiments on CASIA-B and OUMVLP, and the experimental results show that the
proposed loss significantly improves performance and achieves state-of-the-art
results.
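To make the abstract's two components concrete, below is a minimal PyTorch-style sketch, not the authors' released implementation: the dynamic pairwise weighting is approximated by a softmax over negative-pair distances, the class-level constraint by a standard hypersphere-uniformity term over class centroids, and the fixed `margin`, temperature `t`, and balance weight `lam` are assumed placeholder hyperparameters (the paper's automatic margin adjustment is omitted).

```python
import torch
import torch.nn.functional as F


def generalized_inter_class_loss(features, labels, margin=1.0, t=2.0, lam=0.1):
    """features: (N, D) gait embeddings; labels: (N,) integer class ids.
    Sketch only -- assumes the batch contains at least two classes."""
    feats = F.normalize(features, dim=1)               # embeddings on the unit hypersphere
    dist = torch.cdist(feats, feats)                   # pairwise Euclidean distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-class mask

    # Sample-level term: instead of an equal penalty on every negative pair,
    # weight each pair via a softmax over -distance, so closer (harder)
    # negatives receive larger weights (assumed form of the dynamic weighting).
    logits = (-dist).masked_fill(same, float('-inf'))
    weights = torch.softmax(logits, dim=1)             # zero weight on same-class pairs
    sample_level = (weights * F.relu(margin - dist)).sum(dim=1).mean()

    # Class-level term: spread class centroids uniformly over the hypersphere
    # via the log of the mean Gaussian potential between distinct centroids.
    centroids = torch.stack([feats[labels == c].mean(dim=0) for c in labels.unique()])
    centroids = F.normalize(centroids, dim=1)
    sq = torch.cdist(centroids, centroids).pow(2)
    n = centroids.size(0)
    off_diag = sq[~torch.eye(n, dtype=torch.bool, device=sq.device)]
    uniformity = torch.log(torch.exp(-t * off_diag).mean())

    return sample_level + lam * uniformity
```

Normalizing the embeddings before computing distances is what lets the class-level term interpret "maximal inter-class variance" as a uniform spread over the unit hypersphere, matching the abstract's description.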
Related papers
- Anti-Collapse Loss for Deep Metric Learning Based on Coding Rate Metric [99.19559537966538]
Deep metric learning (DML) aims to learn a discriminative high-dimensional embedding space for downstream tasks like classification, clustering, and retrieval.
To maintain the structure of embedding space and avoid feature collapse, we propose a novel loss function called Anti-Collapse Loss.
Comprehensive experiments on benchmark datasets demonstrate that our proposed method outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-07-03T13:44:20Z) - Boosting Few-Shot Learning via Attentive Feature Regularization [35.4031662352264]
Few-shot learning (FSL) based on manifold regularization aims to improve the recognition capacity of novel objects with limited training samples.
This paper proposes attentive feature regularization (AFR), which aims to improve feature representativeness and discriminability.
arXiv Detail & Related papers (2024-03-23T14:36:48Z) - A Novel Cross-Perturbation for Single Domain Generalization [54.612933105967606]
Single domain generalization aims to enhance the ability of the model to generalize to unknown domains when trained on a single source domain.
The limited diversity in the training data hampers the learning of domain-invariant features, resulting in compromised generalization performance.
We propose CPerb, a simple yet effective cross-perturbation method to enhance the diversity of the training data.
arXiv Detail & Related papers (2023-08-02T03:16:12Z) - Learning Prompt-Enhanced Context Features for Weakly-Supervised Video Anomaly Detection [37.99031842449251]
Video anomaly detection under weak supervision presents significant challenges.
We present a weakly supervised anomaly detection framework that focuses on efficient context modeling and enhanced semantic discriminability.
Our approach significantly improves the detection accuracy of certain anomaly sub-classes, underscoring its practical value and efficacy.
arXiv Detail & Related papers (2023-06-26T06:45:16Z) - Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of semi-supervised learning (SSL) and domain adaptation (DA).
arXiv Detail & Related papers (2021-12-12T06:11:16Z) - Improving Music Performance Assessment with Contrastive Learning [78.8942067357231]
This study investigates contrastive learning as a potential method to improve existing MPA systems.
We introduce a weighted contrastive loss suitable for regression tasks applied to a convolutional neural network.
Our results show that contrastive-based methods are able to match and exceed SoTA performance for MPA regression tasks.
arXiv Detail & Related papers (2021-08-03T19:24:25Z) - Intra-Class Uncertainty Loss Function for Classification [6.523198497365588]
Intra-class uncertainty/variability is not considered in existing methods, especially for datasets containing unbalanced classes.
In our framework, the features extracted by deep networks of each class are characterized by independent Gaussian distribution.
The proposed approach shows improved classification performance, through learning a better class representation.
arXiv Detail & Related papers (2021-04-12T09:02:41Z) - Margin Preserving Self-paced Contrastive Learning Towards Domain Adaptation for Medical Image Segmentation [51.93711960601973]
We propose a novel margin preserving self-paced contrastive learning (MPSCL) model for cross-modal medical image segmentation.
With the guidance of progressively refined semantic prototypes, a novel margin preserving contrastive loss is proposed to boost the discriminability of embedded representation space.
Experiments on cross-modal cardiac segmentation tasks demonstrate that MPSCL significantly improves semantic segmentation performance.
arXiv Detail & Related papers (2021-03-15T15:23:10Z) - Unsupervised Domain Adaptation in Semantic Segmentation via Orthogonal and Clustered Embeddings [25.137859989323537]
We propose an effective Unsupervised Domain Adaptation (UDA) strategy, based on a feature clustering method.
We introduce two novel learning objectives to enhance the discriminative clustering performance.
arXiv Detail & Related papers (2020-11-25T10:06:22Z) - Beyond cross-entropy: learning highly separable feature distributions for robust and accurate classification [22.806324361016863]
We propose a novel approach for training deep robust multiclass classifiers that provides adversarial robustness.
We show that the regularization of the latent space based on our approach yields excellent classification accuracy.
arXiv Detail & Related papers (2020-10-29T11:15:17Z) - Generalized Zero-Shot Learning Via Over-Complete Distribution [79.5140590952889]
We propose to generate an Over-Complete Distribution (OCD) of both seen and unseen classes using a Conditional Variational Autoencoder (CVAE).
The effectiveness of the framework is evaluated using both Zero-Shot Learning and Generalized Zero-Shot Learning protocols.
arXiv Detail & Related papers (2020-04-01T19:05:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.