A Deep Positive-Negative Prototype Approach to Integrated Prototypical Discriminative Learning
- URL: http://arxiv.org/abs/2501.02477v1
- Date: Sun, 05 Jan 2025 08:24:31 GMT
- Title: A Deep Positive-Negative Prototype Approach to Integrated Prototypical Discriminative Learning
- Authors: Ramin Zarei-Sabzevar, Ahad Harati
- Abstract summary: This paper proposes a novel Deep Positive-Negative Prototype (DPNP) model that combines prototype-based learning (PbL) with discriminative methods to improve class compactness and separability in deep neural networks.
We show that DPNP can organize prototypes in nearly regular positions within feature space, such that it is possible to achieve competitive classification accuracy even in much lower-dimensional feature spaces.
- Score: 0.30693357740321775
- Abstract: This paper proposes a novel Deep Positive-Negative Prototype (DPNP) model that combines prototype-based learning (PbL) with discriminative methods to improve class compactness and separability in deep neural networks. While PbL traditionally emphasizes interpretability by classifying samples based on their similarity to representative prototypes, it struggles with creating optimal decision boundaries in complex scenarios. Conversely, discriminative methods effectively separate classes but often lack intuitive interpretability. Toward exploiting advantages of these two approaches, the suggested DPNP model bridges between them by unifying class prototypes with weight vectors, thereby establishing a structured latent space that enables accurate classification using interpretable prototypes alongside a properly learned feature representation. Based on this central idea of unified prototype-weight representation, Deep Positive Prototype (DPP) is formed in the latent space as a representative for each class using off-the-shelf deep networks as feature extractors. Then, rival neighboring class DPPs are treated as implicit negative prototypes with repulsive force in DPNP, which push away DPPs from each other. This helps to enhance inter-class separation without the need for any extra parameters. Hence, through a novel loss function that integrates cross-entropy, prototype alignment, and separation terms, DPNP achieves well-organized feature space geometry, maximizing intra-class compactness and inter-class margins. We show that DPNP can organize prototypes in nearly regular positions within feature space, such that it is possible to achieve competitive classification accuracy even in much lower-dimensional feature spaces. Experimental results on several datasets demonstrate that DPNP outperforms state-of-the-art models, while using smaller networks.
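The loss described in the abstract combines three terms: cross-entropy over similarities to the unified prototype/weight vectors, an alignment term pulling each sample toward its own (positive) class prototype, and a separation term pushing each prototype away from its nearest rival, which acts as the implicit negative prototype. A minimal NumPy sketch of this idea follows; the function name `dpnp_loss`, the weighting coefficients `lam_align` and `lam_sep`, and the exact form of each term (squared-Euclidean alignment, nearest-rival separation) are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def dpnp_loss(features, labels, prototypes, lam_align=0.1, lam_sep=0.1):
    """Sketch of a DPNP-style loss (assumed formulation, not the paper's exact one).

    features:   (B, D) embeddings from any off-the-shelf backbone
    labels:     (B,)   integer class indices
    prototypes: (C, D) class prototypes that double as classifier weight vectors
    """
    # Cross-entropy over logits given by similarity to the unified
    # prototype/weight vectors (numerically stable log-softmax).
    logits = features @ prototypes.T                       # (B, C)
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(labels)), labels].mean()

    # Alignment: pull each sample toward its own positive prototype.
    pos = prototypes[labels]                               # (B, D)
    align = ((features - pos) ** 2).sum(axis=1).mean()

    # Separation: treat the nearest rival prototype as an implicit negative
    # prototype; maximizing that distance = minimizing its negative.
    diff = prototypes[:, None, :] - prototypes[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))               # (C, C) pairwise distances
    np.fill_diagonal(dist, np.inf)
    sep = -dist.min(axis=1).mean()

    return ce + lam_align * align + lam_sep * sep
```

Because the separation term uses only the existing prototypes as repulsors, it adds no extra parameters, matching the abstract's claim that inter-class separation is improved "without the need for any extra parameters".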
Related papers
- Learning Clustering-based Prototypes for Compositional Zero-shot Learning [56.57299428499455]
ClusPro is a robust clustering-based prototype mining framework for Compositional Zero-Shot Learning.
It defines the conceptual boundaries of primitives through a set of diversified prototypes.
ClusPro efficiently performs prototype clustering in a non-parametric fashion without the introduction of additional learnable parameters.
arXiv Detail & Related papers (2025-02-10T14:20:01Z)
- An Enhanced Federated Prototype Learning Method under Domain Shift [36.73020712815063]
Federated Learning (FL) allows collaborative machine learning training without sharing private data.
The paper introduces variance-aware dual-level prototype clustering and a novel $\alpha$-sparsity prototype loss.
Evaluations on the Digit-5, Office-10, and DomainNet datasets show that our method performs better than existing approaches.
arXiv Detail & Related papers (2024-09-27T09:28:27Z)
- Prototype Fission: Closing Set for Robust Open-set Semi-supervised Learning [6.645479471664253]
Semi-supervised Learning (SSL) has been proven vulnerable to out-of-distribution (OOD) samples in realistic large-scale unsupervised datasets.
We propose Prototype Fission (PF) to divide class-wise latent spaces into compact sub-spaces by automatic fine-grained latent space mining.
arXiv Detail & Related papers (2023-08-29T19:04:42Z)
- Learning Support and Trivial Prototypes for Interpretable Image Classification [19.00622056840535]
Prototypical part network (ProtoPNet) methods have been designed to achieve interpretable classification.
We aim to improve the classification of ProtoPNet with a new method to learn support prototypes that lie near the classification boundary in the feature space.
arXiv Detail & Related papers (2023-01-08T09:27:41Z)
- Automatically Discovering Novel Visual Categories with Self-supervised Prototype Learning [68.63910949916209]
This paper tackles the problem of novel category discovery (NCD), which aims to discriminate unknown categories in large-scale image collections.
We propose a novel adaptive prototype learning method consisting of two main stages: prototypical representation learning and prototypical self-training.
We conduct extensive experiments on four benchmark datasets and demonstrate the effectiveness and robustness of the proposed method with state-of-the-art performance.
arXiv Detail & Related papers (2022-08-01T16:34:33Z)
- Rethinking Semantic Segmentation: A Prototype View [126.59244185849838]
We present a nonparametric semantic segmentation model based on non-learnable prototypes.
Our framework yields compelling results over several datasets.
We expect this work will provoke a rethink of the current de facto semantic segmentation model design.
arXiv Detail & Related papers (2022-03-28T21:15:32Z)
- Dual Prototypical Contrastive Learning for Few-shot Semantic Segmentation [55.339405417090084]
We propose a dual prototypical contrastive learning approach tailored to the few-shot semantic segmentation (FSS) task.
The main idea is to make the prototypes more discriminative by increasing inter-class distance while reducing intra-class distance in the prototype feature space.
We demonstrate that the proposed dual contrastive learning approach outperforms state-of-the-art FSS methods on the PASCAL-$5^i$ and COCO-$20^i$ datasets.
arXiv Detail & Related papers (2021-11-09T08:14:50Z)
- Prototype-based interpretation of the functionality of neurons in winner-take-all neural networks [1.418033127602866]
Prototype-based learning (PbL) using a winner-take-all (WTA) network based on minimum Euclidean distance (ED-WTA) is an intuitive approach to multiclass classification.
We propose a novel training algorithm for the $\pm$ED-WTA network, which cleverly switches between updating the positive and negative prototypes.
We show that the proposed $\pm$ED-WTA method constructs highly interpretable prototypes that can be successfully used for detecting adversarial examples.
arXiv Detail & Related papers (2020-08-20T03:15:37Z)
- LFD-ProtoNet: Prototypical Network Based on Local Fisher Discriminant Analysis for Few-shot Learning [98.64231310584614]
The prototypical network (ProtoNet) is a few-shot learning framework that performs metric learning and classification using the distance to prototype representations of each class.
We show the usefulness of the proposed method by theoretically providing an expected risk bound and empirically demonstrating its superior classification accuracy on miniImageNet and tieredImageNet.
arXiv Detail & Related papers (2020-06-15T11:56:30Z)
- Prototypical Contrastive Learning of Unsupervised Representations [171.3046900127166]
Prototypical Contrastive Learning (PCL) is an unsupervised representation learning method.
PCL implicitly encodes semantic structures of the data into the learned embedding space.
PCL outperforms state-of-the-art instance-wise contrastive learning methods on multiple benchmarks.
arXiv Detail & Related papers (2020-05-11T09:53:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.