High-order structure preserving graph neural network for few-shot
learning
- URL: http://arxiv.org/abs/2005.14415v1
- Date: Fri, 29 May 2020 06:38:51 GMT
- Title: High-order structure preserving graph neural network for few-shot
learning
- Authors: Guangfeng Lin, Ying Yang, Yindi Fan, Xiaobing Kang, Kaiyang Liao, and
Fan Zhao
- Abstract summary: Few-shot learning finds latent structural information between prior knowledge and queried data through the similarity metric of meta-learning.
Most existing methods model the similarity relationships of samples within individual tasks and generalize the model to identify new categories.
The proposed high-order structure preserving graph neural network (HOSP-GNN) explores the rich structure of the samples to predict the labels of queried data on a graph.
- Score: 10.296473510866228
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Few-shot learning finds latent structural information between prior
knowledge and queried data through the similarity metric of meta-learning, and
uses it to construct a discriminative model for recognizing new categories from
only a few labeled samples. Most existing methods model the similarity
relationships of samples within individual tasks and generalize the model to
identify new categories. However, the relationships of samples across separate
tasks are difficult to capture because each task uses its own metric criterion.
In contrast, the proposed high-order structure preserving graph neural network
(HOSP-GNN) further explores the rich structure of the samples to predict the
labels of queried data on a graph. Its structure evolution explicitly
discriminates between categories by iteratively updating the high-order
structure relationship (a relative metric over multiple samples, instead of a
pairwise sample metric) under manifold structure constraints. HOSP-GNN not only
mines the high-order structure to complement the relevance between samples that
may be split across different tasks in meta-learning, but also derives the
structure-updating rule from the manifold constraint. Furthermore, HOSP-GNN
does not need to retrain the model to recognize new classes, and its
well-generalizable high-order structure supports model adaptability.
Experiments show that HOSP-GNN outperforms state-of-the-art methods on
supervised and semi-supervised few-shot learning across three benchmark
datasets: miniImageNet, tieredImageNet, and FC100.
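A minimal, hedged sketch may help make the core idea concrete: a relative
metric over sample triples (rather than raw pairwise distances) that drives
label propagation on the episode graph. The listing gives only the abstract,
so every name and design choice below (`high_order_affinity`,
`propagate_labels`, the exponential weighting, the smoothing coefficient
`alpha`) is an illustrative assumption, not the authors' HOSP-GNN
implementation.

```python
import torch
import torch.nn.functional as F

def high_order_affinity(x: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # Pairwise distances over all N samples in the episode, shape (N, N).
    d = torch.cdist(x, x)
    # Relative metric over triples: r[i, j, k] = d(i, j) / d(i, k), so the
    # distance between i and j is judged against every third sample k
    # rather than taken in isolation (the "high-order" idea).
    r = d.unsqueeze(2) / (d.unsqueeze(1) + eps)          # (N, N, N)
    # Aggregate over k to get one high-order edge weight per pair.
    w = torch.exp(-r.mean(dim=2))
    w.fill_diagonal_(0.0)
    return w / (w.sum(dim=1, keepdim=True) + eps)        # row-normalized

def propagate_labels(x, support_labels, n_classes, n_iters=10, alpha=0.5):
    # Smooth label mass over the high-order graph while clamping the known
    # support labels: a standard manifold-regularized propagation scheme,
    # standing in here for the abstract's "manifold structure constraints".
    n_support = support_labels.numel()
    y = torch.zeros(x.size(0), n_classes)
    y[:n_support] = F.one_hot(support_labels, n_classes).float()
    w = high_order_affinity(x)
    z = y.clone()
    for _ in range(n_iters):
        z = alpha * (w @ z) + (1.0 - alpha) * y
        z[:n_support] = y[:n_support]                    # keep support fixed
    return z[n_support:].argmax(dim=1)                   # query predictions
```

For a 5-way 1-shot episode with 15 queries, `x` would hold the stacked
embeddings of the 5 support and 15 query images, and `support_labels` a
LongTensor of length 5.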
Related papers
- Subgraph Clustering and Atom Learning for Improved Image Classification [4.499833362998488]
We present the Graph Sub-Graph Network (GSN), a novel hybrid image classification model merging the strengths of Convolutional Neural Networks (CNNs) for feature extraction and Graph Neural Networks (GNNs) for structural modeling.
GSN employs k-means clustering to group graph nodes into clusters, facilitating the creation of subgraphs.
These subgraphs are then utilized to learn representative atoms for dictionary learning, enabling the identification of sparse, class-distinguishable features.
arXiv Detail & Related papers (2024-07-20T06:32:00Z)
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift happens.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- Class-level Structural Relation Modelling and Smoothing for Visual Representation Learning [12.247343963572732]
This paper presents a framework termed Class-level Structural Relation Modelling and Smoothing for Visual Representation Learning (CSRMS).
It includes the Class-level Relation Modelling, Class-aware Graph-Guided Sampling, and Graph-Guided Representation Learning modules.
Experiments demonstrate the effectiveness of structured knowledge modelling for enhanced representation learning and show that CSRMS can be incorporated with any state-of-the-art visual representation learning models for performance gains.
arXiv Detail & Related papers (2023-08-08T09:03:46Z)
- Multi-Scale Semantics-Guided Neural Networks for Efficient Skeleton-Based Human Action Recognition [140.18376685167857]
A simple yet effective multi-scale semantics-guided neural network (MS-SGN) is proposed for skeleton-based action recognition.
MS-SGN achieves state-of-the-art performance on the NTU60, NTU120, and SYSU datasets.
arXiv Detail & Related papers (2021-11-07T03:50:50Z)
- Pareto-wise Ranking Classifier for Multi-objective Evolutionary Neural Architecture Search [15.454709248397208]
This study focuses on how to find feasible deep models under diverse design objectives.
We propose a classification-wise Pareto evolution approach for one-shot NAS, where an online classifier is trained to predict the dominance relationship between the candidate and constructed reference architectures.
We find a number of neural architectures with different model sizes ranging from 2M to 6M under diverse objectives and constraints.
arXiv Detail & Related papers (2021-09-14T13:28:07Z)
- Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z)
- Redefining Neural Architecture Search of Heterogeneous Multi-Network Models by Characterizing Variation Operators and Model Components [71.03032589756434]
We investigate the effect of different variation operators in a complex domain, that of multi-network heterogeneous neural models.
We characterize both the variation operators, according to their effect on the complexity and performance of the model, and the models themselves, using diverse metrics that estimate the quality of their different components.
arXiv Detail & Related papers (2021-06-16T17:12:26Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- ATRM: Attention-based Task-level Relation Module for GNN-based Few-shot Learning [14.464964336101028]
We propose a new relation measure method, namely the attention-based task-level relation module (ATRM).
The proposed module captures the relation representations between nodes by considering sample-to-task rather than sample-to-sample embedding features (a hedged sketch of this idea appears after this list).
Experimental results demonstrate that the proposed module is effective for GNN-based few-shot learning.
arXiv Detail & Related papers (2021-01-25T00:53:04Z)
- Evolutionary Architecture Search for Graph Neural Networks [23.691915813153496]
We propose a novel AutoML framework through the evolution of individual models in a large Graph Neural Networks (GNN) architecture space.
To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models.
arXiv Detail & Related papers (2020-09-21T22:11:53Z)
- ACDC: Weight Sharing in Atom-Coefficient Decomposed Convolution [57.635467829558664]
We introduce a structural regularization across convolutional kernels in a CNN.
We show that CNNs maintain performance with a dramatic reduction in parameters and computations.
arXiv Detail & Related papers (2020-09-04T20:41:47Z)
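As referenced in the ATRM entry above, the sample-to-task idea can be sketched
in a few lines. The summary does not give the module's actual architecture, so
the mean-pooled task embedding, the projections, and every name here
(`TaskLevelRelation`, `q`, `k`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TaskLevelRelation(nn.Module):
    # Each sample attends to one aggregate task embedding instead of to
    # every other sample (sample-to-task rather than sample-to-sample).
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)  # projects each sample embedding
        self.k = nn.Linear(dim, dim)  # projects the pooled task embedding

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) embeddings of all samples in one episode/task.
        task = x.mean(dim=0, keepdim=True)        # (1, dim) task summary
        scores = self.q(x) @ self.k(task).T       # (N, 1) sample-to-task
        attn = torch.softmax(scores / x.size(1) ** 0.5, dim=0)
        return attn * x + x                       # reweighted residual features
```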