End-To-End Graph-based Deep Semi-Supervised Learning
- URL: http://arxiv.org/abs/2002.09891v1
- Date: Sun, 23 Feb 2020 12:32:08 GMT
- Title: End-To-End Graph-based Deep Semi-Supervised Learning
- Authors: Zihao Wang, Enmei Tu, Zhou Meng
- Abstract summary: The quality of a graph is determined jointly by three key factors of the graph: its nodes, edges and similarity measure (or edge weights).
We propose a novel graph-based semi-supervised learning approach to optimize all three factors simultaneously in an end-to-end learning fashion.
- Score: 7.151859287072378
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The quality of a graph is determined jointly by three key factors
of the graph: its nodes, edges and similarity measure (or edge weights), and is
crucial to the success of graph-based semi-supervised learning (SSL)
approaches. Recently, dynamic graphs, in which some or all of these factors are
updated during training, have proven promising for graph-based
semi-supervised learning. However, existing approaches update only some of the
three factors and keep the rest manually specified during the learning stage.
In this paper, we propose a novel
graph-based semi-supervised learning approach to optimize all three factors
simultaneously in an end-to-end learning fashion. To this end, we concatenate
two neural networks (feature network and similarity network) together to learn
the categorical label and semantic similarity, respectively, and train the
networks to minimize a unified SSL objective function. We also introduce an
extended graph Laplacian regularization term to increase training efficiency.
Extensive experiments on several benchmark datasets demonstrate the
effectiveness of our approach.
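The abstract's extended graph Laplacian regularization term builds on the standard Laplacian smoothness penalty used in graph-based SSL objectives. Below is a minimal sketch of that standard penalty only, with a toy similarity matrix and toy label predictions as stand-ins; the paper's extended variant and full objective are not reproduced here.

```python
import numpy as np

def laplacian_regularizer(W, F):
    """Compute tr(F^T L F) with L = D - W, the unnormalized graph Laplacian.

    This equals 0.5 * sum_ij W[i, j] * ||F[i] - F[j]||^2, so it penalizes
    predictions that differ across strongly weighted edges.
    W: (n, n) symmetric similarity (edge-weight) matrix.
    F: (n, c) per-node label predictions, one row per node.
    """
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # unnormalized graph Laplacian
    return np.trace(F.T @ L @ F)

# Toy graph: nodes 0 and 1 strongly connected, node 2 weakly attached.
W = np.array([[0.0, 1.0, 0.1],
              [1.0, 0.0, 0.1],
              [0.1, 0.1, 0.0]])
F_smooth = np.array([[1.0, 0.0], [1.0, 0.0], [0.9, 0.1]])  # agrees on the strong edge
F_rough  = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]])  # conflicts on the strong edge
print(laplacian_regularizer(W, F_smooth), laplacian_regularizer(W, F_rough))
```

As expected for a smoothness penalty, the predictions that agree across the heavily weighted edge incur a much smaller regularization value.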
Related papers
- Universal Graph Continual Learning [22.010954622073598]
We focus on a universal approach wherein each data point in a task can be a node or a graph, and the task varies from node to graph classification.
We propose a novel method that enables graph neural networks to excel in this universal setting.
arXiv Detail & Related papers (2023-08-27T01:19:19Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM.
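The SimTeG summary describes turning an LM's last hidden states into one embedding per node. A common way to do this is mean-pooling over non-padding tokens; the sketch below uses random arrays as stand-ins for real LM outputs (the array names and shapes are assumptions, not the paper's API).

```python
import numpy as np

def pool_last_hidden(last_hidden, attention_mask):
    """Mean-pool last hidden states over non-padding tokens.

    last_hidden:    (num_nodes, seq_len, hidden_dim) token representations
    attention_mask: (num_nodes, seq_len), 1 for real tokens, 0 for padding
    returns:        (num_nodes, hidden_dim), one embedding per node
    """
    mask = attention_mask[..., None]           # broadcast over the hidden dim
    summed = (last_hidden * mask).sum(axis=1)  # sum vectors of real tokens
    counts = mask.sum(axis=1)                  # number of real tokens per node
    return summed / counts

# Fake "LM output": 2 nodes, 3 tokens each, hidden size 4; node 1 has 1 pad token.
rng = np.random.default_rng(0)
last_hidden = rng.normal(size=(2, 3, 4))
attention_mask = np.array([[1, 1, 1], [1, 1, 0]])
emb = pool_last_hidden(last_hidden, attention_mask)
print(emb.shape)  # (2, 4)
```

The resulting matrix can then be fed to any downstream GNN as node features.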
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- Towards Relation-centered Pooling and Convolution for Heterogeneous Graph Learning Networks [11.421162988355146]
Heterogeneous graph neural network has unleashed great potential on graph representation learning.
We design a relation-centered Pooling and Convolution for Heterogeneous Graph learning Network, namely PC-HGN, to enable relation-specific sampling and cross-relation convolutions.
We evaluate the performance of the proposed model by comparing with state-of-the-art graph learning models on three different real-world datasets.
arXiv Detail & Related papers (2022-10-31T08:43:32Z)
- Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z)
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
- Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode data that lies in a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
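GCC's self-supervised pre-training is contrastive: it pulls a query representation toward a positive view and pushes it away from negatives. Such objectives are commonly instantiated as an InfoNCE loss; the sketch below shows that loss on random vectors only and does not reproduce GCC's subgraph-instance discrimination setup.

```python
import numpy as np

def info_nce(query, keys, temperature=0.07):
    """InfoNCE loss for a single query.

    keys[0] is the positive key; the remaining rows are negatives.
    Returns -log softmax(query . keys / temperature)[0].
    """
    logits = keys @ query / temperature
    logits -= logits.max()  # shift for numerical stability (softmax-invariant)
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

rng = np.random.default_rng(1)
q = rng.normal(size=8); q /= np.linalg.norm(q)
negs = rng.normal(size=(4, 8))
negs /= np.linalg.norm(negs, axis=1, keepdims=True)

# Loss is low when the positive slot really matches the query...
good = info_nce(q, np.vstack([q[None], negs]))
# ...and high when a random vector sits in the positive slot instead.
bad = info_nce(q, np.vstack([negs[:1], q[None], negs[1:]]))
print(good, bad)
```

Minimizing this loss over many (query, positive, negatives) triples is what drives the pre-trained encoder toward transferable representations.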
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Machine Learning on Graphs: A Model and Comprehensive Taxonomy [22.73365477040205]
We bridge the gap between graph neural networks, network embedding and graph regularization models.
Specifically, we propose a Graph Encoder Decoder Model (GRAPHEDM), which generalizes popular algorithms for semi-supervised learning on graphs.
arXiv Detail & Related papers (2020-05-07T18:00:02Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.