Semi-Supervised Learning for Multi-Task Scene Understanding by Neural
Graph Consensus
- URL: http://arxiv.org/abs/2010.01086v2
- Date: Thu, 3 Dec 2020 15:31:41 GMT
- Title: Semi-Supervised Learning for Multi-Task Scene Understanding by Neural
Graph Consensus
- Authors: Marius Leordeanu, Mihai Pirvu, Dragos Costea, Alina Marcu, Emil
Slusanschi and Rahul Sukthankar
- Abstract summary: We address the problem of semi-supervised learning in the context of multiple visual interpretations of the world.
We show how prediction of different representations could be effectively learned through self-supervised consensus in our graph.
We also compare to state-of-the-art methods for multi-task and semi-supervised learning and show superior performance.
- Score: 23.528834793031894
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address the challenging problem of semi-supervised learning in the context
of multiple visual interpretations of the world by finding consensus in a graph
of neural networks. Each graph node is a scene interpretation layer, while each
edge is a deep net that transforms one layer at one node into another from a
different node. During the supervised phase edge networks are trained
independently. During the next unsupervised stage edge nets are trained on the
pseudo-ground truth provided by consensus among multiple paths that reach the
nets' start and end nodes. These paths act as ensemble teachers for any given
edge, and strong consensus is used as a high-confidence supervisory signal. The
unsupervised learning process is repeated over several generations, in which
each edge becomes a "student" and also part of different ensemble "teachers"
for training other students. By optimizing such consensus between different
paths, the graph reaches consistency and robustness over multiple
interpretations and generations, in the face of unknown labels. We give
theoretical justifications of the proposed idea and validate it on a large
dataset. We show how prediction of different representations such as depth,
semantic segmentation, surface normals and pose from RGB input could be
effectively learned through self-supervised consensus in our graph. We also
compare to state-of-the-art methods for multi-task and semi-supervised learning
and show superior performance.
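The path-consensus mechanism described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the use of the median as the consensus, the standard deviation as the disagreement measure, and the threshold value are all assumptions for exposition. Each "path" stands in for a chain of edge networks that reaches the same output node as the student edge being trained.

```python
from statistics import median, pstdev

def consensus_pseudo_label(path_predictions, agreement_thresh=0.1):
    """Combine predictions from multiple ensemble-teacher paths that reach
    the same output node.

    path_predictions: list of per-path prediction vectors (lists of floats).
    Returns (pseudo_label, mask): the per-dimension consensus value and a
    boolean mask marking dimensions where the paths agree strongly enough
    to serve as high-confidence pseudo-ground truth for a student edge.
    """
    pseudo, mask = [], []
    for values in zip(*path_predictions):      # group values per output dimension
        pseudo.append(median(values))          # robust consensus across paths
        mask.append(pstdev(values) < agreement_thresh)  # keep strong agreement only
    return pseudo, mask

# Toy example: 4 teacher paths predicting a 5-dimensional target layer.
paths = [
    [1.0, 2.0, 3.0, 4.0, 9.0],
    [1.1, 2.0, 3.1, 4.0, 1.0],
    [0.9, 2.1, 2.9, 4.1, 5.0],
    [1.0, 1.9, 3.0, 3.9, 0.0],
]
pseudo, mask = consensus_pseudo_label(paths, agreement_thresh=0.2)
# The first four dimensions agree closely and become pseudo-labels; the last
# dimension disagrees strongly and would be masked out of the student's loss.
```

Over several generations, each edge net would be retrained on such masked pseudo-labels while also serving inside the teacher ensembles of other edges, which is how the graph converges toward consistent interpretations.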
Related papers
- Contrastive Representation Learning Based on Multiple Node-centered Subgraphs [11.416941835869649]
A single node intuitively has multiple node-centered subgraphs from the whole graph.
We propose a multiple node-centered subgraphs contrastive representation learning method to learn node representation on graphs in a self-supervised way.
arXiv Detail & Related papers (2023-08-31T04:04:09Z)
- The Snowflake Hypothesis: Training Deep GNN with One Node One Receptive Field [39.679151680622375]
We introduce the Snowflake Hypothesis -- a novel paradigm underpinning the concept of "one node, one receptive field".
We employ the simplest gradient and node-level cosine distance as guiding principles to regulate the aggregation depth for each node.
The observational results demonstrate that our hypothesis can serve as a universal operator for a range of tasks.
arXiv Detail & Related papers (2023-08-19T15:21:12Z)
- Hierarchical Contrastive Learning Enhanced Heterogeneous Graph Neural Network [59.860534520941485]
Heterogeneous graph neural networks (HGNNs), an emerging technique, have shown superior capacity for dealing with heterogeneous information networks (HINs).
Recently, contrastive learning, a self-supervised method, has become one of the most exciting learning paradigms and shows great potential when no labels are available.
In this paper, we study the problem of self-supervised HGNNs and propose a novel co-contrastive learning mechanism for HGNNs, named HeCo.
arXiv Detail & Related papers (2023-04-24T16:17:21Z)
- Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph Representation Learning [48.09362183184101]
We propose a novel self-supervised approach to learn node representations by enhancing Siamese self-distillation with multi-scale contrastive learning.
Our method achieves new state-of-the-art results and surpasses some semi-supervised counterparts by large margins.
arXiv Detail & Related papers (2021-05-12T14:20:13Z)
- Graph Consistency based Mean-Teaching for Unsupervised Domain Adaptive Person Re-Identification [54.58165777717885]
This paper proposes a Graph Consistency based Mean-Teaching (GCMT) method that constructs a Graph Consistency Constraint (GCC) between teacher and student networks.
Experiments on three datasets, i.e., Market-1501, DukeMTMC-reID, and MSMT17, show that the proposed GCMT outperforms state-of-the-art methods by a clear margin.
arXiv Detail & Related papers (2021-05-11T04:09:49Z)
- Unsupervised Domain Adaptation through Iterative Consensus Shift in a Multi-Task Graph [22.308239339243272]
Babies learn with very little supervision by observing the surrounding world.
Our proposed multi-task graph, with consensus shift learning, relies only on pseudo-labels provided by expert models.
We validate our key contributions experimentally and demonstrate strong performance on the Replica dataset, superior to the very few published methods on multi-task learning with minimal supervision.
arXiv Detail & Related papers (2021-03-26T11:57:42Z)
- AttrE2vec: Unsupervised Attributed Edge Representation Learning [22.774159996012276]
This paper proposes a novel unsupervised inductive method called AttrE2Vec, which learns a low-dimensional vector representation for edges in attributed networks.
Experimental results show that, compared to contemporary approaches, our method builds more powerful edge vector representations.
arXiv Detail & Related papers (2020-12-29T12:20:49Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper graph neural networks to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Unsupervised Differentiable Multi-aspect Network Embedding [52.981277420394846]
We propose a novel end-to-end framework for multi-aspect network embedding, called asp2vec.
Our proposed framework can be readily extended to heterogeneous networks.
arXiv Detail & Related papers (2020-06-07T19:26:20Z)
- Self-Supervised Graph Representation Learning via Global Context Prediction [31.07584920486755]
This paper introduces a novel self-supervised strategy for graph representation learning by exploiting natural supervision provided by the data itself.
We randomly select pairs of nodes in a graph and train a well-designed neural net to predict the contextual position of one node relative to the other.
Our underlying hypothesis is that the representations learned from such within-graph context would capture the global topology of the graph and finely characterize the similarity and differentiation between nodes.
arXiv Detail & Related papers (2020-03-03T15:46:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.