Representative Graph Neural Network
- URL: http://arxiv.org/abs/2008.05202v1
- Date: Wed, 12 Aug 2020 09:46:52 GMT
- Title: Representative Graph Neural Network
- Authors: Changqian Yu, Yifan Liu, Changxin Gao, Chunhua Shen, Nong Sang
- Abstract summary: We present a Representative Graph layer to dynamically sample a few representative features.
Instead of propagating the messages from all positions, our RepGraph layer computes the response of one node merely with a few representative nodes.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The non-local operation has been widely explored to model long-range dependencies.
However, the redundant computation in this operation leads to a prohibitive
complexity. In this paper, we present a Representative Graph (RepGraph) layer
to dynamically sample a few representative features, which dramatically reduces
redundancy. Instead of propagating the messages from all positions, our
RepGraph layer computes the response of one node merely with a few
representative nodes. The locations of representative nodes come from a learned
spatial offset matrix. The RepGraph layer is flexible to integrate into many
visual architectures and combine with other operations. With the application of
semantic segmentation, without any bells and whistles, our RepGraph network can
compete with or perform favourably against state-of-the-art methods on three
challenging benchmarks: the ADE20K, Cityscapes, and PASCAL-Context datasets. In the
task of object detection, our RepGraph layer can also improve the performance
on the COCO dataset compared to the non-local operation. Code is available at
https://git.io/RepGraph.
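The abstract's core idea (compute each node's response from only K representative nodes, located via learned spatial offsets, instead of all N positions as in the non-local operation) can be sketched as follows. This is a minimal, illustrative NumPy sketch on a 1-D feature map, not the authors' implementation; the function name, the embedded-Gaussian (dot-product softmax) affinity, and the offset clamping are assumptions made for a self-contained example.

```python
import numpy as np

def repgraph_response(features, offsets):
    """Illustrative RepGraph-style aggregation on a 1-D feature map.

    features: (N, C) node features laid out along one spatial axis.
    offsets:  (N, K) integer offsets (learned, in the paper) giving the
              positions of each node's K representative nodes.
    Returns:  (N, C) responses computed from only K nodes each, instead
              of all N positions as in the non-local operation.
    """
    N, C = features.shape
    # Gather the K representative features per node (clamp offsets to range).
    idx = np.clip(np.arange(N)[:, None] + offsets, 0, N - 1)   # (N, K)
    reps = features[idx]                                        # (N, K, C)
    # Affinity between each node and its representatives (dot product),
    # normalised with a softmax over the K representatives.
    logits = np.einsum('nc,nkc->nk', features, reps)            # (N, K)
    logits -= logits.max(axis=1, keepdims=True)                 # stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)
    # Weighted sum over only K representatives: O(N*K*C) instead of
    # the non-local operation's O(N^2 * C).
    return np.einsum('nk,nkc->nc', attn, reps)

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))        # 16 positions, 8 channels
off = rng.integers(-3, 4, size=(16, 4)) # stand-in for a learned offset matrix
y = repgraph_response(x, off)
print(y.shape)  # (16, 8)
```

The sketch makes the complexity argument concrete: attention is computed over K = 4 gathered positions per node rather than all 16, which is what removes the quadratic cost of the dense non-local operation.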
Related papers
- HUGE: Huge Unsupervised Graph Embeddings with TPUs [6.108914274067702]
Graph embedding is the process of creating a continuous representation of the nodes in a graph.
A high-performance graph embedding architecture leveraging large amounts of high-bandwidth memory is presented.
We verify the embedding space quality on real and synthetic large-scale datasets.
arXiv Detail & Related papers (2023-07-26T20:29:15Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - Long Range Graph Benchmark [32.317725340138104]
MP-GNNs that rely on only 1-hop message passing often fare well on several existing graph benchmarks.
We benchmark both baseline GNNs and Graph Transformer networks to verify that the models which capture long-range dependencies perform significantly better on these tasks.
arXiv Detail & Related papers (2022-06-16T13:33:22Z) - Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model is able to capture a node's structural role information, and show excellent performance on node and graph classification tasks on unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z) - SoGCN: Second-Order Graph Convolutional Networks [20.840026487716404]
We show that multi-layer second-order graph convolution (SoGC) is sufficient to attain the ability of expressing spectral filters with arbitrary coefficients.
We build our Second-Order Graph Convolutional Networks (SoGCN) with SoGC and design a synthetic dataset to verify its filter fitting capability.
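A second-order graph convolution of the kind SoGCN builds on can be written as a degree-2 polynomial filter in the normalised adjacency matrix. The following is a hedged NumPy sketch under common conventions (self-loops plus symmetric normalisation, as in standard GCNs); the function name, weight matrices `W0`/`W1`/`W2`, and the random graph are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def second_order_gc(A, H, W0, W1, W2):
    """Sketch of a second-order graph convolution (SoGC-style):
    a degree-2 polynomial filter in the normalised adjacency A_hat.

    A:  (N, N) adjacency matrix,  H: (N, F) node features,
    Wk: (F, F') weight matrix for the k-th polynomial term.
    """
    A = A + np.eye(A.shape[0])            # add self-loops
    d = A.sum(axis=1)                     # degrees (>= 1, so no div-by-zero)
    A_hat = A / np.sqrt(np.outer(d, d))   # symmetric normalisation D^-1/2 A D^-1/2
    # Degree-2 polynomial filter: H W0 + A_hat H W1 + A_hat^2 H W2
    return H @ W0 + A_hat @ H @ W1 + A_hat @ (A_hat @ H) @ W2

rng = np.random.default_rng(1)
A = (rng.random((6, 6)) < 0.4).astype(float)
A = np.maximum(A, A.T)                    # make undirected
np.fill_diagonal(A, 0)
H = rng.standard_normal((6, 5))
W0, W1, W2 = (rng.standard_normal((5, 3)) for _ in range(3))
out = second_order_gc(A, H, W0, W1, W2)
print(out.shape)  # (6, 3)
```

The claim summarised above is that stacking layers of this degree-2 form is already sufficient to express spectral filters with arbitrary polynomial coefficients, whereas a first-order layer (the `W0`/`W1` terms alone) is not.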
arXiv Detail & Related papers (2021-10-14T03:56:34Z) - Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z) - Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of using a fixed path through the network, DG-Net aggregates features dynamically at each node, giving the network greater representational ability.
arXiv Detail & Related papers (2020-10-02T16:50:26Z) - Sequential Graph Convolutional Network for Active Learning [53.99104862192055]
We propose a novel pool-based Active Learning framework constructed on a sequential Graph Convolution Network (GCN).
With a small number of randomly sampled images as seed labelled examples, we learn the parameters of the graph to distinguish labelled vs unlabelled nodes.
We exploit these characteristics of GCN to select the unlabelled examples which are sufficiently different from labelled ones.
arXiv Detail & Related papers (2020-06-18T00:55:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.