Graph Tree Neural Networks
- URL: http://arxiv.org/abs/2111.00424v1
- Date: Sun, 31 Oct 2021 07:58:00 GMT
- Title: Graph Tree Neural Networks
- Authors: Seokjun Kim, Jaeeun Jang, Hee-seok Jung, Hyeoncheol Kim
- Abstract summary: Graph tree neural networks (GTNNs) are designed to solve the problems of existing networks by analyzing the structure of human neural networks.
In GTNNs, information units are related to the form of a graph and then they become a bigger unit of information again and have a relationship with other information units.
- Score: 0.43012765978447565
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have recently shown good performance in various
fields. In this paper, we propose graph tree neural networks (GTNNs) designed
to solve the problems of existing networks by analyzing the structure of human
neural networks. In GTNNs, information units are related to the form of a graph
and then they become a bigger unit of information again and have a relationship
with other information units. At this point, the unit of information is a set
of neurons, and we can express it as a vector with GTNN. Defining the starting
and ending points in a single graph is difficult, and a tree cannot express the
relationships among sibling nodes. A graph tree, however, can use leaf and root
nodes as its starting and ending points while also expressing the relationships
among sibling nodes. Depth-first convolution (DFC) encodes the interaction
result from leaf nodes to the root node in a bottom-up approach, and
depth-first deconvolution (DFD) decodes the interaction result from the root
node to the leaf nodes in a top-down approach. GTNN performs data-driven
learning in which the number of convolutions varies with the depth of the tree.
Moreover, learning features of different types together is possible.
Supervised, unsupervised, and semi-supervised learning using graph tree
recursive neural network (GTR), graph tree recursive attention networks
(GTRAs), and graph tree recursive autoencoders (GTRAEs) are introduced in this
paper. We experimented with a simple toy test on a source code dataset.
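The bottom-up DFC pass described in the abstract can be sketched as a recursion over a graph tree: leaves return their own encodings, and each parent combines its children's results before passing the code upward. This is a minimal illustrative sketch only; the mean aggregation, shared weight `W`, and `tanh` nonlinearity are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4
W = rng.standard_normal((DIM, DIM)) * 0.1  # shared weight matrix (hypothetical)

class GraphTreeNode:
    """A node holding a feature vector and a list of child nodes."""
    def __init__(self, feature, children=None):
        self.feature = np.asarray(feature, dtype=float)
        self.children = children or []

def dfc(node):
    """Depth-first convolution: encode leaves first, then combine upward.

    The number of convolutions applied automatically matches the tree depth,
    since each internal node triggers exactly one combine step.
    """
    if not node.children:
        return node.feature
    child_codes = np.stack([dfc(c) for c in node.children])
    combined = node.feature + child_codes.mean(axis=0)  # illustrative aggregation
    return np.tanh(W @ combined)

# A tiny tree: a root with two leaf children.
root = GraphTreeNode(np.ones(DIM),
                     [GraphTreeNode(np.zeros(DIM)), GraphTreeNode(np.ones(DIM))])
code = dfc(root)
print(code.shape)  # (4,)
```

A depth-first deconvolution (DFD) would mirror this recursion top-down, distributing the root code back toward the leaves.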
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
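For context, the plain (non-kernelized) Gumbel-Softmax relaxation underlying such differentiable edge sampling can be sketched as follows; the logits and temperature are hypothetical, and NodeFormer's linear-time kernelized variant is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Differentiable approximate sample from a categorical distribution.

    Adds Gumbel noise to the logits and applies a temperature-scaled softmax;
    as tau -> 0 the output approaches a one-hot sample.
    """
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max()          # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Hypothetical edge scores from one query node to five candidate neighbors.
logits = np.array([2.0, 0.5, -1.0, 0.1, 1.2])
weights = gumbel_softmax(logits)
print(weights.shape)  # (5,)
```

The resulting weights form a soft, differentiable selection over candidate edges.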
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- GTNet: A Tree-Based Deep Graph Learning Architecture [8.50892442127182]
We propose a deep graph learning architecture with a new general message passing scheme that originates from the tree representation of graphs.
Two graph representation learning models are proposed within this GTNet architecture: the Graph Tree Attention Network (GTAN) and the Graph Tree Convolution Network (GTCN).
arXiv Detail & Related papers (2022-04-27T09:43:14Z)
- Deformable Graph Convolutional Networks [12.857403315970231]
Graph neural networks (GNNs) have significantly improved representation power for graph-structured data.
In this paper, we propose Deformable Graph Convolutional Networks (Deformable GCNs) that adaptively perform convolution in multiple latent spaces.
Our framework simultaneously learns the node positional embeddings to determine the relations between nodes in an end-to-end fashion.
arXiv Detail & Related papers (2021-12-29T07:55:29Z)
- Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
- Simplicial Convolutional Neural Networks [36.078200422283835]
Recently, signal processing and neural networks have been extended to process and learn from data on graphs.
We propose a simplicial convolutional neural network (SCNN) architecture to learn from data defined on simplices.
arXiv Detail & Related papers (2021-10-06T08:52:55Z)
- Reasoning Graph Networks for Kinship Verification: from Star-shaped to Hierarchical [85.0376670244522]
We investigate the problem of facial kinship verification by learning hierarchical reasoning graph networks.
We develop a Star-shaped Reasoning Graph Network (S-RGN) and a Hierarchical Reasoning Graph Network (H-RGN) to exploit more powerful and flexible reasoning capacity.
arXiv Detail & Related papers (2021-09-06T03:16:56Z)
- Neural Trees for Learning on Graphs [19.05038106825347]
Graph Neural Networks (GNNs) have emerged as a flexible and powerful approach for learning over graphs.
We propose a new GNN architecture -- the Neural Tree.
We show that the neural tree architecture can approximate any smooth probability distribution function over an undirected graph.
arXiv Detail & Related papers (2021-05-15T17:08:20Z)
- TreeRNN: Topology-Preserving Deep Graph Embedding and Learning [24.04035265351755]
We study the methods to transfer the graphs into trees so that explicit orders are learned to direct the feature integration from local to global.
To best learn the patterns from the graph-tree-images, we propose TreeRNN, a 2D RNN architecture that recurrently integrates the image pixels by rows and columns to help classify the graph categories.
arXiv Detail & Related papers (2020-06-21T15:22:24Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
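The edge-varying operation described above can be sketched minimally: each directed edge carries its own weight, so a node weighs each neighbor differently. The scalar-per-edge parameterization, single filter tap, and ReLU nonlinearity below are simplifying assumptions for illustration, not the EdgeNet paper's full formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
N, F = 4, 3
X = rng.standard_normal((N, F))            # node feature matrix
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)  # adjacency matrix (hypothetical graph)

# Edge-varying weights: one free scalar per edge, zero off the graph support.
Phi = rng.standard_normal((N, N)) * A

def edge_varying_layer(X, Phi):
    """One graph shift with per-edge weights, then a pointwise nonlinearity.

    A shared-weight GCN would use a single scalar per shift instead of Phi;
    letting every edge carry its own parameter is what distinguishes the
    edge-varying family.
    """
    return np.maximum(Phi @ X, 0.0)

Y = edge_varying_layer(X, Phi)
print(Y.shape)  # (4, 3)
```

Attention-style networks can be read as a special case where the per-edge weights are computed from node features rather than learned directly.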
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.