Graph Structural-topic Neural Network
- URL: http://arxiv.org/abs/2006.14278v2
- Date: Sat, 4 Jul 2020 15:11:41 GMT
- Title: Graph Structural-topic Neural Network
- Authors: Qingqing Long, Yilun Jin, Guojie Song, Yi Li, Wei Lin
- Abstract summary: Graph Convolutional Networks (GCNs) have achieved tremendous success by effectively gathering local features for nodes.
In this paper, we propose Graph Structural-topic Neural Network, abbreviated GraphSTONE, a GCN model that utilizes topic models of graphs.
We design multi-view GCNs to unify node features and structural topic features and utilize structural topics to guide the aggregation.
- Score: 35.27112594356742
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs) have achieved tremendous success by
effectively gathering local features for nodes. However, GCNs commonly focus
more on node features and less on graph structures within the neighborhood,
especially higher-order structural patterns. Yet such local structural
patterns have been shown to be indicative of node properties in numerous
fields. In addition, it is not just single patterns but the distribution over
all these patterns that matters, because networks are complex and the
neighborhood of each node consists of a mixture of various nodes and
structural patterns.
Correspondingly, in this paper, we propose Graph Structural-topic Neural
Network, abbreviated GraphSTONE, a GCN model that utilizes topic models of
graphs, such that the structural topics capture indicative graph structures
broadly from a probabilistic aspect rather than merely a few structures.
Specifically, we build topic models upon graphs using anonymous walks and Graph
Anchor LDA, an LDA variant that selects significant structural patterns first,
so as to alleviate the complexity and generate structural topics efficiently.
In addition, we design multi-view GCNs to unify node features and structural
topic features and utilize structural topics to guide the aggregation. We
evaluate our model through both quantitative and qualitative experiments, where
our model exhibits promising performance, high efficiency, and clear
interpretability.
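The abstract describes building topic models on anonymous walks: each random walk is relabeled by the order in which nodes first appear, so walks with the same revisit pattern map to the same "structural word", regardless of which concrete nodes they visit. A minimal sketch of that encoding (the graph, walk length, and walk counts below are illustrative toys, not the paper's settings):

```python
# Sketch of the anonymous-walk "bag of structural words" that a
# graph topic model (e.g. Graph Anchor LDA in the paper) consumes.
import random
from collections import Counter

def anonymize(walk):
    """Map a walk to its anonymous form: each node becomes the index
    of its first occurrence, e.g. ('a','b','a','c') -> (0, 1, 0, 2)."""
    first_seen = {}
    out = []
    for v in walk:
        if v not in first_seen:
            first_seen[v] = len(first_seen)
        out.append(first_seen[v])
    return tuple(out)

def random_walk(adj, start, length, rng):
    """Uniform random walk of `length` nodes from `start`."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

def walk_distribution(adj, node, length=4, n_walks=200, seed=0):
    """Empirical distribution over anonymous walks starting at `node` --
    a per-node document of structural words for the topic model."""
    rng = random.Random(seed)
    counts = Counter(anonymize(random_walk(adj, node, length, rng))
                     for _ in range(n_walks))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# Toy graph: a triangle (0,1,2) with a pendant node 3.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(walk_distribution(adj, 0))
```

Because the encoding is node-agnostic, two nodes whose neighborhoods share the same structural patterns produce similar distributions even when their neighbor identities differ, which is what lets a topic model capture structure "from a probabilistic aspect" across the whole graph.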
Related papers
- Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
arXiv Detail & Related papers (2024-09-09T12:56:02Z)
- Harnessing Collective Structure Knowledge in Data Augmentation for Graph Neural Networks [25.12261412297796]
Graph neural networks (GNNs) have achieved state-of-the-art performance in graph representation learning.
We propose a novel approach, namely the collective structure knowledge-augmented graph neural network (CoS-GNN).
arXiv Detail & Related papers (2024-05-17T08:50:00Z)
- SE-GSL: A General and Effective Graph Structure Learning Framework through Structural Entropy Optimization [67.28453445927825]
Graph Neural Networks (GNNs) are de facto solutions to structural data learning.
Existing graph structure learning (GSL) frameworks still lack robustness and interpretability.
This paper proposes a general GSL framework, SE-GSL, through structural entropy and the graph hierarchy abstracted in the encoding tree.
arXiv Detail & Related papers (2023-03-17T05:20:24Z)
- Ego-based Entropy Measures for Structural Representations on Graphs [35.55543331773255]
VNEstruct is a simple approach, based on entropy measures of the neighborhood's topology, for generating low-dimensional structural representations.
VNEstruct can achieve state-of-the-art performance on graph classification, without incorporating the graph structure information in the optimization.
arXiv Detail & Related papers (2021-02-17T12:55:50Z)
- SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks [14.319159694115655]
We propose the Simultaneous Learning of Adjacency and GNN Parameters with Self-supervision, or SLAPS, a method that provides more supervision for inferring a graph structure through self-supervision.
A comprehensive experimental study demonstrates that SLAPS scales to large graphs with hundreds of thousands of nodes and outperforms several models that have been proposed to learn a task-specific graph structure on established benchmarks.
arXiv Detail & Related papers (2021-02-09T18:56:01Z)
- Directed Acyclic Graph Neural Networks [9.420935957200518]
We focus on a special, yet widely used, type of graphs -- DAGs -- and inject a stronger inductive bias -- partial ordering -- into the neural network design.
We propose the directed acyclic graph relational neural network, DAGNN, an architecture that processes information according to the flow defined by the partial order.
arXiv Detail & Related papers (2021-01-20T04:50:16Z)
- Hierarchical Graph Capsule Network [78.4325268572233]
We propose hierarchical graph capsule network (HGCN) that can jointly learn node embeddings and extract graph hierarchies.
To learn the hierarchical representation, HGCN characterizes the part-whole relationship between lower-level capsules (part) and higher-level capsules (whole)
arXiv Detail & Related papers (2020-12-16T04:13:26Z)
- Node Similarity Preserving Graph Convolutional Networks [51.520749924844054]
Graph Neural Networks (GNNs) explore the graph structure and node features by aggregating and transforming information within node neighborhoods.
We propose SimP-GCN that can effectively and efficiently preserve node similarity while exploiting graph structure.
We validate the effectiveness of SimP-GCN on seven benchmark datasets, including three assortative and four disassortative graphs.
arXiv Detail & Related papers (2020-11-19T04:18:01Z)
- AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose adaptive multi-channel graph convolutional networks for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN extracts the most correlated information from both node features and topological structures, substantially improving performance.
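AM-GCN's multi-channel idea, like GraphSTONE's multi-view design, combines a view that propagates over the topology with a view derived from node features. A forward-pass sketch of that two-channel pattern, not the authors' implementation: the kNN feature graph, random placeholder weights, and the simple average in place of the paper's learned attention fusion are all assumptions for illustration.

```python
import numpy as np

def normalize_adj(a):
    """Symmetric GCN normalization D^-1/2 (A + I) D^-1/2."""
    a = a + np.eye(len(a))
    d_inv = np.diag(a.sum(axis=1) ** -0.5)
    return d_inv @ a @ d_inv

def knn_graph(x, k=2):
    """Adjacency of a kNN graph built from cosine feature similarity."""
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)
    s = xn @ xn.T
    np.fill_diagonal(s, -np.inf)          # no self-neighbors
    a = np.zeros_like(s)
    for i in range(len(s)):
        a[i, np.argsort(s[i])[-k:]] = 1.0  # keep top-k similar nodes
    return np.maximum(a, a.T)              # symmetrize

def gcn_layer(a_norm, x, w):
    """One GCN layer: ReLU(A_hat X W)."""
    return np.maximum(a_norm @ x @ w, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))            # 5 nodes, 8 features
a_topo = np.array([[0, 1, 0, 0, 1],        # toy 5-node ring topology
                   [1, 0, 1, 0, 0],
                   [0, 1, 0, 1, 0],
                   [0, 0, 1, 0, 1],
                   [1, 0, 0, 1, 0]], dtype=float)
w = rng.standard_normal((8, 4))

h_topo = gcn_layer(normalize_adj(a_topo), x, w)      # topology channel
h_feat = gcn_layer(normalize_adj(knn_graph(x)), x, w)  # feature channel
h = 0.5 * (h_topo + h_feat)  # averaging stands in for learned attention
```

The point of the two channels is that when labels correlate with feature similarity rather than topology (or vice versa), one channel can still carry the signal; the fusion step decides how much to trust each view.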
arXiv Detail & Related papers (2020-07-05T08:16:03Z)
- Cross-GCN: Enhancing Graph Convolutional Network with $k$-Order Feature Interactions [153.6357310444093]
Graph Convolutional Network (GCN) is an emerging technique that performs learning and reasoning on graph data.
We argue that existing designs of GCN forgo modeling cross features, making GCN less effective for tasks or data where cross features are important.
We design a new operator named Cross-feature Graph Convolution, which explicitly models the arbitrary-order cross features with complexity linear to feature dimension and order size.
arXiv Detail & Related papers (2020-03-05T13:05:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.