Ensemble Multi-Relational Graph Neural Networks
- URL: http://arxiv.org/abs/2205.12076v1
- Date: Tue, 24 May 2022 13:52:41 GMT
- Title: Ensemble Multi-Relational Graph Neural Networks
- Authors: Yuling Wang, Hao Xu, Yanhua Yu, Mengdi Zhang, Zhenhao Li, Yuji Yang
and Wei Wu
- Abstract summary: We propose a novel ensemble multi-relational GNN by designing an ensemble multi-relational (EMR) optimization objective.
This EMR optimization objective is able to derive an iterative updating rule, which can be formalized as an ensemble message passing layer with multi-relations.
Extensive experiments on four benchmark datasets demonstrate the effectiveness of the proposed model.
- Score: 18.96097003317416
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: It is well established that graph neural networks (GNNs) can be interpreted
and designed from the perspective of an optimization objective. With a clear
optimization objective, the derived GNN architecture has a sound theoretical
foundation and can flexibly remedy weaknesses of GNNs. However,
this optimization objective has only been proved for GNNs on single-relational
graphs. Can we infer a new type of GNN for multi-relational graphs by extending
this optimization objective, so as to simultaneously resolve the issues of
previous multi-relational GNNs, e.g., over-parameterization? In this paper, we
propose a novel ensemble multi-relational GNN by designing an ensemble
multi-relational (EMR) optimization objective. This EMR optimization objective
derives an iterative updating rule, which can be formalized as an
ensemble message passing (EnMP) layer over multiple relations. We further analyze
the desirable properties of the EnMP layer, e.g., its relationship with
multi-relational personalized PageRank. Finally, a new multi-relational GNN that
alleviates the over-smoothing and over-parameterization issues is proposed.
Extensive experiments on four benchmark datasets demonstrate the
effectiveness of the proposed model.
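The abstract does not reproduce the iterative updating rule itself. As an illustrative sketch only, an EnMP-style layer can be read as a multi-relational generalization of personalized-PageRank propagation: each relation's normalized adjacency propagates the features, the results are combined with ensemble weights, and a teleport term pulls back toward the initial features. The function names, the fixed weights, and the toy graph below are hypothetical, not the paper's exact formulation:

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def enmp_step(H, H0, adjs, weights, alpha=0.1):
    # One hypothetical ensemble message passing step:
    # H <- (1 - alpha) * sum_r mu_r * A_r_norm @ H + alpha * H0,
    # i.e. weighted propagation over each relation plus a
    # personalized-PageRank-style teleport back to the input features H0.
    agg = sum(w * (A @ H) for w, A in zip(weights, adjs))
    return (1 - alpha) * agg + alpha * H0

# Toy multi-relational graph: 4 nodes, 2 relations.
A1 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
A2 = np.array([[0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, 0]], float)
adjs = [normalize_adj(A1), normalize_adj(A2)]
weights = [0.6, 0.4]  # illustrative ensemble coefficients (sum to 1)

rng = np.random.default_rng(0)
H0 = rng.standard_normal((4, 3))  # initial node features (e.g. an MLP output)
H = H0.copy()
for _ in range(10):               # iterate the update rule
    H = enmp_step(H, H0, adjs, weights, alpha=0.1)
print(H.shape)  # (4, 3)
```

In the actual paper the relation weights are derived from the EMR objective rather than fixed by hand; the teleport coefficient `alpha` plays the same smoothing-control role as in APPNP, which is what connects the layer to personalized PageRank.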
Related papers
- Dynamically configured physics-informed neural network in topology
optimization applications [4.403140515138818]
The physics-informed neural network (PINN) can avoid generating enormous amounts of data when solving forward problems.
A dynamically configured PINN-based topology optimization (DCPINN-TO) method is proposed.
The accuracy of the displacement prediction and optimization results indicate that the DCPINN-TO method is effective and efficient.
arXiv Detail & Related papers (2023-12-12T05:35:30Z) - GNN at the Edge: Cost-Efficient Graph Neural Network Processing over
Distributed Edge Servers [24.109721494781592]
Graph Neural Network (GNN) processing at the edge is still under-explored, in stark contrast to the broad adoption of edge computing.
This paper studies the cost optimization for distributed GNN processing over a multi-tier heterogeneous edge network.
We show that our approach achieves superior performance over de facto baselines, with more than 95.8% cost reduction and fast convergence.
arXiv Detail & Related papers (2022-10-31T13:03:16Z) - MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural
Networks [68.9026534589483]
RioGNN is a novel Reinforced, recursive and flexible neighborhood selection guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embedding with enhanced explainability due to the recognition of individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z) - Interpreting and Unifying Graph Neural Networks with An Optimization
Framework [47.44773358082203]
Graph Neural Networks (GNNs) have received considerable attention on graph-structured data learning.
In this paper, we establish a surprising connection between different propagation mechanisms with a unified optimization problem.
Our proposed unified optimization framework, summarizing the commonalities between several of the most representative GNNs, opens up new opportunities for flexibly designing new GNNs.
arXiv Detail & Related papers (2021-01-28T08:06:02Z) - Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that models the sampling procedure and message passing of GNNs into a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z) - Bayesian Graph Neural Networks with Adaptive Connection Sampling [62.51689735630133]
We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs).
The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs.
arXiv Detail & Related papers (2020-06-07T07:06:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.