A parameterised model for link prediction using node centrality and
similarity measure based on graph embedding
- URL: http://arxiv.org/abs/2309.05434v1
- Date: Mon, 11 Sep 2023 13:13:54 GMT
- Title: A parameterised model for link prediction using node centrality and
similarity measure based on graph embedding
- Authors: Haohui Lu and Shahadat Uddin
- Abstract summary: Link prediction is a key aspect of graph machine learning.
It involves predicting new links that may form between network nodes.
Existing models have significant shortcomings.
We present the Node Centrality and Similarity Based Parameterised Model (NCSM), a novel method for link prediction tasks.
- Score: 5.507008181141738
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Link prediction is a key aspect of graph machine learning, with applications
as diverse as disease prediction, social network recommendations, and drug
discovery. It involves predicting new links that may form between network
nodes. Despite the clear importance of link prediction, existing models have
significant shortcomings. Graph Convolutional Networks, for instance, have been
proven to be highly efficient for link prediction on a variety of datasets.
However, they encounter severe limitations when applied to short-path networks
and ego networks, resulting in poor performance. This presents a critical
problem space that this work aims to address. In this paper, we present the
Node Centrality and Similarity Based Parameterised Model (NCSM), a novel method
for link prediction tasks. NCSM uniquely integrates node centrality and
similarity measures as edge features in a customised Graph Neural Network (GNN)
layer, effectively leveraging the topological information of large networks.
This model represents the first parameterised GNN-based link prediction model
that considers topological information. The proposed model was evaluated on
five benchmark graph datasets, each comprising thousands of nodes and edges.
Experimental results highlight NCSM's superiority over existing
state-of-the-art models like Graph Convolutional Networks and Variational Graph
Autoencoder, as it outperforms them across various metrics and datasets. This
exceptional performance can be attributed to NCSM's innovative integration of
node centrality, similarity measures, and its efficient use of topological
information.
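The abstract's central idea is to attach node-centrality and node-similarity values to every edge and feed them into a customised GNN layer as edge features. The abstract does not say which measures NCSM parameterises, so the snippet below is only a minimal sketch under stated assumptions: degree centrality and Jaccard similarity stand in for the actual measures, and the helper name centrality_similarity_edge_features is invented for this illustration.

```python
# Minimal sketch (not the authors' code): derive per-edge features from node
# centrality and node-pair similarity, the two signals NCSM is described as
# combining, in a form that an edge-feature-aware GNN layer could consume.
import networkx as nx
import numpy as np

def centrality_similarity_edge_features(G: nx.Graph) -> dict:
    """Return {(u, v): feature_vector} for every edge of G.

    Feature layout (illustrative stand-ins, not the paper's exact measures):
      [deg_centrality(u), deg_centrality(v), jaccard(u, v), common_neighbours(u, v)]
    """
    deg_c = nx.degree_centrality(G)  # centrality per node
    jacc = {(u, v): p for u, v, p in nx.jaccard_coefficient(G, G.edges())}
    feats = {}
    for u, v in G.edges():
        cn = len(list(nx.common_neighbors(G, u, v)))  # shared-neighbour similarity
        feats[(u, v)] = np.array([deg_c[u], deg_c[v], jacc[(u, v)], cn], dtype=np.float32)
    return feats

if __name__ == "__main__":
    G = nx.karate_club_graph()
    edge_feats = centrality_similarity_edge_features(G)
    print(len(edge_feats), "edges;", next(iter(edge_feats.values())).shape[0], "features per edge")
```

In a full pipeline these vectors would be stacked into an edge-attribute tensor and passed to a message-passing layer that accepts edge features (for example, PyTorch Geometric layers with an edge_attr argument); how NCSM actually weights and parameterises the measures is specified in the paper itself.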
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Network Intrusion Detection with Edge-Directed Graph Multi-Head Attention Networks [13.446986347747325]
This paper proposes novel Edge-Directed Graph Multi-Head Attention Networks (EDGMAT) for network intrusion detection.
The proposed EDGMAT model introduces a multi-head attention mechanism into the intrusion detection model. Additional weight learning is realized through the combination of a multi-head attention mechanism and edge features.
arXiv Detail & Related papers (2023-10-26T12:30:11Z) - Disentangling Node Attributes from Graph Topology for Improved
Generalizability in Link Prediction [5.651457382936249]
Our proposed method, UPNA, solves the inductive link prediction problem by learning a function that takes a pair of node attributes and predicts the probability of an edge.
UPNA can be applied to various pairwise learning tasks and integrated with existing link prediction models to enhance their generalizability and bolster graph generative models; a minimal sketch of this pairwise-scoring idea appears after this list.
arXiv Detail & Related papers (2023-07-17T22:19:12Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised
Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model effectively improves performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - Mutually exciting point process graphs for modelling dynamic networks [0.0]
A new class of models for dynamic networks is proposed, called mutually exciting point process graphs (MEG).
MEG is a scalable network-wide statistical model for point processes with dyadic marks, which can be used for anomaly detection.
The model is tested on simulated graphs and real world computer network datasets, demonstrating excellent performance.
arXiv Detail & Related papers (2021-02-11T10:14:55Z) - GAIN: Graph Attention & Interaction Network for Inductive
Semi-Supervised Learning over Large-scale Graphs [18.23435958000212]
Graph Neural Networks (GNNs) have led to state-of-the-art performance on a variety of machine learning tasks such as recommendation, node classification and link prediction.
Most existing GNN models exploit a single type of aggregator to aggregate information from neighboring nodes.
We propose a novel graph neural network architecture, Graph Attention & Interaction Network (GAIN), for inductive learning on graphs.
arXiv Detail & Related papers (2020-11-03T00:20:24Z) - Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph
Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z) - Optimal Transport Graph Neural Networks [31.191844909335963]
Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation.
We introduce OT-GNN, a model that computes graph embeddings using parametric prototypes.
arXiv Detail & Related papers (2020-06-08T14:57:39Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
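As flagged in the UPNA entry above, the pairwise-attribute idea, scoring a candidate edge purely from the attributes of its two endpoints, is easy to illustrate. The sketch below is an assumption, not the UPNA architecture: a small symmetrised MLP (the class name PairwiseLinkPredictor and all dimensions are hypothetical) that maps two node-attribute vectors to an edge probability.

```python
# Minimal sketch of a pairwise link predictor: score an edge from the
# attributes of its two endpoints, independent of graph topology.
# Illustrative only; not the UPNA model from the related paper above.
import torch
import torch.nn as nn

class PairwiseLinkPredictor(nn.Module):
    def __init__(self, attr_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * attr_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x_u: torch.Tensor, x_v: torch.Tensor) -> torch.Tensor:
        # Average the two orderings so that score(u, v) == score(v, u).
        s_uv = self.mlp(torch.cat([x_u, x_v], dim=-1))
        s_vu = self.mlp(torch.cat([x_v, x_u], dim=-1))
        return torch.sigmoid(0.5 * (s_uv + s_vu)).squeeze(-1)

if __name__ == "__main__":
    model = PairwiseLinkPredictor(attr_dim=16)
    x_u, x_v = torch.randn(8, 16), torch.randn(8, 16)
    print(model(x_u, x_v).shape)  # torch.Size([8]) -> one probability per node pair
```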