Edgeless-GNN: Unsupervised Inductive Edgeless Network Embedding
- URL: http://arxiv.org/abs/2104.05225v1
- Date: Mon, 12 Apr 2021 06:37:31 GMT
- Title: Edgeless-GNN: Unsupervised Inductive Edgeless Network Embedding
- Authors: Yong-Min Shin, Cong Tran, Won-Yong Shin, Xin Cao
- Abstract summary: We study the problem of embedding edgeless nodes such as users who newly enter the underlying network.
We propose Edgeless-GNN, a new framework that enables GNNs to generate node embeddings even for edgeless nodes through unsupervised inductive learning.
- Score: 7.391641422048645
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the problem of embedding edgeless nodes, such as users who newly
enter the underlying network, using graph neural networks (GNNs), which have been widely
studied for effective representation learning of graphs thanks to their highly
expressive capability via message passing. Our study is motivated by the fact
that GNNs cannot be directly adopted for our problem, since message passing to such
edgeless nodes, which have no connections, is impossible. To tackle this challenge,
we propose Edgeless-GNN, a new framework that enables GNNs to generate node
embeddings even for edgeless nodes through unsupervised inductive learning.
Specifically, we utilize a $k$-nearest neighbor graph ($k$NNG) based on the
similarity of node attributes to replace the GNN's computation graph defined by
the neighborhood-based aggregation of each node. The known network structure is
used to train model parameters, whereas a loss function is established in such
a way that our model learns the network structure. For the edgeless nodes, we
inductively infer embeddings by using the edges obtained via $k$NNG construction as the
computation graph. By evaluating the performance on various downstream machine
learning (ML) tasks, we empirically demonstrate that Edgeless-GNN consistently
outperforms state-of-the-art methods of inductive network embedding. Our
framework is GNN-model-agnostic; thus, GNN models can be appropriately chosen
according to one's needs and ML tasks.
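The core trick above — replacing the GNN's neighborhood-based computation graph with a $k$-nearest-neighbor graph built from node attributes — can be sketched as follows. This is a minimal illustration, assuming Euclidean distance as the attribute-similarity measure; the paper's actual similarity metric, choice of $k$, and GNN backbone may differ, and `knn_graph` is a hypothetical helper, not the authors' code.

```python
import numpy as np

def knn_graph(X, k=2):
    """Build a directed k-nearest-neighbor graph from node attributes.

    Each node is connected to the k nodes with the most similar
    attribute vectors (here, by Euclidean distance). The resulting
    adjacency matrix can serve as a GNN's computation graph in place
    of observed edges, so even edgeless nodes get neighbors.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between attribute vectors.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-loops
    # Indices of the k attribute-wise closest nodes for each node.
    nbrs = np.argsort(d2, axis=1)[:, :k]
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, nbrs[i]] = 1
    return A

# Toy attributes: two clusters; the last row plays the role of an
# "edgeless" new node, which still receives neighbors purely from
# attribute similarity.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0], [0.05, 0.1]])
A = knn_graph(X, k=2)
```

Because every row of `A` has exactly $k$ nonzero entries regardless of the observed edge set, message passing is always well defined, which is precisely what makes inductive inference for newly arriving nodes possible.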
Related papers
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Graph Coordinates and Conventional Neural Networks -- An Alternative for Graph Neural Networks [0.10923877073891444]
We propose Topology Coordinate Neural Network (TCNN) and Directional Virtual Coordinate Neural Network (DVCNN) as novel alternatives to message passing GNNs.
TCNN and DVCNN achieve competitive or superior performance to message passing GNNs.
Our work expands the toolbox of techniques for graph-based machine learning.
arXiv Detail & Related papers (2023-12-03T10:14:10Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks--CONN, a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Superiority of GNN over NN in generalizing bandlimited functions [6.3151583550712065]
Graph Neural Networks (GNNs) have emerged as formidable resources for processing graph-based information across diverse applications.
In this study, we investigate the proficiency of GNNs at this task, which can also be cast as a function approximation problem.
Our findings highlight a pronounced efficiency in utilizing GNNs to generalize a bandlimited function within an $\varepsilon$-error margin.
arXiv Detail & Related papers (2022-06-13T05:15:12Z)
- AdaGNN: A multi-modal latent representation meta-learner for GNNs based on AdaBoosting [0.38073142980733]
Graph Neural Networks (GNNs) focus on extracting intrinsic network features.
We propose a boosting-based meta-learner for GNNs.
AdaGNN performs exceptionally well for applications with rich and diverse node neighborhood information.
arXiv Detail & Related papers (2021-08-14T03:07:26Z)
- Learnt Sparsification for Interpretable Graph Neural Networks [5.527927312898106]
We propose a novel method called Kedge for explicitly sparsifying the underlying graph by removing unnecessary neighbors.
Kedge learns edge masks in a modular fashion, trained with any GNN, allowing for gradient-based optimization.
We show that Kedge effectively counters the over-smoothing phenomena in deep GNNs by maintaining good task performance with increasing GNN layers.
arXiv Detail & Related papers (2021-06-23T16:04:25Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
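The EdgeNet idea of per-edge parameters can be illustrated with a single layer in which each node weighs each neighbor by its own scalar. This is a deliberately simplified sketch: the actual EdgeNet formulation uses banks of edge-varying graph filters with higher-order shifts, and `edge_varying_layer` is a hypothetical name introduced here for illustration.

```python
import numpy as np

def edge_varying_layer(A, Phi, X):
    """One edge-varying aggregation step: node i weighs neighbor j
    with its own parameter Phi[i, j], masked by the edge structure.

    With Phi = c * A this reduces to a plain graph convolution; with
    Phi learned freely per edge it subsumes attention-style weights.
    """
    # Keep only parameters on existing edges, then aggregate
    # neighbor features with edge-specific weights.
    W = Phi * A
    return W @ X

# Tiny 3-node path graph: 0 - 1 - 2, with one feature per node.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
Phi = np.array([[0.0, 2.0, 0.0],
                [0.5, 0.0, 1.5],
                [0.0, 1.0, 0.0]])
X = np.array([[1.0], [2.0], [3.0]])
Y = edge_varying_layer(A, Phi, X)
```

Note that `Phi[1, 0]` and `Phi[0, 1]` need not be equal: the two endpoints of an edge may weigh each other differently, which is the extra flexibility the EdgeNet framework exposes over a shared filter coefficient.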
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.