Adversarial Directed Graph Embedding
- URL: http://arxiv.org/abs/2008.03667v3
- Date: Mon, 24 May 2021 17:22:27 GMT
- Title: Adversarial Directed Graph Embedding
- Authors: Shijie Zhu, Jianxin Li, Hao Peng, Senzhang Wang and Lifang He
- Abstract summary: We propose a novel Directed Graph embedding framework based on Generative Adversarial Network, called DGGAN.
The main idea is to use adversarial mechanisms to deploy a discriminator and two generators that jointly learn each node's source and target vectors.
Extensive experiments show that DGGAN consistently and significantly outperforms existing state-of-the-art methods across multiple graph mining tasks on directed graphs.
- Score: 43.69472660189029
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Node representation learning for directed graphs is critically important to
facilitate many graph mining tasks. To capture the directed edges between
nodes, existing methods mostly learn two embedding vectors for each node, a
source vector and a target vector. However, these methods learn the source and
target vectors separately. For a node with very low indegree or outdegree, the
corresponding target vector or source vector cannot be effectively learned.
In this paper, we propose a novel Directed Graph embedding framework based on
Generative Adversarial Network, called DGGAN. The main idea is to use
adversarial mechanisms to deploy a discriminator and two generators that
jointly learn each node's source and target vectors. For a given node, the two
generators are trained to generate its fake target and source neighbor nodes
from the same underlying distribution, and the discriminator aims to
distinguish whether a neighbor node is real or fake. The two generators are
formulated into a unified framework and could mutually reinforce each other to
learn more robust source and target vectors. Extensive experiments show that
DGGAN consistently and significantly outperforms existing state-of-the-art
methods across multiple graph mining tasks on directed graphs.
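To make the adversarial setup described in the abstract concrete, the following is a minimal, hedged sketch of the training loop it suggests: a discriminator scores (source vector, target vector) pairs, while two generators map a node's vector plus shared Gaussian noise to fake target and fake source neighbors that the discriminator must reject. This is not the authors' implementation; the toy graph, network sizes, losses, and optimizers are all assumptions.

```python
# Hedged sketch of a DGGAN-style setup: one discriminator over (source vector,
# target vector) pairs and two generators that fabricate fake target/source
# neighbors from shared Gaussian noise. Toy graph, sizes, losses, and
# optimizers are assumptions, not the authors' implementation.
import torch
import torch.nn as nn

num_nodes, dim = 100, 16
edges = torch.randint(0, num_nodes, (2, 500))      # toy directed edges (src, tgt)

src_emb = nn.Embedding(num_nodes, dim)             # each node's source vector
tgt_emb = nn.Embedding(num_nodes, dim)             # each node's target vector

class Generator(nn.Module):
    """Maps a node vector plus noise to a fake neighbor embedding."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, dim))
    def forward(self, node_vec):
        noise = torch.randn_like(node_vec)         # same underlying noise distribution
        return self.net(torch.cat([node_vec, noise], dim=-1))

gen_target, gen_source = Generator(), Generator()  # fake target / fake source neighbors
disc = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 1))
bce = nn.BCEWithLogitsLoss()
opt_d = torch.optim.Adam(list(disc.parameters()) + list(src_emb.parameters())
                         + list(tgt_emb.parameters()), lr=1e-3)
opt_g = torch.optim.Adam(list(gen_target.parameters()) + list(gen_source.parameters()), lr=1e-3)

for step in range(200):
    s, t = edges[0], edges[1]
    # Discriminator step: real edges vs. generated neighbors (generators frozen).
    real = disc(torch.cat([src_emb(s), tgt_emb(t)], dim=-1))
    fake_t = gen_target(src_emb(s).detach()).detach()
    fake_s = gen_source(tgt_emb(t).detach()).detach()
    d_fake_t = disc(torch.cat([src_emb(s), fake_t], dim=-1))
    d_fake_s = disc(torch.cat([fake_s, tgt_emb(t)], dim=-1))
    loss_d = (bce(real, torch.ones_like(real))
              + bce(d_fake_t, torch.zeros_like(d_fake_t))
              + bce(d_fake_s, torch.zeros_like(d_fake_s)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to make the discriminator accept the fake neighbors.
    g_t = disc(torch.cat([src_emb(s).detach(), gen_target(src_emb(s).detach())], dim=-1))
    g_s = disc(torch.cat([gen_source(tgt_emb(t).detach()), tgt_emb(t).detach()], dim=-1))
    loss_g = bce(g_t, torch.ones_like(g_t)) + bce(g_s, torch.ones_like(g_s))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

Because both generators are trained against the same discriminator and draw from the same noise distribution, a node with almost no in-edges (or out-edges) still receives gradient signal for its target (or source) vector through the generated negatives, which mirrors the abstract's motivation for learning the two vectors jointly.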
Related papers
- Learning on Graphs with Out-of-Distribution Nodes [33.141867473074264]
Graph Neural Networks (GNNs) are state-of-the-art models for performing prediction tasks on graphs.
This work defines the problem of graph learning with out-of-distribution nodes.
We propose Out-of-Distribution Graph Attention Network (OODGAT), a novel GNN model which explicitly models the interaction between different kinds of nodes.
arXiv Detail & Related papers (2023-08-13T08:10:23Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
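The summary above credits NodeFormer's efficiency to a kernelized Gumbel-Softmax operator. The snippet below illustrates only the Gumbel-Softmax half of that idea (differentiable sampling of per-node neighbor weights over all other nodes); it still materializes the full N x N score matrix and omits the kernelized approximation that makes NodeFormer scale. Dimensions and the temperature are assumptions.

```python
# Generic illustration of differentiable all-pair message passing with a
# Gumbel-Softmax relaxation over candidate neighbors. Unlike NodeFormer, this
# naive version materializes the full N x N score matrix (no kernelization).
import torch
import torch.nn.functional as F

N, d, tau = 50, 32, 0.5                      # nodes, feature dim, temperature (assumed)
x = torch.randn(N, d)                        # node features
Wq, Wk, Wv = (torch.nn.Linear(d, d) for _ in range(3))
q, k, v = Wq(x), Wk(x), Wv(x)

scores = q @ k.t() / d ** 0.5                # all-pair attention logits
gumbel = -torch.log(-torch.log(torch.rand_like(scores) + 1e-9) + 1e-9)
weights = F.softmax((scores + gumbel) / tau, dim=-1)   # relaxed, sampled neighbor weights
out = weights @ v                            # one round of all-pair message passing
```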
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- GraFN: Semi-Supervised Node Classification on Graph with Few Labels via Non-Parametric Distribution Assignment [5.879936787990759]
We propose GraFN, a novel semi-supervised method for graphs that ensures nodes belonging to the same class are grouped together.
GraFN randomly samples support nodes from labeled nodes and anchor nodes from the entire graph.
We experimentally show that GraFN surpasses both the semi-supervised and self-supervised methods in terms of node classification on real-world graphs.
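One way to read GraFN's non-parametric distribution assignment is as a soft nearest-neighbor step: an anchor node's class distribution is obtained from its similarity to the sampled labeled support nodes. The sketch below shows only that assignment step, with assumed shapes, temperature, and random embeddings; it is not the authors' implementation and omits the rest of the training objective.

```python
# Hedged sketch: soft nearest-neighbor class assignment for anchor nodes from
# sampled labeled support nodes (one reading of "non-parametric distribution
# assignment"). Shapes, temperature, and the random embeddings are assumptions.
import torch
import torch.nn.functional as F

num_classes, d, tau = 7, 64, 0.1
support_z = F.normalize(torch.randn(32, d), dim=-1)     # embeddings of sampled labeled support nodes
support_y = torch.randint(0, num_classes, (32,))        # their labels
anchor_z = F.normalize(torch.randn(128, d), dim=-1)     # embeddings of anchor nodes

sim = anchor_z @ support_z.t() / tau                    # cosine similarity, temperature-scaled
w = F.softmax(sim, dim=-1)                              # weights over support nodes
class_dist = w @ F.one_hot(support_y, num_classes).float()  # per-anchor class distribution
```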
arXiv Detail & Related papers (2022-04-04T08:22:30Z)
- MGAE: Masked Autoencoders for Self-Supervised Learning on Graphs [55.66953093401889]
We propose a masked graph autoencoder (MGAE) framework to perform effective learning on graph-structured data.
Taking insights from self-supervised learning, we randomly mask a large proportion of edges and try to reconstruct these missing edges during training.
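Since the recipe is stated directly above (mask a large fraction of edges, then reconstruct them during training), here is a minimal sketch of that idea, with a free embedding table standing in for the paper's GNN encoder and a dot-product decoder; the actual MGAE architecture and loss differ, and every hyperparameter here is an assumption.

```python
# Minimal sketch of the mask-then-reconstruct idea: hide a large fraction of
# edges and train embeddings to score the hidden edges above random non-edges.
# A free embedding table stands in for the paper's GNN encoder; in MGAE the
# encoder would run on the `visible` edges. Hyperparameters are assumptions.
import torch
import torch.nn as nn

num_nodes, d, mask_ratio = 200, 32, 0.7
edges = torch.randint(0, num_nodes, (2, 1000))          # toy edge list
perm = torch.randperm(edges.size(1))
n_mask = int(mask_ratio * edges.size(1))
masked, visible = edges[:, perm[:n_mask]], edges[:, perm[n_mask:]]

emb = nn.Embedding(num_nodes, d)
opt = torch.optim.Adam(emb.parameters(), lr=1e-2)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    pos = (emb(masked[0]) * emb(masked[1])).sum(-1)     # scores for held-out (masked) edges
    neg_dst = torch.randint(0, num_nodes, (n_mask,))    # random endpoints as non-edges
    neg = (emb(masked[0]) * emb(neg_dst)).sum(-1)
    loss = bce(pos, torch.ones_like(pos)) + bce(neg, torch.zeros_like(neg))
    opt.zero_grad(); loss.backward(); opt.step()
```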
arXiv Detail & Related papers (2022-01-07T16:48:07Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from an implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Evaluating Node Embeddings of Complex Networks [0.0]
A good embedding should capture the graph topology, node-to-node relationships, and other relevant information about the graph.
The main challenge is that one needs to make sure that embeddings describe the properties of the graphs well.
We do a series of experiments with selected graph embedding algorithms, both on real-world networks as well as artificially generated ones.
arXiv Detail & Related papers (2021-02-16T16:55:29Z)
- COLOGNE: Coordinated Local Graph Neighborhood Sampling [1.6498361958317633]
Replacing discrete unordered objects such as graph nodes with real-valued vectors is at the heart of many approaches to learning from graph data.
We address the problem of learning discrete node embeddings such that the coordinates of the node vector representations are graph nodes.
This opens the door to designing interpretable machine learning algorithms for graphs as all attributes originally present in the nodes are preserved.
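A standard way to obtain such coordinated, interpretable coordinates is min-wise sampling of local neighborhoods: each coordinate is the neighborhood member that minimizes a hash function shared by all nodes, so overlapping neighborhoods tend to agree coordinate-wise. The sketch below implements that generic scheme; whether it matches COLOGNE's exact sampler is an assumption.

```python
# Hedged sketch: coordinated (min-wise) sampling of 1-hop neighborhoods, one
# generic way to get discrete embeddings whose coordinates are node ids. A
# shared hash per coordinate makes overlapping neighborhoods agree; whether
# this matches COLOGNE's exact sampler is an assumption.
import random

def coordinated_embedding(adj, k=4, seed=0):
    """adj: dict node -> set of neighbor nodes. Returns dict node -> list of k node ids."""
    rng = random.Random(seed)
    salts = [rng.random() for _ in range(k)]        # one shared "hash function" per coordinate
    def h(node, salt):
        return hash((node, salt))
    emb = {}
    for u, nbrs in adj.items():
        local = nbrs | {u}                          # closed 1-hop neighborhood
        emb[u] = [min(local, key=lambda v: h(v, s)) for s in salts]
    return emb

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(coordinated_embedding(adj))                   # every coordinate is an actual node id
```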
arXiv Detail & Related papers (2021-02-09T11:39:06Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structure-related features for graph representation learning.
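The simplest instance of such a structure-related feature is the shortest-path distance from each node to a given target node set, which the sketch below computes by BFS and which can be appended to node features. The paper's richer encodings (e.g., random-walk based measures) are omitted, and the toy graph and function name are illustrative.

```python
# Sketch of the simplest distance-based feature: BFS shortest-path distance
# from every node to a target node set, usable as an extra node feature.
# The paper's richer encodings (e.g., random-walk based) are omitted; the
# toy graph and function name are illustrative.
from collections import deque

def spd_to_set(adj, targets):
    """adj: dict node -> list of neighbors; targets: set of target nodes."""
    dist = {u: float("inf") for u in adj}
    queue = deque(targets)
    for t in targets:
        dist[t] = 0
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(spd_to_set(adj, {0}))   # {0: 0, 1: 1, 2: 2, 3: 3}
```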
arXiv Detail & Related papers (2020-07-20T16:21:07Z)
- Integrating Network Embedding and Community Outlier Detection via Multiclass Graph Description [15.679313861083239]
We propose a novel unsupervised graph embedding approach (called DMGD) which integrates outlier and community detection with node embedding.
We show the theoretical bounds on the number of outliers detected by DMGD.
Our formulation boils down to an interesting minimax game between the outliers, community assignments and the node embedding function.
arXiv Detail & Related papers (2020-07-20T16:21:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.