Directed Graph Representation through Vector Cross Product
- URL: http://arxiv.org/abs/2010.10737v1
- Date: Wed, 21 Oct 2020 03:17:44 GMT
- Title: Directed Graph Representation through Vector Cross Product
- Authors: Ramanujam Madhavan, Mohit Wadhwa
- Abstract summary: Graph embedding methods embed the nodes in a graph in low dimensional vector space while preserving graph topology.
Recent work on directed graphs proposed to preserve the direction of edges among nodes by learning two embeddings, source and target, for every node.
We propose a novel approach that takes advantage of the non-commutative property of the vector cross product to learn embeddings that inherently preserve the direction of edges among nodes.
- Score: 2.398608007786179
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph embedding methods embed the nodes of a graph in a low-dimensional
vector space while preserving graph topology, in order to carry out downstream
tasks such as link prediction, node recommendation, and clustering. These tasks
depend on similarity measures such as cosine similarity and Euclidean distance
between a pair of embeddings; such measures are symmetric in nature and hence
do not hold for directed graphs. Recent work on directed graphs (HOPE, APP, and
NERD) proposed to preserve the direction of edges among nodes by learning two
embeddings, source and target, for every node. However, these methods do not
explicitly take the properties of directed edges into account. To capture the
directional relation among nodes, we propose a novel approach that takes
advantage of the non-commutative property of the vector cross product to learn
embeddings that inherently preserve the direction of edges among nodes. We
learn the node embeddings through a Siamese neural network in which the
cross-product operation is incorporated into the network architecture. Although
the cross product between a pair of vectors is defined only in three
dimensions, the approach is extended to learn N-dimensional embeddings while
maintaining the non-commutative property. In empirical experiments on three
real-world datasets, we observed that even very low-dimensional embeddings
could effectively preserve the directional property while outperforming some of
the state-of-the-art methods on link prediction and node recommendation tasks.
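The key property the abstract relies on can be illustrated with a minimal sketch: the cross product is anti-commutative (a × b = -(b × a)), so a score built on it distinguishes the edge (u → v) from (v → u). The scoring vector `w` below is a hypothetical stand-in for whatever the Siamese network would learn; the paper's exact scoring function is not given in the abstract.

```python
import numpy as np

def directed_score(u, v, w):
    # Anti-commutativity: np.cross(u, v) == -np.cross(v, u), so projecting the
    # cross product onto a learned vector w yields a direction-aware score.
    return float(np.dot(np.cross(u, v), w))

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([0.0, 0.0, 1.0])

print(directed_score(u, v, w))  # 1.0
print(directed_score(v, u, w))  # -1.0, the sign flips with edge direction
```

A symmetric measure such as cosine similarity would return the same value for both orderings, which is exactly the limitation the paper targets.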
Related papers
- Improving Graph Neural Networks by Learning Continuous Edge Directions [0.0]
Graph Neural Networks (GNNs) traditionally employ a message-passing mechanism that resembles diffusion over undirected graphs.
Our key insight is to assign fuzzy edge directions to the edges of a graph so that features can preferentially flow in one direction between nodes.
We propose a general framework, called Continuous Edge Direction (CoED) GNN, for learning on graphs with fuzzy edges.
arXiv Detail & Related papers (2024-10-18T01:34:35Z)
- Reliable Node Similarity Matrix Guided Contrastive Graph Clustering [51.23437296378319]
We introduce a new framework, Reliable Node Similarity Matrix Guided Contrastive Graph Clustering (NS4GC)
Our method introduces node-neighbor alignment and semantic-aware sparsification, ensuring the node similarity matrix is both accurate and appropriately sparse.
arXiv Detail & Related papers (2024-08-07T13:36:03Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graph Transformer GANs for Graph-Constrained House Generation [223.739067413952]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The GTGAN learns effective graph node relations in an end-to-end fashion for the challenging graph-constrained house generation task.
arXiv Detail & Related papers (2023-03-14T20:35:45Z)
- PA-GM: Position-Aware Learning of Embedding Networks for Deep Graph Matching [14.713628231555223]
We introduce a novel end-to-end neural network that can map the linear assignment problem into a high-dimensional space.
Our model constructs the anchor set for the relative position of nodes.
It then aggregates the feature information of the target node and each anchor node based on a measure of relative position.
arXiv Detail & Related papers (2023-01-05T06:54:21Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether the edges are kept directed or made undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Pseudo-Euclidean Attract-Repel Embeddings for Undirected Graphs [73.0261182389643]
Dot product embeddings take a graph and construct vectors for nodes such that dot products between two vectors give the strength of the edge.
We remove the transitivity assumption by embedding nodes into a pseudo-Euclidean space.
Pseudo-Euclidean embeddings can compress networks efficiently, allow for multiple notions of nearest neighbors each with their own interpretation, and can be 'slotted' into existing models.
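One common formulation of this attract-repel idea is a pseudo-Euclidean inner product that splits each node's coordinates into an "attract" part and a "repel" part; the names and dimensions below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def attract_repel_score(a_u, r_u, a_v, r_v):
    # Pseudo-Euclidean "dot product": attract coordinates pull the edge score
    # up, repel coordinates push it down, so similarity need not be transitive.
    return float(np.dot(a_u, a_v) - np.dot(r_u, r_v))

a1, r1 = np.array([1.0, 0.0]), np.array([1.0])   # node 1
a2, r2 = np.array([1.0, 0.0]), np.array([-1.0])  # node 2: same attract, opposite repel

print(attract_repel_score(a1, r1, a1, r1))  # 1 - 1 = 0.0
print(attract_repel_score(a1, r1, a2, r2))  # 1 - (-1) = 2.0
```

A plain dot product cannot produce such sign-dependent behavior, which is why removing the transitivity assumption requires leaving Euclidean space.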
arXiv Detail & Related papers (2021-06-17T17:23:56Z)
- node2coords: Graph Representation Learning with Wasserstein Barycenters [59.07120857271367]
We introduce node2coords, a representation learning algorithm for graphs.
It learns simultaneously a low-dimensional space and coordinates for the nodes in that space.
Experimental results demonstrate that the representations learned with node2coords are interpretable.
arXiv Detail & Related papers (2020-07-31T13:14:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.