Wasserstein Embedding for Graph Learning
- URL: http://arxiv.org/abs/2006.09430v2
- Date: Tue, 2 Mar 2021 02:21:28 GMT
- Title: Wasserstein Embedding for Graph Learning
- Authors: Soheil Kolouri, Navid Naderializadeh, Gustavo K. Rohde, Heiko Hoffmann
- Abstract summary: Wasserstein Embedding for Graph Learning (WEGL) is a framework for embedding entire graphs in a vector space.
We leverage new insights on defining similarity between graphs as a function of the similarity between their node embedding distributions.
We evaluate our new graph embedding approach on various benchmark graph-property prediction tasks.
- Score: 33.90471037116372
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast
framework for embedding entire graphs in a vector space, in which various
machine learning models are applicable for graph-level prediction tasks. We
leverage new insights on defining similarity between graphs as a function of
the similarity between their node embedding distributions. Specifically, we use
the Wasserstein distance to measure the dissimilarity between node embeddings
of different graphs. Unlike prior work, we avoid pairwise calculation of
distances between graphs and reduce the computational complexity from quadratic
to linear in the number of graphs. WEGL calculates Monge maps from a reference
distribution to each node embedding and, based on these maps, creates a
fixed-sized vector representation of the graph. We evaluate our new graph
embedding approach on various benchmark graph-property prediction tasks,
showing state-of-the-art classification performance while having superior
computational efficiency. The code is available at
https://github.com/navid-naderi/WEGL.
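The linear-complexity embedding described above can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); it assumes equal-size point clouds with uniform weights, in which case the optimal transport plan to the reference is a permutation (a Monge map) and can be found with scipy's assignment solver. All names below are illustrative.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def wegl_embed(node_embeddings, reference):
    """Embed a graph's node-embedding cloud relative to a fixed reference set.

    With equal-size clouds and uniform weights, the optimal transport plan
    is a permutation, found here by the Hungarian method. The embedding is
    the flattened displacement field from the reference to the matched points.
    """
    # Squared Euclidean cost between reference points and node embeddings
    cost = ((reference[:, None, :] - node_embeddings[None, :, :]) ** 2).sum(-1)
    _, col = linear_sum_assignment(cost)
    monge_image = node_embeddings[col]        # image of each reference point
    return (monge_image - reference).ravel()  # fixed-size vector per graph

rng = np.random.default_rng(0)
reference = rng.normal(size=(8, 2))          # shared reference distribution
graph_a = rng.normal(size=(8, 2))            # node embeddings of graph A
graph_b = rng.normal(loc=3.0, size=(8, 2))   # node embeddings of graph B

va, vb = wegl_embed(graph_a, reference), wegl_embed(graph_b, reference)
print(va.shape, vb.shape)  # both graphs land in the same 16-dim vector space
```

Because every graph is embedded against the same reference, only one transport problem per graph is solved, which is what brings the cost from quadratic to linear in the number of graphs. The actual method handles clouds of unequal size via a barycentric projection of the transport plan.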
Related papers
- Exploiting Edge Features in Graphs with Fused Network Gromov-Wasserstein Distance [18.522233517515975]
We introduce an extension of Gromov-Wasserstein distance for comparing graphs whose both nodes and edges have features.
We empirically show the effectiveness of the novel distance in learning tasks where graphs occur in either input space or output space.
arXiv Detail & Related papers (2023-09-28T17:05:03Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
arXiv Detail & Related papers (2022-05-30T13:20:26Z)
- Online Graph Dictionary Learning [10.394615068526505]
We propose a new online Graph Dictionary Learning approach, which uses the Gromov Wasserstein divergence for the data fitting term.
In our work, graphs are encoded through their nodes' pairwise relations and modeled as convex combinations of graph atoms.
Our approach naturally extends to labeled graphs, and is completed by a novel upper bound that can be used as a fast approximation of Gromov Wasserstein in the embedding space.
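The "convex combination of graph atoms" idea above can be sketched in a few lines of numpy. This is only a toy illustration of the representation, not the paper's Gromov-Wasserstein fitting procedure; the atoms and weights below are made up.

```python
import numpy as np

def reconstruct(atoms, weights):
    """Model a graph's pairwise-relation matrix as a convex combination of atoms.

    atoms:   array of shape (K, n, n), each an n x n pairwise-relation matrix.
    weights: length-K vector on the probability simplex.
    """
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    return np.tensordot(weights, atoms, axes=1)  # sum_k w_k * C_k

# Two toy atoms: a 3-node path and a 3-node triangle (adjacency matrices)
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
triangle = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
atoms = np.stack([path, triangle])

C = reconstruct(atoms, [0.25, 0.75])  # a graph "between" path and triangle
print(C)
```

In the paper, the weights for each observed graph are learned online so that the reconstruction is close to the graph under the Gromov-Wasserstein divergence; the weight vector then serves as the graph's embedding.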
arXiv Detail & Related papers (2021-02-12T14:39:28Z)
- COLOGNE: Coordinated Local Graph Neighborhood Sampling [1.6498361958317633]
Replacing discrete unordered objects such as graph nodes with real-valued vectors is at the heart of many approaches to learning from graph data.
We address the problem of learning discrete node embeddings such that the coordinates of the node vector representations are graph nodes.
This opens the door to designing interpretable machine learning algorithms for graphs as all attributes originally present in the nodes are preserved.
arXiv Detail & Related papers (2021-02-09T11:39:06Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
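The edge-to-node correspondence described above is easy to see in a pure-Python sketch of the line-graph construction (this is only the graph transformation, not the paper's neural model):

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph: one node per edge of the original graph;
    two line-graph nodes are adjacent iff their edges share an endpoint."""
    edges = [tuple(sorted(e)) for e in edges]
    lg_edges = [
        (e1, e2)
        for e1, e2 in combinations(edges, 2)
        if set(e1) & set(e2)  # edges share an endpoint
    ]
    return edges, lg_edges

# Original graph: a path 0-1-2 plus a candidate link (2, 3)
nodes_of_L, edges_of_L = line_graph([(0, 1), (1, 2), (2, 3)])
# Each line-graph node IS an edge of the original graph, so predicting
# the link (2, 3) becomes classifying the line-graph node (2, 3).
print(nodes_of_L)  # [(0, 1), (1, 2), (2, 3)]
print(edges_of_L)  # [((0, 1), (1, 2)), ((1, 2), (2, 3))]
```

Node-level classifiers (e.g. GNN message passing on the line graph) can then score each candidate edge directly, avoiding the per-link subgraph extraction of the graph-classification formulation.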
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Graph Partitioning and Graph Neural Network based Hierarchical Graph Matching for Graph Similarity Computation [5.710312846460821]
Graph similarity aims to predict a similarity score between one pair of graphs to facilitate downstream applications.
We propose a graph partitioning and graph neural network-based model, called PSimGNN, to address this task effectively.
PSimGNN outperforms state-of-the-art methods in graph similarity computation tasks using approximate Graph Edit Distance (GED) as the graph similarity metric.
arXiv Detail & Related papers (2020-05-16T15:01:58Z)
- Wasserstein-based Graph Alignment [56.84964475441094]
We cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.