A Gromov--Wasserstein Geometric View of Spectrum-Preserving Graph
Coarsening
- URL: http://arxiv.org/abs/2306.08854v1
- Date: Thu, 15 Jun 2023 04:47:26 GMT
- Title: A Gromov--Wasserstein Geometric View of Spectrum-Preserving Graph
Coarsening
- Authors: Yifan Chen, Rentian Yao, Yun Yang, Jie Chen
- Abstract summary: This work studies graph coarsening from a different perspective, developing a theory for preserving graph distances.
The geometric approach is useful when working with a collection of graphs, such as in graph classification and regression.
Minimizing this difference can be done using the popular weighted kernel $K$-means method.
- Score: 19.09507367362567
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph coarsening is a technique for solving large-scale graph problems by
working on a smaller version of the original graph, and possibly interpolating
the results back to the original graph. It has a long history in scientific
computing and has recently gained popularity in machine learning, particularly
in methods that preserve the graph spectrum. This work studies graph coarsening
from a different perspective, developing a theory for preserving graph
distances and proposing a method to achieve this. The geometric approach is
useful when working with a collection of graphs, such as in graph
classification and regression. In this study, we consider a graph as an element
on a metric space equipped with the Gromov--Wasserstein (GW) distance, and
bound the difference between the distance of two graphs and their coarsened
versions. Minimizing this difference can be done using the popular weighted
kernel $K$-means method, which improves existing spectrum-preserving methods
with the proper choice of the kernel. The study includes a set of experiments
to support the theory and method, including approximating the GW distance,
preserving the graph spectrum, classifying graphs using spectral information,
and performing regression using graph convolutional networks. Code is available
at https://github.com/ychen-stat-ml/GW-Graph-Coarsening .
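The coarsening step the abstract refers to can be illustrated with a generic weighted kernel $K$-means loop: cluster the nodes in the kernel's feature space, then merge each cluster into a supernode. The sketch below is a minimal NumPy implementation of the standard algorithm, not the paper's exact method; the kernel choice (`(A + I)^2`, a simple positive semi-definite choice) and degree weights are illustrative assumptions, whereas the paper derives a specific kernel that yields its distance-preservation guarantee.

```python
import numpy as np

def weighted_kernel_kmeans(K, w, k, n_iter=50):
    """Weighted kernel K-means on kernel matrix K with node weights w.

    Returns a length-n array assigning each node to one of k clusters.
    Generic sketch of the standard algorithm (deterministic init),
    not the paper's implementation or kernel choice.
    """
    n = K.shape[0]
    labels = np.arange(n) % k  # deterministic initialization
    for _ in range(n_iter):
        dist = np.empty((n, k))
        for c in range(k):
            mask = labels == c
            if not mask.any():           # skip empty clusters
                dist[:, c] = np.inf
                continue
            wc = w[mask]
            sc = wc.sum()
            # ||phi(i) - mu_c||^2 up to the constant K_ii term
            cross = K[:, mask] @ wc / sc
            within = wc @ K[np.ix_(mask, mask)] @ wc / sc**2
            dist[:, c] = -2.0 * cross + within
        new = dist.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by edge (2,3).
A = np.zeros((6, 6))
for i, j in [(0,1),(0,2),(1,2),(3,4),(3,5),(4,5),(2,3)]:
    A[i, j] = A[j, i] = 1.0
K = (A + np.eye(6)) @ (A + np.eye(6))  # simple PSD kernel (assumed)
w = A.sum(axis=1)                      # degree weights
labels = weighted_kernel_kmeans(K, w, k=2)

# Coarsened adjacency: sum edge weights between supernodes.
P = np.eye(2)[labels]                  # n x k membership matrix
A_c = P.T @ A @ P
```

On this toy input the two triangles land in different clusters, and the coarsened graph is two supernodes connected by the single bridge edge.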
Related papers
- GSINA: Improving Subgraph Extraction for Graph Invariant Learning via
Graph Sinkhorn Attention [52.67633391931959]
Graph invariant learning (GIL) has been an effective approach to discovering the invariant relationships between graph data and its labels.
We propose a novel graph attention mechanism called Graph Sinkhorn Attention (GSINA).
GSINA is able to obtain meaningful, differentiable invariant subgraphs with controllable sparsity and softness.
arXiv Detail & Related papers (2024-02-11T12:57:16Z) - FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
arXiv Detail & Related papers (2022-10-21T07:58:03Z) - A Unified Framework for Optimization-Based Graph Coarsening [5.720402020129441]
Given a large graph, graph coarsening aims to learn a smaller, tractable graph while preserving the properties of the originally given graph.
The proposed framework lies in the unification of graph learning and dimensionality reduction.
It is established that the learned coarsened graph is $\epsilon$-similar to the original graph, for $\epsilon \in (0,1)$.
arXiv Detail & Related papers (2022-10-02T06:31:42Z) - Scaling R-GCN Training with Graph Summarization [71.06855946732296]
Training of Relation Graph Convolutional Networks (R-GCN) does not scale well with the size of the graph.
In this work, we experiment with the use of graph summarization techniques to compress the graph.
We obtain reasonable results on the AIFB, MUTAG and AM datasets.
arXiv Detail & Related papers (2022-03-05T00:28:43Z) - Graph coarsening: From scientific computing to machine learning [11.728753892489776]
The goal of this paper is to take a broad look into coarsening techniques that have been successfully deployed in scientific computing.
In machine learning, graph coarsening goes under various names, e.g., graph downsampling or graph reduction.
As will be seen, a common strategy in these methods is to rely on spectral properties to define the coarse graph.
arXiv Detail & Related papers (2021-06-22T15:31:50Z) - Graph Coarsening with Neural Networks [8.407217618651536]
We propose a framework for measuring the quality of a coarsening algorithm and show that, depending on the goal, the Laplace operator on the coarse graph must be chosen carefully.
Motivated by the observation that the current choice of edge weight for the coarse graph may be sub-optimal, we parametrize the weight assignment map with graph neural networks and train it to improve the coarsening quality in an unsupervised way.
arXiv Detail & Related papers (2021-02-02T06:50:07Z) - Some Algorithms on Exact, Approximate and Error-Tolerant Graph Matching [3.655021726150369]
We present an extensive survey of various exact and inexact graph matching techniques.
A category of graph matching algorithms is presented, which reduces the graph size by removing the less important nodes.
We introduce a novel approach to measure graph similarity using geometric graphs.
arXiv Detail & Related papers (2020-12-30T18:51:06Z) - Learning Graphons via Structured Gromov-Wasserstein Barycenters [143.42601038462965]
We propose a novel and principled method to learn a nonparametric graph model called graphon.
The proposed approach overcomes drawbacks of prior state-of-the-art methods, and outperforms them on both synthetic and real-world data.
arXiv Detail & Related papers (2020-12-10T13:04:29Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
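The line-graph construction described above is mechanical: each edge of the original graph becomes a node, and two such nodes are joined whenever the corresponding edges share an endpoint. A minimal sketch (the same construction is available as `networkx.line_graph`):

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph of an undirected graph given as an edge list.

    Each original edge becomes a line-graph node; two line-graph nodes
    are adjacent iff the original edges share an endpoint.
    Illustrative sketch, not tied to any specific paper's code.
    """
    edges = [tuple(sorted(e)) for e in edges]
    lg_edges = [
        (e, f)
        for e, f in combinations(edges, 2)
        if set(e) & set(f)              # edges share an endpoint
    ]
    return edges, lg_edges

# Triangle {0,1,2} plus a pendant edge (2,3).
lg_nodes, lg = line_graph([(0, 1), (1, 2), (0, 2), (2, 3)])
```

Here the pendant edge (2,3) is adjacent in the line graph to every edge touching node 2, so a node-level classifier on the line graph directly scores each candidate link of the original graph.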
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Wasserstein-based Graph Alignment [56.84964475441094]
We cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.