Bures-Wasserstein Means of Graphs
- URL: http://arxiv.org/abs/2305.19738v2
- Date: Fri, 1 Mar 2024 17:45:50 GMT
- Title: Bures-Wasserstein Means of Graphs
- Authors: Isabel Haasler, Pascal Frossard
- Abstract summary: We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
- Score: 60.42414991820453
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Finding the mean of sampled data is a fundamental task in machine learning
and statistics. However, in cases where the data samples are graph objects,
defining a mean is an inherently difficult task. We propose a novel framework
for defining a graph mean via embeddings in the space of smooth graph signal
distributions, where graph similarity can be measured using the Wasserstein
metric. By finding a mean in this embedding space, we can recover a mean graph
that preserves structural information. We establish the existence and
uniqueness of the novel graph mean, and provide an iterative algorithm for
computing it. To highlight the potential of our framework as a valuable tool
for practical applications in machine learning, we evaluate it on various
tasks, including k-means clustering of structured aligned graphs,
classification of functional brain networks, and semi-supervised node
classification in multi-layer graphs. Our experimental results demonstrate that
our approach achieves consistent performance, outperforms existing baseline
approaches, and improves the performance of state-of-the-art methods.
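For concreteness, the framework compares graphs through distributions of smooth graph signals; a natural choice, and the one sketched here, identifies a graph with Laplacian L with the centered Gaussian N(0, L†), where L† is the pseudo-inverse of L. Two such distributions are compared with the Bures-Wasserstein distance

$$ d_{BW}^2(\Sigma_1, \Sigma_2) = \operatorname{tr}(\Sigma_1) + \operatorname{tr}(\Sigma_2) - 2\operatorname{tr}\!\left(\big(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\big)^{1/2}\right). $$

Below is a minimal numpy sketch of a barycenter computation in this spirit, using the standard fixed-point update for Bures-Wasserstein barycenters of SPD matrices. The function names, the ridge term eps, and the initialization are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.linalg import sqrtm

def graph_to_cov(L, eps=1e-6):
    # Embed a graph as a centered Gaussian over smooth signals:
    # covariance = pseudo-inverse of the Laplacian, plus a small ridge
    # (eps is illustrative) so the matrix is strictly positive definite.
    return np.linalg.pinv(L) + eps * np.eye(L.shape[0])

def bw_barycenter(covs, weights=None, iters=100, tol=1e-9):
    # Standard fixed-point iteration for the Bures-Wasserstein
    # barycenter of SPD matrices; the paper's own update may differ.
    k = len(covs)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights)
    S = sum(wi * C for wi, C in zip(w, covs))  # Euclidean mean as init
    for _ in range(iters):
        S_half = sqrtm(S).real
        S_half_inv = np.linalg.inv(S_half)
        T = sum(wi * sqrtm(S_half @ C @ S_half).real
                for wi, C in zip(w, covs))
        S_next = S_half_inv @ T @ T @ S_half_inv
        if np.linalg.norm(S_next - S, "fro") < tol:
            S = S_next
            break
        S = S_next
    # A mean Laplacian can then be read off via np.linalg.pinv(S).
    return S
```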
Related papers
- TopER: Topological Embeddings in Graph Representation Learning [8.052380377159398]
Topological Evolution Rate (TopER) is a low-dimensional embedding approach grounded in topological data analysis.
TopER simplifies a key topological approach, Persistent Homology, by calculating the evolution rate of graph substructures.
Our models achieve or surpass state-of-the-art results across molecular, biological, and social network datasets in tasks such as classification, clustering, and visualization.
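One plausible reading of the evolution-rate idea, sketched under the assumption that a substructure count is tracked along a simple edge-weight filtration and summarized by a fitted line; the actual TopER construction, filtrations, and substructures may differ.

```python
import numpy as np

def evolution_rate(weights, thresholds):
    # Sweep a threshold filtration over edge weights and record how a
    # substructure count (here simply the number of active edges) grows.
    counts = [np.count_nonzero(weights <= t) for t in thresholds]
    # Summarize growth by a fitted line; (intercept, slope) is the
    # low-dimensional embedding, with the slope as the "evolution rate".
    slope, intercept = np.polyfit(thresholds, counts, 1)
    return intercept, slope
```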
arXiv Detail & Related papers (2024-10-02T17:31:33Z)
- Knowledge Probing for Graph Representation Learning [12.960185655357495]
We propose a novel graph probing framework (GraphProbe) to investigate and interpret whether different families of graph learning methods encode different levels of knowledge in their learned representations.
Based on the intrinsic properties of graphs, we design three probes to systematically investigate the graph representation learning process from different perspectives.
We construct a thorough evaluation benchmark with nine representative graph learning methods, spanning random-walk-based approaches, basic graph neural networks, and self-supervised graph methods, and probe them on six benchmark datasets for node classification, link prediction, and graph classification.
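A generic example of what a representation probe looks like, assuming frozen embeddings and scikit-learn; the three probes in GraphProbe are more specific than this sketch.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def probe_score(embeddings, labels, seed=0):
    # Train a simple classifier on frozen embeddings; held-out accuracy
    # indicates how much of the probed property the encoder captured.
    X_tr, X_te, y_tr, y_te = train_test_split(
        embeddings, labels, test_size=0.3, random_state=seed)
    probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return probe.score(X_te, y_te)
```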
arXiv Detail & Related papers (2024-08-07T16:27:45Z)
- The Graph Lottery Ticket Hypothesis: Finding Sparse, Informative Graph Structure [18.00833762891405]
Graph Lottery Ticket (GLT) Hypothesis: There is an extremely sparse backbone for every graph.
We study 8 key metrics of interest that directly influence the performance of graph learning algorithms.
We propose a straightforward and efficient algorithm for finding these GLTs in arbitrary graphs.
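A simplified magnitude-pruning sketch in the lottery-ticket spirit, for a weighted, undirected adjacency matrix; the keep fraction and one-shot pruning are illustrative assumptions, not the paper's GLT-finding algorithm.

```python
import numpy as np

def edge_backbone(adj, keep_frac=0.1):
    # Keep only the strongest keep_frac of edges; prune the rest.
    upper = np.triu(adj, k=1)
    w = upper[upper > 0]
    if w.size == 0:
        return np.zeros_like(adj)
    cut = np.quantile(w, 1.0 - keep_frac)
    mask = (upper >= cut) * adj  # retain original weights of kept edges
    return mask + mask.T
```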
arXiv Detail & Related papers (2023-12-08T00:24:44Z)
- A Multi-scale Graph Signature for Persistence Diagrams based on Return Probabilities of Random Walks [1.745838188269503]
We explore the use of a family of multi-scale graph signatures to enhance the robustness of topological features.
We propose a deep learning architecture to handle this set input.
Experiments on benchmark graph classification datasets demonstrate that our proposed architecture outperforms other persistent homology-based methods.
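A sketch of a return-probability signature, assuming the random-walk transition matrix P = D⁻¹A and scales k = 1..K; the number of scales and the dense-matrix implementation are illustrative choices.

```python
import numpy as np

def return_probability_signature(adj, num_scales=8):
    # sig[i, k] = probability that a random walk starting at node i is
    # back at node i after k + 1 steps, i.e. diag(P^(k+1)), P = D^-1 A.
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1e-12)
    n = adj.shape[0]
    sig = np.empty((n, num_scales))
    Pk = np.eye(n)
    for k in range(num_scales):
        Pk = Pk @ P
        sig[:, k] = np.diag(Pk)
    return sig
```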
arXiv Detail & Related papers (2022-09-28T17:30:27Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space, in contrast to existing techniques that embed each node as a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
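A minimal sketch of a distributional node representation with a reparameterized sample; the Gaussian form and an encoder producing mu and log_var are assumptions for illustration.

```python
import numpy as np

def sample_embedding(mu, log_var, rng=np.random.default_rng(0)):
    # Each node is a Gaussian N(mu, diag(exp(log_var))) rather than a
    # fixed vector; reparameterization yields one stochastic embedding.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps
```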
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
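A sketch of a t-SVD-style tensor Schatten p-norm; normalization conventions vary across papers, and this version along with the choice p = 0.5 is illustrative.

```python
import numpy as np

def tensor_schatten_p(T, p=0.5):
    # FFT along the third mode, SVD of each frontal slice in the
    # Fourier domain, then sum singular values raised to the power p.
    Tf = np.fft.fft(T, axis=2)
    total = sum(np.sum(np.linalg.svd(Tf[:, :, k], compute_uv=False) ** p)
                for k in range(T.shape[2]))
    return (total / T.shape[2]) ** (1.0 / p)
```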
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
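A minimal flavor of an online EM prototype update, assuming flat (non-hierarchical) prototypes and a simple moving-average M-step; GraphLoG's hierarchical prototypes and objective are richer than this sketch.

```python
import numpy as np

def online_em_step(z, prototypes, lr=0.1):
    # E-step: assign each graph embedding to its nearest prototype.
    d = np.linalg.norm(z[:, None, :] - prototypes[None, :, :], axis=2)
    assign = d.argmin(axis=1)
    # M-step: move each used prototype toward the mean of its members.
    for k in np.unique(assign):
        prototypes[k] += lr * (z[assign == k].mean(axis=0) - prototypes[k])
    return prototypes, assign
```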
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Structure-Aware Hierarchical Graph Pooling using Information Bottleneck [2.7088996845250897]
Graph pooling is an essential ingredient of Graph Neural Networks (GNNs) in graph classification and regression tasks.
We propose a novel pooling method named HIBPool, in which we leverage the Information Bottleneck (IB) principle.
We also introduce a novel structure-aware Discriminative Pooling Readout (DiP-Readout) function to capture the informative local subgraph structures in the graph.
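For reference, the Information Bottleneck principle that such a pooling objective draws on trades prediction against compression,

$$ \max_Z \; I(Z; Y) - \beta\, I(Z; X), $$

where X is the input graph (or subgraph), Z the pooled representation, Y the prediction target, and β the trade-off weight; HIBPool's pooling-specific instantiation differs in its details.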
arXiv Detail & Related papers (2021-04-27T07:27:43Z)
- GraphOpt: Learning Optimization Models of Graph Formation [72.75384705298303]
We propose an end-to-end framework that learns an implicit model of graph structure formation and discovers an underlying optimization mechanism.
The learned objective can serve as an explanation for the observed graph properties, thereby lending itself to transfer across different graphs within a domain.
GraphOpt poses link formation in graphs as a sequential decision-making process and solves it using a maximum entropy inverse reinforcement learning algorithm.
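For reference, maximum entropy IRL models an observed trajectory τ (here, a sequence of link formations) as exponentially more likely under higher learned reward, and fits the reward parameters θ by maximum likelihood; this is the generic principle, not GraphOpt's exact formulation:

$$ p_\theta(\tau) = \frac{\exp\big(R_\theta(\tau)\big)}{Z(\theta)}, \qquad \max_\theta \sum_{\tau \in \mathcal{D}} \log p_\theta(\tau). $$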
arXiv Detail & Related papers (2020-07-07T16:51:39Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
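A generic top-k pooling sketch scored by multi-hop proximity; the scoring rule and the 2-hop horizon are illustrative assumptions, not the paper's pooling operator.

```python
import numpy as np

def proximity_topk_pool(adj, features, ratio=0.5):
    # Score each node by its total 1- and 2-hop connectivity, then keep
    # the top-scoring fraction of nodes and the induced subgraph.
    prox = adj + adj @ adj
    scores = prox.sum(axis=1)
    k = max(1, int(ratio * adj.shape[0]))
    keep = np.argsort(-scores)[:k]
    return adj[np.ix_(keep, keep)], features[keep]
```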
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.