Synergistic Graph Fusion via Encoder Embedding
- URL: http://arxiv.org/abs/2303.18051v4
- Date: Wed, 5 Jun 2024 09:26:44 GMT
- Title: Synergistic Graph Fusion via Encoder Embedding
- Authors: Cencheng Shen, Carey E. Priebe, Jonathan Larson, Ha Trinh
- Abstract summary: We introduce a method called graph fusion embedding, designed for multi-graph embedding with shared vertex sets.
Under the framework of supervised learning, our method exhibits a remarkable and highly desirable synergistic effect.
Our comprehensive simulations and real data experiments provide compelling evidence supporting the effectiveness of our proposed method.
- Score: 16.20934021488478
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we introduce a method called graph fusion embedding, designed for multi-graph embedding with shared vertex sets. Under the framework of supervised learning, our method exhibits a remarkable and highly desirable synergistic effect: for sufficiently large vertex size, the accuracy of vertex classification consistently benefits from the incorporation of additional graphs. We establish the mathematical foundation for the method, including the asymptotic convergence of the embedding, a sufficient condition for asymptotic optimal classification, and the proof of the synergistic effect for vertex classification. Our comprehensive simulations and real data experiments provide compelling evidence supporting the effectiveness of our proposed method, showcasing the pronounced synergistic effect for multiple graphs from disparate sources.
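As a concrete illustration of the abstract's description, below is a minimal NumPy sketch that assumes the fusion concatenates per-graph one-hot encoder embeddings over the shared vertex set, in the spirit of the authors' earlier graph encoder embedding work. The function names, the random toy data, and the simplification of building the encoder from all labels (rather than only training labels, as the supervised setting would require) are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def encoder_embedding(A, labels, num_classes):
    # One-hot encoder embedding: average each vertex's adjacency row over the
    # vertices of each class (class-size-normalized projection of A).
    n = A.shape[0]
    W = np.zeros((n, num_classes))
    for k in range(num_classes):
        idx = np.flatnonzero(labels == k)
        if idx.size > 0:
            W[idx, k] = 1.0 / idx.size
    return A @ W  # shape (n, num_classes)

def graph_fusion_embedding(adjacencies, labels, num_classes):
    # Fuse graphs sharing one vertex set by concatenating their per-graph
    # encoder embeddings column-wise.
    return np.concatenate(
        [encoder_embedding(A, labels, num_classes) for A in adjacencies],
        axis=1,
    )

# Toy usage: two random graphs on the same 100 vertices with 3 classes.
rng = np.random.default_rng(0)
n, K = 100, 3
labels = rng.integers(0, K, size=n)
A1 = (rng.random((n, n)) < 0.10).astype(float)
A2 = (rng.random((n, n)) < 0.20).astype(float)
Z = graph_fusion_embedding([A1, A2], labels, K)  # shape (100, 6)
```

In this sketch, a downstream vertex classifier (e.g., linear discriminant or nearest-centroid) would be trained on the fused embedding Z restricted to labeled vertices; adding another graph simply appends further columns, which is where the claimed synergistic effect would appear.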
Related papers
- Randomized Schur Complement Views for Graph Contrastive Learning [0.0]
We introduce a randomized topological augmentor based on Schur complements for Graph Contrastive Learning (GCL).
Given a graph Laplacian matrix, the technique generates unbiased approximations of its Schur complements and treats the corresponding graphs as augmented views (see the worked equation after this list).
arXiv Detail & Related papers (2023-06-06T20:35:20Z) - Laplacian-based Semi-Supervised Learning in Multilayer Hypergraphs by Coordinate Descent [4.56754610152086]
Graph semi-supervised learning is an important data analysis tool.
In this paper, we consider an optimization-based formulation of the problem for an undirected graph.
We solve the problem using different coordinate descent approaches and compare the results with the ones obtained by the classic gradient descent method.
arXiv Detail & Related papers (2023-01-28T12:59:07Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Data-heterogeneity-aware Mixing for Decentralized Learning [63.83913592085953]
We characterize the dependence of convergence on the relationship between the mixing weights of the graph and the data heterogeneity across nodes.
We propose a metric that quantifies the ability of a graph to mix the current gradients.
Motivated by our analysis, we propose an approach that periodically and efficiently optimizes this metric.
arXiv Detail & Related papers (2022-04-13T15:54:35Z) - Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations naturally lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z) - Convergent Boosted Smoothing for Modeling Graph Data with Tabular Node Features [46.052312251801]
We propose a framework for iterating boosting with graph propagation steps.
Our approach is anchored in a principled meta loss function.
Across a variety of non-iid graph datasets, our method achieves comparable or superior performance.
arXiv Detail & Related papers (2021-10-26T04:53:12Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z) - Multilayer Graph Clustering with Optimized Node Embedding [70.1053472751897]
Multilayer graph clustering aims at dividing the graph nodes into categories or communities.
We propose a clustering-friendly embedding of the layers of a given multilayer graph.
Experiments show that our method leads to a significant improvement.
arXiv Detail & Related papers (2021-03-30T17:36:40Z) - Fusion Moves for Graph Matching [35.27002115682325]
We contribute to approximate algorithms for the quadratic assignment problem, also known as graph matching.
Inspired by the success of the fusion moves technique developed for multilabel discrete Markov random fields, we investigate its applicability to graph matching.
We show how it can be efficiently combined with the dedicated state-of-the-art Lagrange dual methods.
arXiv Detail & Related papers (2021-01-28T16:09:46Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
We learn a clustering-friendly representative graph for solving clustering problems.
Experiments show that our method leads to improved clustering performance.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
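For the Schur-complement augmentation summarized in the first related entry above, the standard construction for a block-partitioned graph Laplacian is sketched below; the randomized, unbiased approximation scheme is specific to that paper and is not reproduced here.

```latex
% Partition the vertex set as V = S \cup S^c and block the Laplacian accordingly:
L = \begin{pmatrix} L_{SS} & L_{SS^c} \\ L_{S^cS} & L_{S^cS^c} \end{pmatrix},
\qquad
L / L_{S^cS^c} \;=\; L_{SS} - L_{SS^c}\, L_{S^cS^c}^{-1}\, L_{S^cS}.
% For a connected graph and a proper subset S^c, the block L_{S^cS^c} is
% invertible, and the Schur complement is itself the Laplacian of a weighted
% graph on S (Kron reduction), so it can serve as an alternative "view" of
% the original graph.
```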