Multi-view graph structure learning using subspace merging on Grassmann
manifold
- URL: http://arxiv.org/abs/2204.05258v1
- Date: Mon, 11 Apr 2022 17:01:05 GMT
- Title: Multi-view graph structure learning using subspace merging on Grassmann
manifold
- Authors: Razieh Ghiasi, Hossein Amirkhani and Alireza Bosaghzadeh
- Abstract summary: We introduce a new graph structure learning approach using multi-view learning, named MV-GSL (Multi-View Graph Structure Learning).
We aggregate different graph structure learning methods using subspace merging on the Grassmann manifold to improve the quality of the learned graph structures.
Our experiments show that the proposed method has promising performance compared to single and other combined graph structure learning methods.
- Score: 4.039245878626346
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Many successful learning algorithms have recently been developed to
represent graph-structured data. For example, Graph Neural Networks (GNNs) have
achieved considerable success in various tasks such as node classification,
graph classification, and link prediction. However, these methods are highly
dependent on the quality of the input graph structure. One common approach to
alleviating this problem is to learn the graph structure instead of relying on a
manually designed graph. In this paper, we introduce a new graph structure
learning approach using multi-view learning, named MV-GSL (Multi-View Graph
Structure Learning), in which we aggregate different graph structure learning
methods using subspace merging on the Grassmann manifold to improve the quality
of the learned graph structures. Extensive experiments are performed to evaluate
the effectiveness of the proposed method on two benchmark datasets, Cora and
Citeseer. Our experiments show that the proposed method has promising
performance compared to single and other combined graph structure learning
methods.
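As background on the subspace-merging idea, one established scheme (from prior work on merging multi-layer graph subspaces on the Grassmann manifold; the paper's exact formulation may differ) represents each candidate graph by the first k eigenvectors of its normalized Laplacian, treats that basis as a point on the Grassmann manifold, and recovers a merged subspace from the eigendecomposition of a combined operator. A minimal NumPy sketch under those assumptions (function names are illustrative, not from the paper):

```python
import numpy as np

def spectral_subspace(adj, k):
    """First k eigenvectors of the normalized Laplacian: a point on the
    Grassmann manifold Gr(k, n) representing this graph's structure."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
    _, vecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    return lap, vecs[:, :k]

def merge_subspaces(adjs, k, alpha=0.5):
    """Merge per-view spectral subspaces into one representative subspace.

    The merged subspace minimizes a trade-off between the views' spectral
    objectives and the projection distances to each view's subspace, which
    reduces to an eigenproblem on sum(L_i) - alpha * sum(U_i @ U_i.T).
    """
    laps, subs = zip(*(spectral_subspace(a, k) for a in adjs))
    l_merged = sum(laps) - alpha * sum(u @ u.T for u in subs)
    _, vecs = np.linalg.eigh(l_merged)
    return vecs[:, :k]  # orthonormal columns; rows usable as node embeddings
```

The merged basis can then serve as a shared node embedding, or be turned back into a single graph (e.g. via a k-nearest-neighbor graph on its rows) before feeding a downstream GNN.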
Related papers
- Knowledge Probing for Graph Representation Learning [12.960185655357495]
We propose a novel graph probing framework (GraphProbe) to investigate and interpret whether the family of graph learning methods has encoded different levels of knowledge in graph representation learning.
Based on the intrinsic properties of graphs, we design three probes to systematically investigate the graph representation learning process from different perspectives.
We construct a thorough evaluation benchmark with nine representative graph learning methods from random walk based approaches, basic graph neural networks and self-supervised graph methods, and probe them on six benchmark datasets for node classification, link prediction and graph classification.
arXiv Detail & Related papers (2024-08-07T16:27:45Z)
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- From Cluster Assumption to Graph Convolution: Graph-based Semi-Supervised Learning Revisited [51.24526202984846]
Graph-based semi-supervised learning (GSSL) has long been a hot research topic.
Graph convolutional networks (GCNs) have become the predominant technique owing to their promising performance.
arXiv Detail & Related papers (2023-09-24T10:10:21Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between the graphs of different views via minimization of the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Label-informed Graph Structure Learning for Node Classification [16.695269600529056]
We propose a novel label-informed graph structure learning framework which incorporates label information explicitly through a class transition matrix.
We conduct extensive experiments on seven node classification benchmark datasets and the results show that our method outperforms or matches the state-of-the-art baselines.
arXiv Detail & Related papers (2021-08-10T11:14:09Z)
- Multiple Graph Learning for Scalable Multi-view Clustering [26.846642220480863]
We propose an efficient multiple graph learning model via a small number of anchor points and tensor Schatten p-norm minimization.
Specifically, we construct a hidden, tractable large graph from the anchor graph of each view.
We develop an efficient algorithm, which scales linearly with the data size, to solve our proposed model.
arXiv Detail & Related papers (2021-06-29T13:10:56Z)
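Two of the multi-view clustering entries above rely on tensor Schatten p-norm minimization. As background, a minimal sketch of one common definition (via the t-SVD: FFT along the third mode, then singular values of the frontal slices; conventions, including a normalization by the number of slices, vary across papers):

```python
import numpy as np

def schatten_p_norm(mat, p=0.5):
    """Matrix Schatten p-norm: (sum of singular values ** p) ** (1/p).
    p=1 gives the nuclear norm, p=2 the Frobenius norm."""
    s = np.linalg.svd(mat, compute_uv=False)
    return (s ** p).sum() ** (1.0 / p)

def tensor_schatten_p_norm(tensor, p=0.5):
    """t-SVD-based tensor Schatten p-norm of an n1 x n2 x n3 tensor:
    FFT along the third mode, then accumulate the p-th powers of the
    singular values of each (complex) frontal slice."""
    fft_slices = np.fft.fft(tensor, axis=2)
    total = 0.0
    for k in range(tensor.shape[2]):
        s = np.linalg.svd(fft_slices[:, :, k], compute_uv=False)
        total += (s ** p).sum()
    return total ** (1.0 / p)
```

With 0 < p < 1 this norm promotes low-rank structure across views more aggressively than the nuclear norm, which is why these clustering methods minimize it over the stacked view graphs.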
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.