Learning Attributed Graphlets: Predictive Graph Mining by Graphlets with Trainable Attribute
- URL: http://arxiv.org/abs/2402.06932v1
- Date: Sat, 10 Feb 2024 12:10:13 GMT
- Title: Learning Attributed Graphlets: Predictive Graph Mining by Graphlets with Trainable Attribute
- Authors: Tajima Shinji, Ren Sugihara, Ryota Kitahara and Masayuki Karasuyama
- Abstract summary: This paper proposes an interpretable classification algorithm for attributed graph data, called LAGRA (Learning Attributed GRAphlets).
LAGRA learns importance weights for small attributed subgraphs, called attributed graphlets (AGs), while simultaneously optimizing their attribute vectors.
We empirically demonstrate that LAGRA has prediction performance superior or comparable to that of standard existing algorithms.
- Score: 4.14034448023832
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The graph classification problem has been widely studied; however, achieving
an interpretable model with high predictive performance remains a challenging
issue. This paper proposes an interpretable classification algorithm for
attributed graph data, called LAGRA (Learning Attributed GRAphlets). LAGRA
learns importance weights for small attributed subgraphs, called attributed
graphlets (AGs), while simultaneously optimizing their attribute vectors. This
enables us to obtain a combination of subgraph structures and their attribute
vectors that strongly contribute to discriminating different classes. A
significant characteristic of LAGRA is that all subgraph structures in the
training dataset can be considered as candidate structures of AGs. This
approach can exhaustively explore all potentially important subgraphs, but,
obviously, a naive implementation can require a large amount of computation.
To mitigate this issue, we propose an efficient pruning strategy that combines
proximal gradient descent with a graph mining tree search. Our pruning
strategy guarantees that the quality of the solution is maintained relative to
the result obtained without pruning. We empirically demonstrate that LAGRA
achieves prediction performance superior or comparable to that of standard
existing algorithms, including graph neural networks, while using only a small
number of AGs in an interpretable manner.
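For intuition, the optimization described above can be sketched in a few lines. The following is a simplified, hypothetical illustration, not the authors' implementation: it assumes a precomputed graph-by-AG feature matrix Phi, uses a squared loss, and runs proximal gradient descent (ISTA) so that the L1 penalty drives most AG importance weights exactly to zero. The simultaneous optimization of attribute vectors is omitted, and the mining-tree pruning is only indicated.

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 norm: shrink weights toward zero."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def train_sparse_ag_model(Phi, y, lam=0.05, lr=0.01, n_iter=500):
    """Minimize (1/2n)||y - Phi w||^2 + lam*||w||_1 by ISTA.
    Phi: (n_graphs, n_candidate_AGs) feature matrix (assumed precomputed)."""
    n, d = Phi.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ w - y) / n             # gradient of smooth part
        w = soft_threshold(w - lr * grad, lr * lam)  # proximal step
    return w

def subtree_can_be_pruned(grad_bound, lam):
    """Pruning idea (simplified): if an upper bound on |gradient| over an
    entire subtree of the graph-mining tree stays below lam, every weight
    in that subtree remains zero, so the subtree need not be expanded."""
    return grad_bound < lam

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Phi = rng.random((100, 50))                      # toy stand-in features
    w_true = np.zeros(50); w_true[:3] = [1.0, -0.5, 0.8]
    y = Phi @ w_true + 0.01 * rng.standard_normal(100)
    w = train_sparse_ag_model(Phi, y)
    print("selected AG indices:", np.flatnonzero(w))
```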
Related papers
- Imbalanced Graph Classification with Multi-scale Oversampling Graph Neural Networks [25.12261412297796] (arXiv, 2024-05-08)
We introduce a novel multi-scale oversampling graph neural network (MOSGNN) that learns expressive minority graph representations.
It achieves this by jointly optimizing subgraph-level, graph-level, and pairwise-graph learning tasks.
Experiments on 16 imbalanced graph datasets show that MOSGNN significantly outperforms five state-of-the-art models.
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464] (arXiv, 2024-04-21)
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
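For context, a common way to make such an update injective is to concatenate a node's own representation with its aggregated neighborhood rather than merging them by summation, so that distinct (self, neighborhood) pairs stay distinguishable. The sketch below illustrates that generic idea only; the function name and shapes are invented and it is not taken from SPGNN.

```python
import numpy as np

def concat_graph_conv(H, A, W):
    """Generic concatenation-based graph convolution (sketch).
    H: (n, d) node features; A: (n, n) adjacency matrix; W: (2d, d_out).
    Concatenating [self || neighbor-sum] keeps both sources separate,
    preserving injectivity that plain summation can lose."""
    agg = A @ H                               # sum neighbor features
    Z = np.concatenate([H, agg], axis=1)      # (n, 2d)
    return np.maximum(Z @ W, 0.0)             # linear transform + ReLU
```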
- The Graph Lottery Ticket Hypothesis: Finding Sparse, Informative Graph Structure [18.00833762891405] (arXiv, 2023-12-08)
Graph Lottery Ticket (GLT) Hypothesis: There is an extremely sparse backbone for every graph.
We study 8 key metrics of interest that directly influence the performance of graph learning algorithms.
We propose a straightforward and efficient algorithm for finding these GLTs in arbitrary graphs.
- Bures-Wasserstein Means of Graphs [60.42414991820453] (arXiv, 2023-05-31)
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
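For reference, the metric underlying such means is the standard Bures-Wasserstein distance between positive semi-definite matrices (here, the covariances of the smooth graph signal distributions); the formulas below are textbook definitions, not quotations from the paper.

```latex
% Bures-Wasserstein distance between PSD matrices A and B:
d_{\mathrm{BW}}(A, B)^2 = \operatorname{tr}(A) + \operatorname{tr}(B)
    - 2\operatorname{tr}\!\left(\left(A^{1/2} B A^{1/2}\right)^{1/2}\right)

% A mean of embedded graphs A_1, ..., A_n is the barycenter:
\bar{A} = \operatorname*{arg\,min}_{A \succeq 0} \sum_{i=1}^{n} d_{\mathrm{BW}}(A, A_i)^2
```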
- Stochastic Subgraph Neighborhood Pooling for Subgraph Classification [2.1270496914042996] (arXiv, 2023-04-17)
Stochastic Subgraph Neighborhood Pooling (SSNP) jointly aggregates the subgraph and its neighborhood information without any computationally expensive operations such as labeling tricks.
Our experiments demonstrate that our models outperform current state-of-the-art methods (with a margin of up to 2%) while being up to 3X faster in training.
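A minimal sketch of that design choice (hypothetical, not the authors' code): pool the subgraph's node embeddings and a random sample of its neighborhood's node embeddings separately, then concatenate the two, avoiding per-node labeling tricks entirely.

```python
import numpy as np

def subgraph_neighborhood_pool(H, sub_idx, nbr_idx, rng, k=32):
    """Sketch: mean-pool subgraph nodes and k sampled neighborhood nodes,
    then concatenate into one embedding for the subgraph classifier.
    H: (n, d) node embeddings; sub_idx / nbr_idx: node index arrays."""
    sub = H[sub_idx].mean(axis=0)                  # subgraph readout
    take = rng.choice(nbr_idx, size=min(k, len(nbr_idx)), replace=False)
    nbr = H[take].mean(axis=0)                     # stochastic neighborhood readout
    return np.concatenate([sub, nbr])              # (2d,)
```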
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052] (arXiv, 2023-01-14)
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749] (arXiv, 2022-03-03)
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
- Multi-Level Graph Contrastive Learning [38.022118893733804] (arXiv, 2021-07-06)
We propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representations of graph data by contrasting space views of graphs.
The original graph is a first-order approximation structure and contains uncertainty or error, while the $k$NN graph generated by encoding features preserves high-order proximity.
Extensive experiments indicate MLGCL achieves promising results compared with the existing state-of-the-art graph representation learning methods on seven datasets.
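View contrast of this kind is usually implemented with an InfoNCE-style loss; the sketch below shows the generic form (a common formulation, not necessarily MLGCL's exact objective), where row i of z1 and z2 embed the same node under the topology view and the $k$NN view.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Generic InfoNCE contrastive loss between two view embeddings.
    z1, z2: (n, d); row i of each is a positive pair, other rows are negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # cosine geometry
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                                 # (n, n) similarities
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                    # pull positives together
```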
- Graph Information Bottleneck for Subgraph Recognition [103.37499715761784] (arXiv, 2020-10-12)
We propose a framework of Graph Information Bottleneck (GIB) for the subgraph recognition problem in deep graph learning.
Under this framework, one can recognize the maximally informative yet compressive subgraph, named IB-subgraph.
We evaluate the properties of the IB-subgraph in three application scenarios: improvement of graph classification, graph interpretation and graph denoising.
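In standard information-bottleneck notation, the IB-subgraph is obtained from the following trade-off, where Y is the graph label and beta balances informativeness against compression:

```latex
% Graph Information Bottleneck: keep label information, compress the rest
\max_{G_{\mathrm{sub}} \subseteq G} \; I\!\left(Y; G_{\mathrm{sub}}\right) - \beta\, I\!\left(G; G_{\mathrm{sub}}\right)
```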
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547] (arXiv, 2020-06-19)
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
- Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063] (arXiv, 2020-03-10)
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods are proposed: unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE).
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
This list is automatically generated from the titles and abstracts of the papers on this site.