A Causal Disentangled Multi-Granularity Graph Classification Method
- URL: http://arxiv.org/abs/2310.16256v1
- Date: Wed, 25 Oct 2023 00:20:50 GMT
- Title: A Causal Disentangled Multi-Granularity Graph Classification Method
- Authors: Yuan Li, Li Liu, Penggang Chen, Youmin Zhang, Guoyin Wang
- Abstract summary: Some graph classification methods do not combine the multi-granularity characteristics of graph data.
This paper proposes a causal disentangled multi-granularity graph representation learning method (CDM-GNN) to solve this challenge.
The model exhibits strong classification performance and generates explanatory outcomes aligning with human cognitive patterns.
- Score: 18.15154299104419
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph data widely exists in real life, with large amounts of data and complex
structures. It is necessary to map graph data to low-dimensional embeddings.
Graph classification, a critical graph task, mainly relies on identifying the
important substructures within the graph. At present, some graph classification
methods do not combine the multi-granularity characteristics of graph data.
This lack of granularity distinction in modeling causes the model to conflate
key information with spurious correlations, making it difficult to build a
credible and interpretable model. This paper
proposes a causal disentangled multi-granularity graph representation learning
method (CDM-GNN) to solve this challenge. The CDM-GNN model disentangles the
important substructures and bias parts within the graph from a
multi-granularity perspective. The disentanglement of the CDM-GNN model reveals
important and bias parts, forming the foundation for its classification task,
specifically, model interpretations. The CDM-GNN model exhibits strong
classification performance and generates explanatory outcomes aligning with
human cognitive patterns. In order to verify the effectiveness of the model,
this paper compares the three real-world datasets MUTAG, PTC, and IMDM-M. Six
state-of-the-art models, namely GCN, GAT, Top-k, ASAPool, SUGAR, and SAT are
employed for comparison purposes. Additionally, a qualitative analysis of the
interpretation results is conducted.
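The disentanglement idea, splitting a graph into an important substructure and a bias remainder, can be illustrated with a toy sketch. This is not CDM-GNN's actual learned, multi-granularity mechanism; the scoring function and the motif below are purely hypothetical stand-ins for what the model would learn.

```python
def disentangle_edges(edges, score_fn, threshold=0.5):
    """Split a graph's edge set into a causally 'important' part and a
    'bias' part based on per-edge scores in [0, 1].

    Toy illustration only: a real model learns the scores end-to-end
    and repeats this separation at several granularities.
    """
    important, bias = [], []
    for e in edges:
        (important if score_fn(e) >= threshold else bias).append(e)
    return important, bias

# Hypothetical example: edges inside the motif {0, 1, 2} are deemed important.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4)]
motif = {0, 1, 2}
score = lambda e: 1.0 if e[0] in motif and e[1] in motif else 0.0
important, bias = disentangle_edges(edges, score)
```

A classifier would then be trained on the important part, while the bias part is used to explain what the model ignores.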
Related papers
- Neural Graph Pattern Machine [50.78679002846741]
We propose the Neural Graph Pattern Machine (GPM), a framework designed to learn directly from graph patterns.
GPM efficiently extracts and encodes substructures while identifying the most relevant ones for downstream tasks.
arXiv Detail & Related papers (2025-01-30T20:37:47Z)
- Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs)
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z)
- Towards Data-centric Machine Learning on Directed Graphs: a Survey [23.498557237805414]
We introduce a novel taxonomy for existing studies of directed graph learning.
We re-examine these methods from the data-centric perspective, with an emphasis on understanding and improving data representation.
We identify key opportunities and challenges within the field, offering insights that can guide future research and development in directed graph learning.
arXiv Detail & Related papers (2024-11-28T06:09:12Z)
- Conditional Distribution Learning on Graphs [15.730933577970687]
We propose a conditional distribution learning (CDL) method that learns graph representations from graph-structured data for semi-supervised graph classification.
Specifically, we present an end-to-end graph representation learning model to align the conditional distributions of weakly and strongly augmented features over the original features.
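The alignment of weakly and strongly augmented views over the original features can be caricatured with a simple consistency loss. This is an illustrative sketch, not the CDL objective itself: the mean-squared-error terms below are an assumed stand-in for the actual conditional-distribution alignment.

```python
def alignment_loss(orig, weak, strong):
    """Toy consistency objective over three feature views of one graph:
    anchor both augmented views on the original features, then pull the
    two augmented views toward each other.

    Hypothetical simplification: real distribution alignment would
    compare distributions, not single vectors.
    """
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

    return mse(weak, orig) + mse(strong, orig) + mse(weak, strong)
```

When all three views agree, the loss is zero; any divergence between views is penalized.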
arXiv Detail & Related papers (2024-11-20T07:26:36Z)
- Introducing Diminutive Causal Structure into Graph Representation Learning [19.132025125620274]
We introduce a novel method that enables Graph Neural Networks (GNNs) to glean insights from specialized diminutive causal structures.
Our method specifically extracts causal knowledge from the model representation of these diminutive causal structures.
arXiv Detail & Related papers (2024-06-13T00:18:20Z)
- Generating the Graph Gestalt: Kernel-Regularized Graph Representation Learning [47.506013386710954]
A complete scientific understanding of graph data should address both global and local structure.
We propose a joint model for both as complementary objectives in a graph VAE framework.
Our experiments demonstrate a significant improvement in the realism of the generated graph structures, typically by 1-2 orders of magnitude on graph structure metrics.
arXiv Detail & Related papers (2021-06-29T10:48:28Z)
- A Deep Latent Space Model for Graph Representation Learning [10.914558012458425]
We propose a Deep Latent Space Model (DLSM) for directed graphs to incorporate the traditional latent variable based generative model into deep learning frameworks.
Our proposed model consists of a graph convolutional network (GCN) encoder and a decoder, which are layer-wise connected by a hierarchical variational auto-encoder architecture.
Experiments on real-world datasets show that the proposed model achieves state-of-the-art performance on both link prediction and community detection tasks.
arXiv Detail & Related papers (2021-06-22T12:41:19Z)
- Graph Classification by Mixture of Diverse Experts [67.33716357951235]
We present GraphDIVE, a framework leveraging mixture of diverse experts for imbalanced graph classification.
With a divide-and-conquer principle, GraphDIVE employs a gating network to partition an imbalanced graph dataset into several subsets.
Experiments on real-world imbalanced graph datasets demonstrate the effectiveness of GraphDIVE.
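The divide-and-conquer gating described above can be sketched at a toy level. The hard argmax routing and the linear gate below are illustrative assumptions, not GraphDIVE's implementation, which learns the gate and experts jointly on imbalanced data.

```python
def gate(features, weights):
    """Route a graph-level feature vector to the expert with the highest
    gating score. Hard assignment and a linear gate are hypothetical
    simplifications for illustration."""
    scores = [sum(w * f for w, f in zip(ws, features)) for ws in weights]
    return max(range(len(scores)), key=scores.__getitem__)

def mixture_predict(features, weights, experts):
    """Divide and conquer: the gate picks a subset/expert, which then
    classifies the graph."""
    return experts[gate(features, weights)](features)

# Hypothetical two-expert setup: each gate row favors one feature dimension.
weights = [[1.0, 0.0], [0.0, 1.0]]
experts = [lambda f: "majority-class expert", lambda f: "minority-class expert"]
```

Partitioning the dataset this way lets each expert specialize on a subset, which is the core idea behind handling class imbalance with a mixture.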
arXiv Detail & Related papers (2021-03-29T14:03:03Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method produces a well-clustered representative graph, from which a clustering algorithm can be derived.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
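Dyadic (edge) independence means each potential edge appears independently with its own probability, which is what makes sampling and fitting scalable. A minimal sketch, assuming a block-structured probability matrix (the block assignment and probabilities here are hypothetical, not the paper's approximation scheme):

```python
import random

def sample_block_graph(block_of, p, seed=0):
    """Sample an edge-independent (dyadically independent) random graph:
    edge (i, j) is included with probability p[block_of[i]][block_of[j]],
    independently of every other edge.

    block_of: list mapping node index -> block index.
    p: symmetric matrix of between-block edge probabilities.
    """
    rng = random.Random(seed)
    n = len(block_of)
    return [(i, j)
            for i in range(n) for j in range(i + 1, n)
            if rng.random() < p[block_of[i]][block_of[j]]]
```

Because every dyad is an independent coin flip, sampling is a single pass over node pairs, and sparse-graph variants can skip absent edges entirely, which is what allows scaling to millions of nodes.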
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.