Differentially Describing Groups of Graphs
- URL: http://arxiv.org/abs/2201.04064v1
- Date: Thu, 16 Dec 2021 15:18:24 GMT
- Title: Differentially Describing Groups of Graphs
- Authors: Corinna Coupette, Sebastian Dalleiger, and Jilles Vreeken
- Abstract summary: We refer to this task as graph group analysis, which seeks to describe similarities and differences between graph groups by means of statistically significant subgraphs.
We introduce Gragra, which uses maximum entropy modeling to identify a non-redundant set of subgraphs with statistically significant associations to one or more graph groups.
- Score: 26.218670461973705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: How does neural connectivity in autistic children differ from neural
connectivity in healthy children or autistic youths? What patterns in global
trade networks are shared across classes of goods, and how do these patterns
change over time? Answering questions like these requires us to differentially
describe groups of graphs: Given a set of graphs and a partition of these
graphs into groups, discover what graphs in one group have in common, how they
systematically differ from graphs in other groups, and how multiple groups of
graphs are related. We refer to this task as graph group analysis, which seeks
to describe similarities and differences between graph groups by means of
statistically significant subgraphs. To perform graph group analysis, we
introduce Gragra, which uses maximum entropy modeling to identify a
non-redundant set of subgraphs with statistically significant associations to
one or more graph groups. Through an extensive set of experiments on a wide
range of synthetic and real-world graph groups, we confirm that Gragra works
well in practice.
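As a rough illustration of the task the abstract describes, the sketch below is not Gragra and does not implement its maximum entropy model: it treats single edges as stand-in "subgraph" patterns and flags those whose presence is significantly associated with a graph group via a chi-squared test. The function names, the toy data, and the use of scipy.stats.chi2_contingency are illustrative assumptions, not part of the paper.

```python
# Minimal illustrative sketch (NOT the Gragra algorithm or its maximum entropy
# model): treat single edges as stand-in "subgraph" patterns and test, per
# pattern, whether its presence is significantly associated with the group
# label via a chi-squared test on a groups-by-presence contingency table.
from scipy.stats import chi2_contingency  # assumes SciPy is available

def edge_patterns(edges):
    """Candidate patterns of a graph: here, simply its undirected edges."""
    return {frozenset(e) for e in edges}

def significant_patterns(groups, alpha=0.05):
    """groups: dict mapping group name -> list of graphs (each a list of edges).
    Returns {pattern: p-value} for patterns whose occurrence differs across groups."""
    all_patterns = set().union(*(edge_patterns(g) for gs in groups.values() for g in gs))
    results = {}
    for pattern in all_patterns:
        # Contingency table: one row per group, columns = (has pattern, lacks pattern).
        table = [[sum(pattern in edge_patterns(g) for g in gs),
                  sum(pattern not in edge_patterns(g) for g in gs)]
                 for gs in groups.values()]
        _, pval, _, _ = chi2_contingency(table)
        if pval < alpha:
            results[pattern] = pval
    return results

# Toy example: group "A" graphs share edge (1, 2), group "B" graphs share edge (3, 4).
groups = {
    "A": [[(1, 2), (2, 3)], [(1, 2), (4, 5)], [(1, 2)], [(1, 2), (2, 3)], [(1, 2), (6, 7)]],
    "B": [[(3, 4), (2, 3)], [(3, 4), (5, 6)], [(3, 4)], [(3, 4), (2, 3)], [(3, 4), (7, 8)]],
}
for pattern, pval in sorted(significant_patterns(groups).items(), key=lambda kv: kv[1]):
    print(sorted(pattern), f"p = {pval:.3f}")
```

Unlike this sketch, Gragra models the graph groups jointly via maximum entropy modeling and returns a non-redundant set of significant subgraphs rather than testing each candidate pattern in isolation.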
Related papers
- Multi-Scale Subgraph Contrastive Learning [9.972544118719572]
We propose a multi-scale subgraph contrastive learning architecture that can characterize fine-grained semantic information.
Specifically, we generate global and local views at different scales based on subgraph sampling, and construct multiple contrastive relationships according to their semantic associations.
arXiv Detail & Related papers (2024-03-05T07:17:18Z)
- Deep Graph-Level Clustering Using Pseudo-Label-Guided Mutual Information Maximization Network [31.38584638254226]
We study the problem of partitioning a set of graphs into different groups such that the graphs in the same group are similar while the graphs in different groups are dissimilar.
To solve the problem, we propose a novel method called Deep Graph-Level Clustering (DGLC).
Our DGLC achieves graph-level representation learning and graph-level clustering in an end-to-end manner.
arXiv Detail & Related papers (2023-02-05T12:28:08Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Semi-Supervised Hierarchical Graph Classification [54.25165160435073]
We study the node classification problem in the hierarchical graph where a 'node' is a graph instance.
We propose Hierarchical Graph Mutual Information (HGMI) and present a way to compute HGMI with theoretical guarantees.
We demonstrate the effectiveness of this hierarchical graph modeling and the proposed SEAL-CI method on text and social network data.
arXiv Detail & Related papers (2022-06-11T04:05:29Z)
- CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
arXiv Detail & Related papers (2022-05-30T13:20:26Z)
- Graph Self-supervised Learning with Accurate Discrepancy Learning [64.69095775258164]
We propose a framework that aims to learn the exact discrepancy between the original and the perturbed graphs, coined as Discrepancy-based Self-supervised LeArning (D-SLA).
We validate our method on various graph-related downstream tasks, including molecular property prediction, protein function prediction, and link prediction tasks, on which our model largely outperforms relevant baselines.
arXiv Detail & Related papers (2022-02-07T08:04:59Z)
- Group Contrastive Self-Supervised Learning on Graphs [101.45974132613293]
We study self-supervised learning on graphs using contrastive methods.
We argue that contrasting graphs in multiple subspaces enables graph encoders to capture more abundant characteristics.
arXiv Detail & Related papers (2021-07-20T22:09:21Z)
- Generating a Doppelganger Graph: Resembling but Distinct [5.618335078130568]
We propose an approach to generating a doppelganger graph that resembles a given one in many graph properties.
The approach is an orchestration of graph representation learning, generative adversarial networks, and graph realization algorithms.
arXiv Detail & Related papers (2021-01-23T22:08:27Z)
- Robust Hierarchical Graph Classification with Subgraph Attention [18.7475578342125]
We introduce the concept of subgraph attention for graphs.
We propose a graph classification algorithm called SubGattPool.
We show that SubGattPool is able to improve the state-of-the-art or remains competitive on multiple publicly available graph classification datasets.
arXiv Detail & Related papers (2020-07-19T10:03:06Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- MxPool: Multiplex Pooling for Hierarchical Graph Representation Learning [7.456657747472885]
We propose MxPool, which concurrently uses multiple graph convolution/pooling networks to build a hierarchical learning structure for graph representation learning tasks.
Our experiments on numerous graph classification benchmarks show that MxPool outperforms other state-of-the-art graph representation learning methods.
arXiv Detail & Related papers (2020-04-15T01:05:29Z)