Generative Models and Learning Algorithms for Core-Periphery Structured
Graphs
- URL: http://arxiv.org/abs/2210.01489v1
- Date: Tue, 4 Oct 2022 09:44:09 GMT
- Title: Generative Models and Learning Algorithms for Core-Periphery Structured
Graphs
- Authors: Sravanthi Gurugubelli and Sundeep Prabhakar Chepuri
- Abstract summary: We focus on learning the core scores of a graph from its node attributes and connectivity structure.
We develop algorithms for inferring the model parameters and core scores of a graph when both the graph structure and node attributes are available.
- Score: 13.90938823562779
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider core-periphery structured graphs, that is, graphs with a group
of densely connected nodes (the core) and a group of sparsely connected nodes (the
periphery). The so-called core score of a node is related to the
likelihood of it being a core node. In this paper, we focus on learning the
core scores of a graph from its node attributes and connectivity structure. To
this end, we propose two classes of probabilistic graphical models: affine and
nonlinear. First, we describe affine generative models that capture the dependence
of node attributes on the nodes' core scores, which in turn determine the graph structure.
Next, we discuss nonlinear generative models in which the partial correlations
of node attributes influence the graph structure through latent core scores. We
develop algorithms for inferring the model parameters and core scores of a
graph when both the graph structure and node attributes are available. When
only the node attributes of graphs are available, we jointly learn a
core-periphery structured graph and its core scores. We provide results from
numerical experiments on several synthetic and real-world datasets to
demonstrate the efficacy of the developed models and algorithms.
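
To make these ideas concrete, the following is a minimal NumPy sketch of a core-periphery generative model and core-score inference. It is only an illustration under assumed forms, not the paper's actual models or algorithms: the logistic edge model p_ij = sigmoid(c_i + c_j), the linear attribute map, and the gradient-ascent routine are assumptions made for this sketch.

```python
# Minimal sketch (NumPy only) of a core-periphery generative model and
# core-score inference. Assumed, illustrative forms, not the paper's models:
#   edge model:      p_ij = sigmoid(c_i + c_j)     (higher core scores -> denser connections)
#   attribute model: x_i  = a * c_i + b + noise    (affine dependence on the core score)
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sample_graph(core_scores):
    """Sample a symmetric adjacency matrix whose core-core blocks are dense
    and periphery-periphery blocks are sparse."""
    n = core_scores.size
    p = sigmoid(core_scores[:, None] + core_scores[None, :])
    upper = np.triu(rng.random((n, n)) < p, k=1)
    return (upper | upper.T).astype(float)

def sample_attributes(core_scores, dim=5, noise=0.1):
    """Node attributes as an (assumed) affine function of the core scores."""
    a = rng.normal(size=dim)   # slope of the affine map
    b = rng.normal(size=dim)   # offset
    eps = noise * rng.normal(size=(core_scores.size, dim))
    return core_scores[:, None] * a + b + eps

def infer_core_scores(adj, n_iter=500, lr=0.05):
    """Estimate core scores by gradient ascent on the Bernoulli log-likelihood
    of the observed edges under p_ij = sigmoid(c_i + c_j)."""
    n = adj.shape[0]
    c = np.zeros(n)
    mask = 1.0 - np.eye(n)                       # ignore self-loops
    for _ in range(n_iter):
        p = sigmoid(c[:, None] + c[None, :])
        grad = ((adj - p) * mask).sum(axis=1)    # d(log-likelihood) / d c_i
        c += lr * grad / n
    return c

# Toy example: 20 core nodes with high scores, 80 periphery nodes with low scores.
true_c = np.concatenate([rng.normal(1.5, 0.2, 20), rng.normal(-2.0, 0.2, 80)])
A = sample_graph(true_c)
X = sample_attributes(true_c)   # attributes, available alongside (or instead of) A
c_hat = infer_core_scores(A)
print("correlation(true, estimated):", np.corrcoef(true_c, c_hat)[0, 1])
```

In this toy setup the estimated core scores correlate strongly with the ground-truth scores, which is the qualitative behaviour the generative models above are meant to capture.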
Related papers
- Saliency-Aware Regularized Graph Neural Network [39.82009838086267]
We propose the Saliency-Aware Regularized Graph Neural Network (SAR-GNN) for graph classification.
We first estimate the global node saliency by measuring the semantic similarity between the compact graph representation and node features.
Then the learned saliency distribution is leveraged to regularize the neighborhood aggregation of the backbone.
arXiv Detail & Related papers (2024-01-01T13:44:16Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it complements the edge features conditionally on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- Boosting Graph Structure Learning with Dummy Nodes [41.83708114701956]
We extend graph kernels and graph neural networks with dummy nodes and conduct experiments on graph classification and subgraph isomorphism matching tasks.
We prove that such a dummy node can help build an efficient monomorphic edge-to-vertex transform and an epimorphic inverse to recover the original graph.
arXiv Detail & Related papers (2022-06-17T05:44:24Z)
- Graph Attention Retrospective [14.52271219759284]
Graph-based learning is a rapidly growing sub-field of machine learning with applications in social networks, citation networks, and bioinformatics.
In this paper, we theoretically study the behaviour of graph attention networks.
We show that in an "easy" regime, where node features follow class-conditional Gaussians whose means are sufficiently far apart, graph attention is able to distinguish inter-class from intra-class edges (a toy numerical illustration of this regime appears after this list).
In the "hard" regime, we show that every attention mechanism fails to distinguish intra-class from inter-class edges.
arXiv Detail & Related papers (2022-02-26T04:58:36Z)
- Learning Sparse Graphs with a Core-periphery Structure [14.112444998191698]
We propose a generative model for data associated with core-periphery structured networks.
We infer a sparse graph and nodal core scores that induce dense (sparse) connections in core (peripheral) parts of the network.
arXiv Detail & Related papers (2021-10-08T10:41:30Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model effectively improves performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve strong performance on various learning tasks over geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
- Convolutional Kernel Networks for Graph-Structured Data [37.13712126432493]
We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods.
Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps.
Our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks.
arXiv Detail & Related papers (2020-03-11T09:44:03Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
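
As referenced in the Graph Attention Retrospective entry above, here is a small, self-contained toy illustration of the "easy" versus "hard" regimes. It is a sketch under assumptions, not the attention architecture analyzed in that paper: node features are one-dimensional class-conditional Gaussians, random node pairs stand in for graph edges, and the attention weight is an assumed similarity-based score rather than a learned graph-attention mechanism.

```python
# Toy illustration of the "easy" vs "hard" regimes for attention over a
# two-class Gaussian feature model. Assumptions for this sketch (not the
# paper's exact setting): 1-D class-conditional Gaussian features, random node
# pairs standing in for graph edges, and a similarity-based attention score
# instead of a learned graph-attention mechanism.
import numpy as np

rng = np.random.default_rng(1)

def attention_gap(mean_distance, n=200, sigma=1.0, n_pairs=2000):
    """Mean attention weight on intra-class pairs minus inter-class pairs."""
    labels = rng.integers(0, 2, n)
    means = np.array([-mean_distance / 2.0, mean_distance / 2.0])
    x = means[labels] + sigma * rng.normal(size=n)          # node features
    i = rng.integers(0, n, n_pairs)                         # random "edges"
    j = rng.integers(0, n, n_pairs)
    score = -np.abs(x[i] - x[j])                            # similar features -> higher score
    att = 1.0 / (1.0 + np.exp(-score))                      # squash to (0, 1)
    intra = att[labels[i] == labels[j]].mean()
    inter = att[labels[i] != labels[j]].mean()
    return intra - inter

for d in [0.0, 1.0, 5.0, 10.0]:
    print(f"mean distance {d:4.1f}: intra-minus-inter attention gap = {attention_gap(d):.3f}")
```

When the class means coincide (distance 0, the "hard" regime) the intra-minus-inter gap is near zero, and it grows as the means move apart, mirroring the separation claim summarized above.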