GC-Flow: A Graph-Based Flow Network for Effective Clustering
- URL: http://arxiv.org/abs/2305.17284v1
- Date: Fri, 26 May 2023 22:11:38 GMT
- Title: GC-Flow: A Graph-Based Flow Network for Effective Clustering
- Authors: Tianchun Wang, Farzaneh Mirzazadeh, Xiang Zhang, Jie Chen
- Abstract summary: Graph convolutional networks (GCNs) are \emph{discriminative models} that directly model the class posterior $p(y|\mathbf{x})$ for semi-supervised classification of graph data.
In this work, we design normalizing flows that replace GCN layers, leading to a \emph{generative model} that models both the class conditional likelihood $p(\mathbf{x}|y)$ and the class prior $p(y)$.
The resulting neural network, GC-Flow, retains the graph convolution operations while being equipped with a Gaussian mixture representation space.
- Score: 10.354035049272095
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph convolutional networks (GCNs) are \emph{discriminative models} that
directly model the class posterior $p(y|\mathbf{x})$ for semi-supervised
classification of graph data. While being effective, as a representation
learning approach, the node representations extracted from a GCN often miss
useful information for effective clustering, because the objectives are
different. In this work, we design normalizing flows that replace GCN layers,
leading to a \emph{generative model} that models both the class conditional
likelihood $p(\mathbf{x}|y)$ and the class prior $p(y)$. The resulting neural
network, GC-Flow, retains the graph convolution operations while being equipped
with a Gaussian mixture representation space. It enjoys two benefits: it not
only maintains the predictive power of GCN, but also produces well-separated
clusters, due to the structuring of the representation space. We demonstrate
these benefits on a variety of benchmark data sets. Moreover, we show that
additional parameterization, such as that on the adjacency matrix used for
graph convolutions, yields additional improvement in clustering.
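To make the idea above concrete, here is a minimal, hypothetical PyTorch sketch of a GC-Flow-style network (not the authors' implementation): each layer applies a graph convolution followed by an invertible RealNVP-style affine coupling, and a class-conditional Gaussian mixture on the resulting representation yields $p(\mathbf{x}|y)$ and, together with a learned prior $p(y)$, the posterior used for classification. For simplicity the sketch omits the Jacobian contribution of the graph convolution itself, which the paper accounts for; names such as `GCFlowSketch` and `AffineCoupling` are illustrative.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class AffineCoupling(nn.Module):
    """RealNVP-style affine coupling: an invertible per-node map with a tractable log-determinant."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.split = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.split, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.split)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.split], x[:, self.split:]
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                                  # bound the scales for stability
        z2 = x2 * torch.exp(s) + t
        return torch.cat([x1, z2], dim=-1), s.sum(dim=-1)  # transformed features, log|det J|


class GCFlowSketch(nn.Module):
    """Graph convolution + flow with a class-conditional Gaussian base distribution.
    Gives p(x|y) up to the graph-convolution Jacobian, which this sketch omits."""
    def __init__(self, dim, num_classes, num_layers=2):
        super().__init__()
        self.couplings = nn.ModuleList([AffineCoupling(dim) for _ in range(num_layers)])
        self.means = nn.Parameter(torch.randn(num_classes, dim))  # Gaussian mixture means
        self.log_prior = nn.Parameter(torch.zeros(num_classes))   # class prior p(y)

    def forward(self, x, adj_norm):
        # x: node features [n, d]; adj_norm: normalized adjacency [n, n]
        z, log_det = x, torch.zeros(x.size(0), device=x.device)
        for coupling in self.couplings:
            z = adj_norm @ z                                # graph convolution (feature propagation)
            z, ld = coupling(z)                             # invertible flow step
            log_det = log_det + ld
        # log p(z|y) under unit-variance Gaussians centered at the class means
        diff = z.unsqueeze(1) - self.means.unsqueeze(0)     # [n, C, d]
        log_pz_y = -0.5 * (diff ** 2).sum(-1) - 0.5 * z.size(1) * math.log(2 * math.pi)
        log_px_y = log_pz_y + log_det.unsqueeze(1)          # change of variables
        log_joint = log_px_y + F.log_softmax(self.log_prior, dim=0)
        return F.log_softmax(log_joint, dim=1), z           # log p(y|x) and the structured representation
```

Under these assumptions, training would maximize the joint log-likelihood on labeled nodes, and clusters are then read directly off the Gaussian-mixture-structured representation z rather than obtained by post-hoc clustering of GCN features.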
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Dual Contrastive Attributed Graph Clustering Network [6.796682703663566]
We propose a generic framework called the Dual Contrastive Attributed Graph Clustering Network (DCAGC).
In DCAGC, the Neighborhood Contrast Module maximizes the similarity of neighboring nodes and improves the quality of the node representations.
All modules of DCAGC are trained and optimized in a unified framework, so the learned node representations contain clustering-oriented information.
arXiv Detail & Related papers (2022-06-16T03:17:01Z)
- Shift-Robust Node Classification via Graph Adversarial Clustering [43.62586751992269]
Graph Neural Networks (GNNs) are the de facto node classification models for graph-structured data.
At test time, these algorithms assume no data shift.
We propose Shift-Robust Node Classification (SRNC) to address these limitations.
arXiv Detail & Related papers (2022-03-07T18:13:21Z)
- AnchorGAE: General Data Clustering via $O(n)$ Bipartite Graph Convolution [79.44066256794187]
We show how to convert a non-graph dataset into a graph by introducing a generative graph model, which is then used to build graph convolutional networks (GCNs).
A bipartite graph constructed from anchors is updated dynamically to exploit the high-level information behind the data.
We theoretically prove that this simple update leads to degeneration, and a specific strategy is designed accordingly.
arXiv Detail & Related papers (2021-11-12T07:08:13Z)
- CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z)
- CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning [19.432449825536423]
Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision.
We present a novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques.
arXiv Detail & Related papers (2020-09-03T13:57:18Z)
- Pseudoinverse Graph Convolutional Networks: Fast Filters Tailored for Large Eigengaps of Dense Graphs and Hypergraphs [0.0]
Graph Convolutional Networks (GCNs) have proven to be successful tools for semi-supervised classification on graph-based datasets.
We propose a new GCN variant whose three-part filter space is targeted at dense graphs.
arXiv Detail & Related papers (2020-08-03T08:48:41Z)
- Graph Highway Networks [77.38665506495553]
Graph Convolutional Networks (GCNs) are widely used for learning graph representations due to their effectiveness and efficiency.
They suffer from the notorious over-smoothing problem, in which the learned representations converge to similar vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet), which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
- Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z)
- Revisiting Graph based Collaborative Filtering: A Linear Residual Graph Convolutional Network Approach [55.44107800525776]
Graph Convolutional Networks (GCNs) are state-of-the-art graph-based representation learning models.
In this paper, we revisit GCN-based Collaborative Filtering (CF) Recommender Systems (RS).
We show that removing non-linearities enhances recommendation performance, consistent with the theory behind simple graph convolutional networks.
We propose a residual network structure that is specifically designed for CF with user-item interaction modeling (a minimal sketch of this kind of linear residual propagation follows this list).
arXiv Detail & Related papers (2020-01-28T04:41:25Z)
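As a rough illustration of the linear residual idea in the last entry, the sketch below (hypothetical code, not the paper's released implementation) propagates user and item embeddings over a normalized interaction graph without non-linearities or per-layer weight matrices, sums the residual output of every layer, and scores preferences with a dot product. The class name `LinearResidualGCF` and the inputs `adj_norm`, `user_ids`, and `item_ids` are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class LinearResidualGCF(nn.Module):
    """Illustrative linear residual graph convolution for collaborative filtering."""
    def __init__(self, n_users, n_items, dim=64, num_layers=3):
        super().__init__()
        self.n_users = n_users
        self.num_layers = num_layers
        self.embed = nn.Embedding(n_users + n_items, dim)  # stacked user/item embeddings
        nn.init.normal_(self.embed.weight, std=0.01)

    def propagate(self, adj_norm):
        # adj_norm: normalized user-item bipartite adjacency, shape [n_users+n_items, n_users+n_items]
        e = self.embed.weight
        out = e
        for _ in range(self.num_layers):
            e = adj_norm @ e          # linear propagation: no activation, no weight matrix
            out = out + e             # residual sum across layers
        return out / (self.num_layers + 1)

    def score(self, user_ids, item_ids, adj_norm):
        emb = self.propagate(adj_norm)
        users, items = emb[:self.n_users], emb[self.n_users:]
        return (users[user_ids] * items[item_ids]).sum(-1)  # dot-product preference score
```

Averaging the residual outputs of all layers is one simple way to keep early and late propagation signals; the paper's own residual design may differ in detail.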
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.