SSSNET: Semi-Supervised Signed Network Clustering
- URL: http://arxiv.org/abs/2110.06623v1
- Date: Wed, 13 Oct 2021 10:36:37 GMT
- Title: SSSNET: Semi-Supervised Signed Network Clustering
- Authors: Yixuan He, Gesine Reinert, Songchao Wang, Mihai Cucuringu
- Abstract summary: We introduce a novel probabilistic balanced normalized cut loss for training nodes in a GNN framework for semi-supervised signed network clustering, called SSSNET.
The main novelty of our approach is a new take on the role of social balance theory for signed network embeddings.
- Score: 4.895808607591299
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Node embeddings are a powerful tool in the analysis of networks; yet, their
full potential for the important task of node clustering has not been fully
exploited. In particular, most state-of-the-art methods generating node
embeddings of signed networks focus on link sign prediction, and those that
pertain to node clustering are usually not graph neural network (GNN) methods.
Here, we introduce a novel probabilistic balanced normalized cut loss for
training nodes in a GNN framework for semi-supervised signed network
clustering, called SSSNET. The method is end-to-end in combining embedding
generation and clustering without an intermediate step; it has node clustering
as main focus, with an emphasis on polarization effects arising in networks.
The main novelty of our approach is a new take on the role of social balance
theory for signed network embeddings. The standard heuristic for justifying the
criteria for the embeddings hinges on the assumption that "an enemy's enemy is
a friend". Here, instead, a neutral stance is assumed on whether or not the
enemy of an enemy is a friend. Experimental results on various data sets,
including a synthetic signed stochastic block model, a polarized version of it,
and real-world data at different scales, demonstrate that SSSNET can achieve
comparable or better results than state-of-the-art spectral clustering methods,
for a wide range of noise and sparsity levels. SSSNET complements existing
methods through the possibility of including exogenous information, in the form
of node-level features or labels.
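The abstract does not spell the loss out, but its description (a probabilistic balanced normalized cut over soft cluster assignments, neutral on "an enemy's enemy is a friend") suggests a form that penalizes positive edges cut between clusters and negative edges kept within a cluster, volume-normalized. A minimal sketch of one plausible such loss, assuming soft assignments `P` (e.g. a softmax over GNN logits) and separate positive/negative adjacency matrices, might look like:

```python
import numpy as np

def prob_balanced_ncut(A_pos, A_neg, P, eps=1e-12):
    """Sketch of a probabilistic balanced normalized cut for a signed graph.

    A_pos, A_neg : (n, n) non-negative adjacencies holding the positive
                   and (absolute) negative edge weights.
    P            : (n, K) soft cluster-assignment matrix, rows summing
                   to 1 (e.g. softmax over GNN output logits).
    """
    D_pos = np.diag(A_pos.sum(axis=1))
    L_pos = D_pos - A_pos                         # Laplacian of positive part
    D_bar = np.diag((A_pos + A_neg).sum(axis=1))  # total signed degree
    loss = 0.0
    for k in range(P.shape[1]):
        p = P[:, k]
        # penalize positive edges cut across clusters (p^T L_pos p)
        # plus negative edges retained inside a cluster (p^T A_neg p)
        cut = p @ L_pos @ p + p @ A_neg @ p
        vol = p @ D_bar @ p + eps                 # volume normalization
        loss += cut / vol
    return loss
```

On a toy graph with positive ties inside two groups and negative ties across them, the correct soft partition attains a lower loss than a mismatched one, which is the behavior a clustering loss of this family should exhibit.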
Related papers
- Rethinking Independent Cross-Entropy Loss For Graph-Structured Data [41.92169850308025]
Graph neural networks (GNNs) have exhibited prominent performance in learning graph-structured data.
In this work, we propose a new framework, termed joint-cluster supervised learning, to model the joint distribution of each node with its corresponding cluster.
In this way, the data-label reference signals extracted from the local cluster explicitly strengthen the discrimination ability on the target node.
arXiv Detail & Related papers (2024-05-24T13:52:41Z)
- Applying Self-supervised Learning to Network Intrusion Detection for Network Flows with Graph Neural Network [8.318363497010969]
This paper studies the application of GNNs to identify the specific types of network flows in an unsupervised manner.
To the best of our knowledge, it is the first GNN-based self-supervised method for the multiclass classification of network flows in NIDS.
arXiv Detail & Related papers (2024-03-03T12:34:13Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Exact Recovery and Bregman Hard Clustering of Node-Attributed Stochastic Block Model [0.16385815610837165]
This paper presents an information-theoretic criterion for the exact recovery of community labels.
It shows how network and attribute information can be exchanged in order to have exact recovery.
It also presents an iterative clustering algorithm that maximizes the joint likelihood.
arXiv Detail & Related papers (2023-10-30T16:46:05Z)
- Domain-adaptive Message Passing Graph Neural Network [67.35534058138387]
Cross-network node classification (CNNC) aims to classify nodes in a label-deficient target network by transferring the knowledge from a source network with abundant labels.
We propose a domain-adaptive message passing graph neural network (DM-GNN), which integrates graph neural network (GNN) with conditional adversarial domain adaptation.
arXiv Detail & Related papers (2023-08-31T05:26:08Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks--CONN, a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Dink-Net: Neural Clustering on Large Graphs [59.10189693120368]
A deep graph clustering method (Dink-Net) is proposed with the idea of dilation and shrink.
By discriminating nodes, whether being corrupted by augmentations, representations are learned in a self-supervised manner.
The clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss.
Compared to the runner-up, Dink-Net achieves a 9.62% NMI improvement on the ogbn-papers100M dataset, which has 111 million nodes and 1.6 billion edges.
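The NMI (normalized mutual information) figure above measures agreement between a predicted clustering and the ground truth, invariant to relabelling of clusters. A self-contained sketch, using the geometric-mean normalization (other normalizations exist):

```python
import numpy as np

def nmi(labels_a, labels_b):
    """Normalized mutual information: I(A;B) / sqrt(H(A) * H(B)).

    Returns 1.0 for identical partitions (up to relabelling) and
    0.0 for statistically independent ones.
    """
    a, b = np.asarray(labels_a), np.asarray(labels_b)
    n = len(a)
    mi = 0.0
    for ca in np.unique(a):
        for cb in np.unique(b):
            n_ij = np.sum((a == ca) & (b == cb))  # joint count
            if n_ij == 0:
                continue
            n_i, n_j = np.sum(a == ca), np.sum(b == cb)
            mi += (n_ij / n) * np.log(n_ij * n / (n_i * n_j))

    def entropy(x):
        _, counts = np.unique(x, return_counts=True)
        p = counts / len(x)
        return -np.sum(p * np.log(p))

    h_a, h_b = entropy(a), entropy(b)
    if h_a == 0.0 or h_b == 0.0:  # degenerate single-cluster partition
        return 1.0 if h_a == h_b else 0.0
    return mi / np.sqrt(h_a * h_b)
```

For example, `nmi([0, 0, 1, 1], [1, 1, 0, 0])` is 1.0, since the two labelings induce the same partition.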
arXiv Detail & Related papers (2023-05-28T15:33:24Z)
- Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z)
- DIGRAC: Digraph Clustering with Flow Imbalance [5.5023982421074855]
We introduce a graph neural network framework with a novel Directed Mixed Path Aggregation scheme to obtain node embeddings for directed networks.
We show that our method attains state-of-the-art results on directed clustering, for a wide range of noise and sparsity levels, as well as graph structures.
arXiv Detail & Related papers (2021-06-09T16:33:13Z)
- ImGAGN: Imbalanced Network Embedding via Generative Adversarial Graph Networks [19.45752945234785]
Imbalanced classification on graphs is ubiquitous yet challenging in many real-world applications, such as fraudulent node detection.
We present a generative adversarial graph network model, called ImGAGN, to address the imbalanced classification problem on graphs.
We show that the proposed method ImGAGN outperforms state-of-the-art algorithms for semi-supervised imbalanced node classification task.
arXiv Detail & Related papers (2021-06-05T06:56:37Z)
- Graph Prototypical Networks for Few-shot Learning on Attributed Networks [72.31180045017835]
We propose a graph meta-learning framework -- Graph Prototypical Networks (GPN).
GPN is able to perform meta-learning on an attributed network and derive a highly generalizable model for handling the target classification task.
arXiv Detail & Related papers (2020-06-23T04:13:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.