BScNets: Block Simplicial Complex Neural Networks
- URL: http://arxiv.org/abs/2112.06826v1
- Date: Mon, 13 Dec 2021 17:35:54 GMT
- Title: BScNets: Block Simplicial Complex Neural Networks
- Authors: Yuzhou Chen, Yulia R. Gel, H. Vincent Poor
- Abstract summary: Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
- Score: 79.81654213581977
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Simplicial neural networks (SNN) have recently emerged as the newest
direction in graph learning, expanding the idea of convolutional architectures
from node space to simplicial complexes on graphs. Instead of predominantly
assessing pairwise relations among nodes, as in current practice, simplicial
complexes allow us to describe higher-order interactions and multi-node graph
structures. By building upon the connection between the convolution operation
and the new block Hodge-Laplacian, we propose the first SNN for link
prediction. Our new Block Simplicial Complex Neural Networks (BScNets) model
generalizes existing graph convolutional network (GCN) frameworks by
systematically incorporating salient interactions among multiple higher-order
graph structures of different dimensions. We discuss the theoretical
foundations behind BScNets and illustrate its utility for link prediction on
eight real-world and synthetic datasets. Our experiments indicate that BScNets
outperforms state-of-the-art models by a significant margin while maintaining
low computation costs. Finally, we show the utility of BScNets as a promising
new alternative for tracking the spread of infectious diseases such as
COVID-19 and measuring the effectiveness of healthcare risk mitigation
strategies.
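The abstract's key ingredient is a convolution built on the Hodge-Laplacian of a simplicial complex. As a minimal illustration (not the authors' exact block formulation), the sketch below assembles the standard Hodge 1-Laplacian from node-edge and edge-triangle incidence matrices on a toy complex and applies one hypothetical convolution step to edge signals; the edge orientations, feature sizes, and activation are illustrative assumptions:

```python
import numpy as np

# Toy simplicial complex: a filled triangle {0,1,2} plus a dangling edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]  # 1-simplices, fixed orientations
triangles = [(0, 1, 2)]                    # 2-simplices
n_nodes = 4

# B1: node-to-edge incidence matrix (rows = nodes, cols = edges).
B1 = np.zeros((n_nodes, len(edges)))
for j, (u, v) in enumerate(edges):
    B1[u, j] = -1.0  # tail of the oriented edge
    B1[v, j] = 1.0   # head of the oriented edge

# B2: edge-to-triangle incidence matrix (rows = edges, cols = triangles).
B2 = np.zeros((len(edges), len(triangles)))
for k, (a, b, c) in enumerate(triangles):
    # Boundary of triangle (a,b,c) is +(b,c) - (a,c) + (a,b).
    for edge, sign in [((b, c), 1.0), ((a, c), -1.0), ((a, b), 1.0)]:
        B2[edges.index(edge), k] = sign

# Hodge 1-Laplacian: lower term (shared nodes) + upper term (shared triangles).
L1 = B1.T @ B1 + B2 @ B2.T

# One hypothetical simplicial convolution step over edge features.
rng = np.random.default_rng(0)
X = rng.standard_normal((len(edges), 2))  # 2 features per edge
W = rng.standard_normal((2, 2))           # illustrative weight matrix
H = np.tanh(L1 @ X @ W)                   # message passing along the complex

print(L1.shape, H.shape)
```

Because `L1` couples edges both through shared nodes (`B1.T @ B1`) and through shared triangles (`B2 @ B2.T`), a single step already mixes pairwise and higher-order structure, which is the intuition the abstract appeals to.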
Related papers
- Binarized Simplicial Convolutional Neural Networks [6.069611493148632]
We propose a novel neural network architecture on simplicial complexes named Binarized Simplicial Convolutional Neural Networks (Bi-SCNN)
Compared to the previous Simplicial Convolutional Neural Networks, the reduced model complexity of Bi-SCNN shortens the execution time without sacrificing the prediction performance.
Experiments with real-world citation and ocean-drifter data confirmed that our proposed Bi-SCNN is efficient and accurate.
arXiv Detail & Related papers (2024-05-07T08:05:20Z)
- A parameterised model for link prediction using node centrality and similarity measure based on graph embedding [5.507008181141738]
Link prediction is a key aspect of graph machine learning.
It involves predicting new links that may form between network nodes.
Existing models have significant shortcomings.
We present the Node Centrality and Similarity Based Model (NCSM), a novel method for link prediction tasks.
arXiv Detail & Related papers (2023-09-11T13:13:54Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN)
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Simplicial Attention Networks [4.401427499962144]
Simplicial Neural Networks (SNNs) naturally model interactions by performing message passing on simplicial complexes.
We propose Simplicial Attention Networks (SAT), a new type of simplicial network that dynamically weighs the interactions between neighbouring simplices.
We demonstrate that SAT outperforms existing convolutional SNNs and GNNs in two image and trajectory classification tasks.
arXiv Detail & Related papers (2022-04-20T13:41:50Z)
- Simplicial Neural Networks [0.0]
We present simplicial neural networks (SNNs)
SNNs are a generalization of graph neural networks to data that live on a class of topological spaces called simplicial complexes.
We test the SNNs on the task of imputing missing data on coauthorship complexes.
arXiv Detail & Related papers (2020-10-07T20:15:01Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN)
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.