Sheaf Neural Networks
- URL: http://arxiv.org/abs/2012.06333v1
- Date: Tue, 8 Dec 2020 01:19:48 GMT
- Title: Sheaf Neural Networks
- Authors: Jakob Hansen and Thomas Gebhart
- Abstract summary: We present a generalization of graph convolutional networks by generalizing the diffusion operation underlying this class of networks.
We show that the resulting sheaf neural networks can outperform graph convolutional networks in domains where relations between nodes are asymmetric and signed.
- Score: 2.4366811507669124
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a generalization of graph convolutional networks by generalizing the diffusion operation underlying this class of graph neural networks. These sheaf neural networks are based on the sheaf Laplacian, a generalization of the graph Laplacian that encodes additional relational structure parameterized by the underlying graph. The sheaf Laplacian and associated matrices provide an extended version of the diffusion operation in graph convolutional networks, providing a proper generalization for domains where relations between nodes are non-constant, asymmetric, and varying in dimension. We show that the resulting sheaf neural networks can outperform graph convolutional networks in domains where relations between nodes are asymmetric and signed.
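For intuition about the diffusion operation being generalized here, the sketch below (ours, not the authors' implementation; the toy graph, the stalk dimension d, and the random restriction maps are made-up assumptions) assembles a sheaf Laplacian L_F = delta^T delta from per-incidence restriction maps and takes one diffusion step.

```python
# A minimal sketch (not the authors' code) of sheaf diffusion on a graph.
# Assumptions: every node and edge stalk is R^d, and the restriction maps
# F[(v, e)] are arbitrary d x d matrices (here random, for illustration).
import numpy as np

rng = np.random.default_rng(0)

d = 2                                   # stalk dimension (assumed)
edges = [(0, 1), (1, 2), (2, 0)]        # a small oriented toy graph
n = 3                                   # number of nodes

# One restriction map per (node, edge) incidence.
F = {(v, e): rng.standard_normal((d, d))
     for e, (u, w) in enumerate(edges) for v in (u, w)}

# Coboundary delta: (delta x)_e = F_{w<e} x_w - F_{u<e} x_u for e = (u, w).
delta = np.zeros((d * len(edges), d * n))
for e, (u, w) in enumerate(edges):
    delta[d*e:d*(e+1), d*u:d*(u+1)] = -F[(u, e)]
    delta[d*e:d*(e+1), d*w:d*(w+1)] = F[(w, e)]

# Sheaf Laplacian: L_F = delta^T delta.
L = delta.T @ delta

# One step of sheaf diffusion on stacked node features x in R^{n*d}.
x = rng.standard_normal(d * n)
eps = 0.1
x_next = x - eps * (L @ x)              # x' = (I - eps * L_F) x
```

When every stalk is R^1 and every restriction map is the identity, delta becomes the signed incidence matrix and L_F collapses to the ordinary graph Laplacian, recovering the diffusion operation used by graph convolutional networks.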
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Generalization Error of Graph Neural Networks in the Mean-field Regime [10.35214360391282]
We explore two widely used types of graph neural networks: graph convolutional neural networks and message passing graph neural networks.
We derive upper bounds in the mean-field regime for the generalization error of these graph neural networks.
arXiv Detail & Related papers (2024-02-10T19:12:31Z)
- Topological Neural Networks: Mitigating the Bottlenecks of Graph Neural Networks via Higher-Order Interactions [1.994307489466967]
This work starts with a theoretical framework revealing the impact of a network's width, depth, and graph topology on over-squashing in message-passing neural networks.
It then turns to higher-order interactions and multi-relational inductive biases via Topological Neural Networks.
Inspired by Graph Attention Networks, two topological attention networks are proposed: Simplicial and Cell Attention Networks.
arXiv Detail & Related papers (2024-02-10T08:26:06Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- A Survey on Graph Classification and Link Prediction based on GNN [11.614366568937761]
This review article surveys graph convolutional neural networks.
It elaborates on the fundamentals of graph convolutional neural networks.
It elucidates graph neural network models based on attention mechanisms and autoencoders.
arXiv Detail & Related papers (2023-07-03T09:08:01Z)
- Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel [0.0]
Graph convolutional networks are a popular class of deep neural network algorithms.
Despite their success, graph convolutional networks exhibit a number of peculiar features, including a bias towards learning oversmoothed and homophilic functions.
We propose to bridge this gap by studying the neural tangent kernel of sheaf convolutional networks.
arXiv Detail & Related papers (2022-08-19T12:46:49Z)
- Transition to Linearity of General Neural Networks with Directed Acyclic Graph Architecture [20.44438519046223]
We show that feedforward neural networks corresponding to arbitrary directed acyclic graphs undergo transition to linearity as their "width" approaches infinity.
Our results identify the mathematical structure underlying transition to linearity and generalize a number of recent works aimed at characterizing transition to linearity or constancy of the Neural Tangent Kernel for standard architectures.
arXiv Detail & Related papers (2022-05-24T04:57:35Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive (a minimal sketch of the unrolling idea appears after this list).
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
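The Graph Deconvolution Network entry above describes unrolling truncated proximal gradient iterations into a network. Purely as a reading aid (our sketch, not the authors' code: the polynomial mixture model A_obs ~ h0*I + h1*S, the fixed step sizes, and the fixed thresholds are all assumptions; in the actual GDN such quantities are learned end to end), the following shows what one such unrolled iteration loop can look like.

```python
# A minimal sketch (our reading, not the GDN authors' code) of unrolling
# truncated proximal gradient iterations for graph deconvolution.
# Assumed model: the observed matrix is a simple polynomial mixture
# A_obs ~ h0*I + h1*S of a latent sparse graph S.
import numpy as np

def soft_threshold(X, t):
    """Proximal operator of the l1 norm: shrink entries toward zero."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def gdn_forward(A_obs, h, K, alphas, lams):
    """Unroll K proximal-gradient steps for
    min_S 0.5*||A_obs - (h[0]*I + h[1]*S)||_F^2 + lam*||S||_1."""
    n = A_obs.shape[0]
    S = np.zeros_like(A_obs)
    for k in range(K):
        residual = (h[0] * np.eye(n) + h[1] * S) - A_obs
        grad = h[1] * residual              # gradient of the data term
        S = soft_threshold(S - alphas[k] * grad, alphas[k] * lams[k])
    return S

# Toy usage: recover a sparse ring graph from a noisy mixture.
rng = np.random.default_rng(1)
n = 8
S_true = np.zeros((n, n))
for i in range(n):
    S_true[i, (i + 1) % n] = S_true[(i + 1) % n, i] = 1.0
A_obs = 0.5 * np.eye(n) + 0.8 * S_true + 0.05 * rng.standard_normal((n, n))

S_hat = gdn_forward(A_obs, h=(0.5, 0.8), K=20,
                    alphas=[0.5] * 20, lams=[0.1] * 20)
```

Each loop iteration plays the role of one layer of the unrolled network; making the step sizes and thresholds trainable parameters, and the mixing function more expressive, turns this fixed-point solver into the kind of parameterized architecture the summary describes.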