MagNet: A Magnetic Neural Network for Directed Graphs
- URL: http://arxiv.org/abs/2102.11391v1
- Date: Mon, 22 Feb 2021 22:40:57 GMT
- Title: MagNet: A Magnetic Neural Network for Directed Graphs
- Authors: Xitong Zhang and Nathan Brugnone and Michael Perlmutter and Matthew Hirn
- Abstract summary: MagNet is a spectral GNN for directed graphs based on a complex Hermitian matrix known as the magnetic Laplacian.
We show that MagNet's performance exceeds other spectral GNNs on directed graph node classification and link prediction tasks.
- Score: 3.5557219875516655
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prevalence of graph-based data has spurred the rapid development of graph
neural networks (GNNs) and related machine learning algorithms. Yet, despite
the many data sets naturally modeled as directed graphs, including citation,
website, and traffic networks, the vast majority of this research focuses on
undirected graphs. In this paper, we propose MagNet, a spectral GNN for
directed graphs based on a complex Hermitian matrix known as the magnetic
Laplacian. This matrix encodes undirected geometric structure in the magnitude
of its entries and directional information in the phase of its entries. A
"charge" parameter attunes spectral information to variation among directed
cycles. We show that MagNet's performance exceeds other spectral GNNs on
directed graph node classification and link prediction tasks for a variety of
datasets and exceeds commonly used spatial GNNs on a majority of such tasks. The
underlying principles of MagNet are such that it can be adapted to other
spectral GNN architectures.
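For concreteness, the normalized magnetic Laplacian described in the abstract can be assembled in a few lines, with symmetrized magnitudes carrying the undirected structure and an antisymmetric phase carrying direction. A minimal NumPy sketch; the toy digraph and the choice q = 0.25 are illustrative:

```python
import numpy as np

def magnetic_laplacian(A, q=0.25, normalized=True):
    """Magnetic Laplacian of a digraph with adjacency matrix A.

    Entry magnitudes carry the undirected structure; entry phases
    carry edge direction, scaled by the "charge" parameter q.
    """
    A = np.asarray(A, dtype=float)
    A_s = 0.5 * (A + A.T)                    # symmetrized adjacency (magnitudes)
    theta = 2.0 * np.pi * q * (A - A.T)      # antisymmetric phase matrix
    H = A_s * np.exp(1j * theta)             # complex Hermitian "magnetic" adjacency
    d = A_s.sum(axis=1)
    if not normalized:
        return np.diag(d) - H
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    D = np.diag(d_inv_sqrt)
    return np.eye(len(A)) - D @ H @ D        # I - D^(-1/2) H D^(-1/2)

# Toy 3-cycle digraph 0 -> 1 -> 2 -> 0; q = 0.25 puts a +/- pi/2 phase on each edge.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
L = magnetic_laplacian(A, q=0.25)
assert np.allclose(L, L.conj().T)            # Hermitian, hence real eigenvalues
print(np.linalg.eigvalsh(L))
```

Because the matrix is Hermitian, its eigenvalues are real and the usual spectral convolution machinery carries over, which is what lets MagNet reuse spectral GNN designs on digraphs.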
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
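As a rough illustration of selection over spectral embeddings (a generic stand-in, not the SGGC algorithm itself), one can embed nodes with Laplacian eigenvectors and pick representatives greedily:

```python
import numpy as np

def spectral_embedding(A, k=8):
    """Node embeddings from the k smallest nontrivial eigenvectors
    of the symmetric normalized Laplacian (dense, for clarity)."""
    d = A.sum(axis=1).astype(float)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L)              # eigenvalues in ascending order
    return vecs[:, 1:k + 1]                  # drop the trivial eigenvector

def greedy_coreset(Z, m):
    """Farthest-point greedy selection of m rows of embedding matrix Z."""
    chosen = [int(np.linalg.norm(Z, axis=1).argmax())]
    dists = np.linalg.norm(Z - Z[chosen[0]], axis=1)
    for _ in range(m - 1):
        nxt = int(dists.argmax())            # node farthest from the coreset so far
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(Z - Z[nxt], axis=1))
    return chosen

# Usage: pick 32 representative nodes whose ego-graphs would form the coreset.
# idx = greedy_coreset(spectral_embedding(A, k=8), m=32)
```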
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
arXiv Detail & Related papers (2022-10-27T16:00:45Z)
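A minimal sketch of the growing-graph setup, assuming a hand-picked graphon w and size schedule (the transfer procedure itself is the paper's and is elided):

```python
import numpy as np

def sample_graphon(w, n, rng):
    """Sample an n-node undirected graph from a graphon w: [0,1]^2 -> [0,1]."""
    u = rng.uniform(size=n)                      # latent node positions
    P = w(u[:, None], u[None, :])                # pairwise edge probabilities
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)                            # no self-loops
    return A + A.T                               # symmetrize

rng = np.random.default_rng(0)
w = lambda x, y: 0.8 * np.exp(-3.0 * np.abs(x - y))   # illustrative graphon
for n in (64, 128, 256, 512):                    # growing graph sequence
    A = sample_graphon(w, n, rng)
    # ...train / fine-tune the same GNN weights on each sampled graph...
    print(n, A.sum() / (n * (n - 1)))            # edge density stabilizes with n
```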
- A Magnetic Framelet-Based Convolutional Neural Network for Directed Graphs [33.36530820082491]
We introduce Framelet-MagNet, a framelet-based spectral GCNN for directed graphs (digraphs).
The model applies the framelet transform to digraph signals to form a more sophisticated representation for filtering.
We empirically validate the predictive power of Framelet-MagNet over a range of state-of-the-art models in node classification, link prediction, and denoising.
arXiv Detail & Related papers (2022-10-20T03:37:29Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In summary, we treat the manifold model as the limit of large graphs when constructing MNNs, and we recover graph neural networks by discretizing MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
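A minimal sketch of the discretization direction, assuming a graph Laplacian L as the stand-in for the Laplace-Beltrami operator and a ReLU as the point-wise nonlinearity:

```python
import numpy as np

def manifold_conv_layer(L, X, H):
    """One discretized 'manifold convolution' layer.

    L : (n, n) graph Laplacian standing in for the Laplace-Beltrami operator
    X : (n, f_in) signal sampled at n manifold points
    H : (K, f_in, f_out) filter-bank taps h_k
    """
    Z = np.zeros((X.shape[0], H.shape[2]))
    Lk_X = X.copy()
    for k in range(H.shape[0]):
        Z += Lk_X @ H[k]                     # accumulate h_k * L^k X
        Lk_X = L @ Lk_X                      # next diffusion power of the signal
    return np.maximum(Z, 0.0)                # point-wise nonlinearity (ReLU)
```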
- DiffWire: Inductive Graph Rewiring via the Lovász Bound [1.0323063834827415]
Graph Neural Networks (GNNs) have been shown to achieve competitive results on graph-related tasks.
Message-passing neural networks (MPNNs) have been reported to suffer from over-smoothing, over-squashing and under-reaching.
We propose DiffWire, a novel framework for graph rewiring in MPNNs that is principled, fully differentiable and parameter-free.
arXiv Detail & Related papers (2022-06-15T08:22:07Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
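The band-pass half of the hybrid can be sketched with standard geometric-scattering wavelets built from a lazy random walk; how these are combined with the GCN filters is the paper's contribution and is only hinted at here:

```python
import numpy as np

def lazy_walk(A):
    """Lazy random-walk matrix P = (I + A D^{-1}) / 2."""
    d = A.sum(axis=0).astype(float)
    d[d == 0] = 1.0
    return 0.5 * (np.eye(len(A)) + A / d)    # divide column j by degree d_j

def scattering_bands(A, X, J=3):
    """Apply band-pass wavelets Psi_j = P^(2^(j-1)) - P^(2^j) to features X."""
    P = lazy_walk(A)
    P_pow = P                                # P^(2^(j-1)), starting at P^1
    Pk_X = P @ X
    outs = []
    for _ in range(J):
        P_pow = P_pow @ P_pow                # square the power: now P^(2^j)
        P2k_X = P_pow @ X
        outs.append(np.abs(Pk_X - P2k_X))    # |Psi_j X|: modulus nonlinearity
        Pk_X = P2k_X
    return outs                              # to be combined with low-pass GCN features
```

Each wavelet isolates one dyadic band of diffusion scales, which is what gives the hybrid its band-pass complement to the low-pass GCN filters.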
- Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
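A common concrete choice for such PEs, often used to initialize the learnable variant, is the k smallest nontrivial eigenvectors of the normalized Laplacian; k = 4 and the sign convention below are illustrative:

```python
import numpy as np

def laplacian_pe(A, k=4):
    """k smallest nontrivial normalized-Laplacian eigenvectors as positional encodings."""
    d = A.sum(axis=1).astype(float)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L)              # eigenvalues in ascending order
    pe = vecs[:, 1:k + 1]                    # drop the trivial eigenvector
    pe *= np.where(pe.sum(axis=0) >= 0, 1.0, -1.0)  # resolve eigenvector sign ambiguity
    return pe

# Injection at the input layer: concatenate PEs to the raw node features.
# X_in = np.concatenate([X, laplacian_pe(A, k=4)], axis=1)
```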
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed the Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
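On a ring graph the generalization is exact: with the cyclic shift as the graph shift operator, a polynomial graph filter coincides with circular 1-D convolution. A small numerical check (taps and length are arbitrary):

```python
import numpy as np

n = 8
S = np.roll(np.eye(n), 1, axis=0)            # cyclic shift: (S @ x)[i] = x[i - 1]
h = np.array([0.5, 0.3, 0.2])                # taps of one filter in the bank
x = np.random.default_rng(1).normal(size=n)

# Graph convolution y = sum_k h_k S^k x; on the ring this is exactly circular
# convolution, so the graph filter reproduces an ordinary CNN layer.
y_graph = sum(h[k] * np.linalg.matrix_power(S, k) @ x for k in range(len(h)))
y_cnn = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, n)))
assert np.allclose(y_graph, y_cnn)
```

A bank of such filters simply stacks several tap vectors h, one per output feature, which is the layer structure the paper analyzes.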
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
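The permutation-equivariance property can be checked numerically for a polynomial graph filter, assuming an arbitrary random graph and filter taps:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10
A = (rng.uniform(size=(n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # random undirected graph
x = rng.normal(size=n)
h = np.array([0.4, 0.3, 0.2, 0.1])           # filter taps

def graph_filter(S, x, h):
    """Polynomial graph filter y = sum_k h_k S^k x."""
    y, Skx = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Skx
        Skx = S @ Skx
    return y

P = np.eye(n)[rng.permutation(n)]            # random permutation matrix
# Equivariance: filtering the relabeled graph equals relabeling the output.
lhs = graph_filter(P @ A @ P.T, P @ x, h)
rhs = P @ graph_filter(A, x, h)
assert np.allclose(lhs, rhs)
```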