Meta-Aggregator: Learning to Aggregate for 1-bit Graph Neural Networks
- URL: http://arxiv.org/abs/2109.12872v1
- Date: Mon, 27 Sep 2021 08:50:37 GMT
- Title: Meta-Aggregator: Learning to Aggregate for 1-bit Graph Neural Networks
- Authors: Yongcheng Jing, Yiding Yang, Xinchao Wang, Mingli Song, Dacheng Tao
- Abstract summary: We develop a vanilla 1-bit framework that binarizes both the GNN parameters and the graph features.
Despite the lightweight architecture, we observed that this vanilla framework suffered from insufficient discriminative power in distinguishing graph topologies.
This discovery motivates us to devise meta aggregators to improve the expressive power of vanilla binarized GNNs.
- Score: 127.32203532517953
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study a novel meta aggregation scheme towards binarizing
graph neural networks (GNNs). We begin by developing a vanilla 1-bit GNN
framework that binarizes both the GNN parameters and the graph features.
Despite the lightweight architecture, we observed that this vanilla framework
suffered from insufficient discriminative power in distinguishing graph
topologies, leading to a dramatic drop in performance. This discovery motivates
us to devise meta aggregators to improve the expressive power of vanilla
binarized GNNs, of which the aggregation schemes can be adaptively changed in a
learnable manner based on the binarized features. Towards this end, we propose
two dedicated forms of meta neighborhood aggregators: an exclusive meta
aggregator termed the Greedy Gumbel Neighborhood Aggregator (GNA), and a
diffused meta aggregator termed the Adaptable Hybrid Neighborhood Aggregator
(ANA). GNA learns to exclusively pick one single optimal aggregator from a pool
of candidates, while ANA learns a hybrid aggregation behavior to simultaneously
retain the benefits of several individual aggregators. Furthermore, the
proposed meta aggregators may readily serve as a generic plug-in module for
existing full-precision GNNs. Experiments across various domains demonstrate
that the proposed method yields results superior to the state of the art.
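To make the scheme concrete, below is a minimal PyTorch sketch of how the abstract reads: weights and features are binarized with a straight-through sign estimator, and a small pool of candidate aggregators is either collapsed to a single choice via a hard Gumbel sample (GNA-style) or blended with learned soft weights (ANA-style). The candidate pool, layer structure, and estimator details are assumptions for illustration, not the authors' released implementation.

```python
# Illustrative sketch only (assumptions noted above), not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def binarize(x):
    """Sign binarization with a straight-through estimator for the backward pass."""
    return x + (torch.sign(x) - x).detach()


def aggregate(h, adj, mode):
    """Candidate neighborhood aggregators over a dense adjacency matrix."""
    if mode == "sum":
        return adj @ h
    if mode == "mean":
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return (adj @ h) / deg
    if mode == "max":
        # for each node, take an element-wise max over its neighbors' features
        expanded = h.unsqueeze(0).expand(adj.size(0), -1, -1)  # (N, N, D)
        masked = torch.where(adj.unsqueeze(-1) > 0, expanded,
                             torch.full_like(expanded, float("-inf")))
        return masked.max(dim=1).values
    raise ValueError(mode)


class MetaBinaryGNNLayer(nn.Module):
    def __init__(self, in_dim, out_dim, exclusive=True):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_dim, out_dim) * 0.1)
        self.agg_logits = nn.Parameter(torch.zeros(3))  # scores for sum / mean / max
        self.exclusive = exclusive  # True: GNA-style pick-one, False: ANA-style mixture

    def forward(self, x, adj):
        h = binarize(x) @ binarize(self.weight)  # 1-bit features and 1-bit weights
        cands = torch.stack([aggregate(h, adj, m) for m in ("sum", "mean", "max")])
        if self.exclusive:
            # GNA-style: a hard (one-hot) Gumbel sample exclusively picks one aggregator
            w = F.gumbel_softmax(self.agg_logits, tau=1.0, hard=True)
        else:
            # ANA-style: a learned soft mixture retains several aggregators at once
            w = torch.softmax(self.agg_logits, dim=0)
        return torch.einsum("k,knd->nd", w, cands)


# Toy usage: 5 nodes, 8 input features; self-loops keep every neighborhood non-empty.
x = torch.randn(5, 8)
adj = ((torch.rand(5, 5) > 0.5).float() + torch.eye(5)).clamp(max=1.0)
layer = MetaBinaryGNNLayer(8, 16, exclusive=True)
print(layer(x, adj).shape)  # torch.Size([5, 16])
```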
Related papers
- Pre-trained Graphformer-based Ranking at Web-scale Search (Extended Abstract) [56.55728466130238]
We introduce the novel MPGraf model, which aims to integrate the regression capabilities of Transformers with the link prediction strengths of GNNs.
We conduct extensive offline and online experiments to rigorously evaluate the performance of MPGraf.
arXiv Detail & Related papers (2024-09-25T03:33:47Z)
- HAGNN: Hybrid Aggregation for Heterogeneous Graph Neural Networks [15.22198175691658]
Heterogeneous graph neural networks (GNNs) have been successful in handling heterogeneous graphs.
Recent work pointed out that a simple homogeneous graph model without meta-paths can also achieve comparable results.
We propose HAGNN, a novel framework that comprehensively exploits the rich type-semantic information in heterogeneous graphs.
arXiv Detail & Related papers (2023-07-04T10:40:20Z)
- Complete the Missing Half: Augmenting Aggregation Filtering with Diversification for Graph Convolutional Neural Networks [46.14626839260314]
We show that the aggregation operations in current Graph Neural Networks (GNNs) are potentially a problematic factor underlying all GNN models for learning on certain datasets.
We augment the aggregation operations with their dual, i.e., diversification operators that make nodes more distinct and preserve their identity.
Such augmentation replaces the aggregation with a two-channel filtering process that, in theory, is beneficial for enriching the node representations.
In the experiments, we observe the desired characteristics of the models and significant performance boosts over the baselines on 9 node classification tasks.
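As a rough illustration of the two-channel idea described above (our reading, not the authors' code; the diversification channel is realized here as a simple high-pass filter I - A_hat):

```python
import torch
import torch.nn as nn


class TwoChannelLayer(nn.Module):
    """Aggregation (low-pass) channel plus a diversification (high-pass) dual channel."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_agg = nn.Linear(in_dim, out_dim)  # smoothing channel: pulls neighbors together
        self.w_div = nn.Linear(in_dim, out_dim)  # diversification channel: keeps nodes distinct

    def forward(self, x, a_hat):
        # a_hat: normalized adjacency with self-loops, shape (N, N)
        identity = torch.eye(a_hat.size(0), device=x.device)
        smooth = a_hat @ self.w_agg(x)
        sharpen = (identity - a_hat) @ self.w_div(x)
        return torch.relu(smooth + sharpen)
```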
arXiv Detail & Related papers (2022-12-21T07:24:03Z)
- Learnable Commutative Monoids for Graph Neural Networks [0.0]
Graph neural networks (GNNs) are highly sensitive to the choice of aggregation function.
We show that GNNs equipped with recurrent aggregators are competitive with state-of-the-art permutation-invariant aggregators.
We propose a framework for constructing learnable, commutative, associative binary operators.
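A toy sketch of aggregation with a learned binary operator, as we read the entry above. The paper constructs operators that are commutative and associative by design; the plain MLP below is neither, and is meant only to illustrate the balanced-tree reduction interface.

```python
import torch
import torch.nn as nn


class BinaryOpAggregator(nn.Module):
    """Fold a set of neighbor embeddings with a learned binary operator."""

    def __init__(self, dim):
        super().__init__()
        self.op = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, neighbors):  # neighbors: tensor of shape (k, dim)
        items = list(neighbors)
        while len(items) > 1:  # balanced-tree reduction: O(log k) sequential depth
            nxt = []
            for i in range(0, len(items) - 1, 2):
                nxt.append(self.op(torch.cat([items[i], items[i + 1]], dim=-1)))
            if len(items) % 2 == 1:
                nxt.append(items[-1])
            items = nxt
        return items[0]


# Toy usage: aggregate 7 neighbor embeddings of dimension 16.
agg = BinaryOpAggregator(16)
print(agg(torch.randn(7, 16)).shape)  # torch.Size([16])
```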
arXiv Detail & Related papers (2022-12-16T15:43:41Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into communities, the community-specific GNNs, and a GNN-based predictor that combines them for the final classification task.
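A forward-pass-only sketch of how we read the entry above. This is our simplification, not the authors' model: the edge partition here comes from learned soft node-community memberships, and the variational inference machinery is omitted entirely.

```python
import torch
import torch.nn as nn


class CommunityPartitionGNN(nn.Module):
    def __init__(self, n_nodes, in_dim, hid_dim, n_classes, n_communities):
        super().__init__()
        # soft (overlapping) community memberships for every node, learned end-to-end
        self.node_comm = nn.Parameter(torch.randn(n_nodes, n_communities) * 0.1)
        self.gnns = nn.ModuleList(nn.Linear(in_dim, hid_dim) for _ in range(n_communities))
        self.predictor = nn.Linear(n_communities * hid_dim, n_classes)

    def forward(self, x, adj):
        comm = torch.softmax(self.node_comm, dim=-1)  # (N, K) memberships
        outs = []
        for k, gnn in enumerate(self.gnns):
            # community-specific weighted edges from the memberships of both endpoints
            adj_k = adj * torch.outer(comm[:, k], comm[:, k])
            outs.append(torch.relu(adj_k @ gnn(x)))   # community-specific GNN layer
        # predictor combines the community-specific embeddings for classification
        return self.predictor(torch.cat(outs, dim=-1))
```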
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
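A worked form of this connection in our own notation (X: input features, \tilde{A}: normalized adjacency, L = I - \tilde{A}: graph Laplacian), sketching how one gradient step on a smoothness-regularized denoising objective recovers GCN-style aggregation:

```latex
% Our notation; a sketch of the aggregation-as-denoising view, not quoted from the paper.
\begin{aligned}
\min_{F}\ \mathcal{L}(F) &= \lVert F - X \rVert_F^2 + c\,\operatorname{tr}\!\left(F^{\top} L F\right),\\
\left.\nabla_F \mathcal{L}\right|_{F = X} &= 2c\,L X,\\
F \leftarrow X - b\,(2c\,L X) &= (I - 2bc\,L)\,X = \tilde{A} X \quad \text{when } 2bc = 1,
\end{aligned}
```

so the standard neighborhood aggregation with \tilde{A} can be read as a single step of gradient descent on this denoising objective.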
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that formulates the sampling procedure and message passing of GNNs as a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
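A hypothetical, greatly simplified sketch of the per-node aggregation-depth idea in the Policy-GNN entry above. The actual paper learns the policy with reinforcement learning; the greedy argmax policy, shared layer weights, and hop range below are our assumptions.

```python
import torch
import torch.nn as nn


class DepthPolicyGNN(nn.Module):
    def __init__(self, in_dim, hid_dim, max_hops=3):
        super().__init__()
        self.max_hops = max_hops
        self.policy = nn.Linear(in_dim, max_hops)  # scores for choosing 1..max_hops hops
        self.gcn = nn.Linear(in_dim, hid_dim)      # shared transformation for every hop

    def forward(self, x, a_hat):
        # per-node hop choice (greedy here; the paper trains this decision with RL)
        hops = self.policy(x).argmax(dim=-1) + 1   # (N,), values in 1..max_hops
        h = self.gcn(x)
        states = [h]
        for _ in range(self.max_hops):             # precompute 1..max_hops rounds of aggregation
            states.append(a_hat @ states[-1])
        stacked = torch.stack(states[1:])          # (max_hops, N, hid_dim)
        idx = (hops - 1).view(1, -1, 1).expand(1, -1, stacked.size(-1))
        return torch.gather(stacked, 0, idx).squeeze(0)  # each node read out at its own depth
```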
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.