GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection
- URL: http://arxiv.org/abs/2401.09193v1
- Date: Wed, 17 Jan 2024 13:04:23 GMT
- Title: GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection
- Authors: Alessandro Bicciato, Luca Cosmo, Giorgia Minello, Luca Rossi, Andrea
Torsello
- Abstract summary: Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
- Score: 51.608147732998994
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Graph neural networks are increasingly becoming the framework of choice for
graph-based machine learning. In this paper, we propose a new graph neural
network architecture that substitutes classical message passing with an
analysis of the local distribution of node features. To this end, we extract
the distribution of features in the egonet of each local neighbourhood and
compare it against a set of learned label distributions by taking the
histogram intersection kernel. The similarity information is then propagated to
other nodes in the network, effectively creating a message passing-like
mechanism where the message is determined by the ensemble of the features. We
perform an ablation study to evaluate the network's performance under different
choices of its hyper-parameters. Finally, we test our model on standard graph
classification and regression benchmarks, and we find that it outperforms
widely used alternative approaches, including both graph kernels and graph
neural networks.
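The core operation described in the abstract, comparing each node's egonet feature histogram against a set of learned label distributions via the histogram intersection kernel, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the toy graph, the discrete node labels, and the "learned" distributions (random stand-ins here; trainable parameters in the actual model) are all assumptions.

```python
import numpy as np

def histogram_intersection(h1, h2):
    # Histogram intersection kernel: sum of element-wise minima.
    # For two normalised histograms the value lies in [0, 1].
    return float(np.minimum(h1, h2).sum())

def egonet_histograms(adj, labels, n_labels):
    # For each node, build the normalised histogram of discrete node
    # labels found in its egonet (the node together with its neighbours).
    n = adj.shape[0]
    hists = np.zeros((n, n_labels))
    for v in range(n):
        ego = list(np.flatnonzero(adj[v])) + [v]
        for u in ego:
            hists[v, labels[u]] += 1.0
        hists[v] /= hists[v].sum()
    return hists

# Toy graph: a path 0-1-2-3 with one discrete label per node.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
labels = np.array([0, 1, 2, 3])

hists = egonet_histograms(adj, labels, n_labels=4)

# Stand-ins for the learned label distributions (random here).
rng = np.random.default_rng(0)
protos = rng.random((2, 4))
protos /= protos.sum(axis=1, keepdims=True)

# Per-node similarity scores, one per learned distribution; in the
# paper these similarities are what gets propagated to other nodes.
sims = np.array([[histogram_intersection(h, p) for p in protos]
                 for h in hists])
print(sims.shape)  # (4, 2)
```

Since the histograms are normalised, each similarity is bounded in [0, 1], and a histogram compared with itself scores exactly 1.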
Related papers
- Cooperative Graph Neural Networks [7.2459816681395095]
A class of graph neural networks follows a standard message-passing paradigm.
We propose a novel framework for training graph neural networks.
Our approach offers a more flexible and dynamic message-passing paradigm.
arXiv Detail & Related papers (2023-10-02T15:08:52Z)
- The Map Equation Goes Neural: Mapping Network Flows with Graph Neural Networks [0.716879432974126]
Community detection is an essential tool for unsupervised data exploration and revealing the organisational structure of networked systems.
We consider the map equation, a popular information-theoretic objective function for unsupervised community detection, and express it in differentiable tensor form for optimisation through gradient descent.
Our formulation makes the map equation compatible with any neural network architecture, enables end-to-end learning, incorporates node features, and chooses the optimal number of clusters automatically.
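For context, the two-level map equation being optimised here is, in Rosvall and Bergstrom's standard notation (background knowledge, not taken from this summary):

$$
L(\mathsf{M}) = q_{\curvearrowright}\, H(\mathcal{Q}) + \sum_{m=1}^{M} p_{\circlearrowright}^{m}\, H(\mathcal{P}^{m})
$$

where $q_{\curvearrowright}$ is the rate at which a random walker switches modules, $H(\mathcal{Q})$ is the entropy of the module-exit codebook, $p_{\circlearrowright}^{m}$ is the usage rate of module $m$'s codebook, and $H(\mathcal{P}^{m})$ is the entropy of visit rates within module $m$.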
arXiv Detail & Related papers (2023-10-02T12:32:18Z)
- Generative Graph Neural Networks for Link Prediction [13.643916060589463]
Inferring missing links or detecting spurious ones based on observed graphs, known as link prediction, is a long-standing challenge in graph data analysis.
This paper proposes a novel and radically different link prediction algorithm based on the network reconstruction theory, called GraphLP.
Unlike the discriminative neural network models used for link prediction, GraphLP is generative, which provides a new paradigm for neural-network-based link prediction.
arXiv Detail & Related papers (2022-12-31T10:07:19Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- An Energy-Based View of Graph Neural Networks [0.0]
Graph neural networks are a popular variant of neural networks that work with graph-structured data.
We propose a novel method to ensure generation over features as well as the adjacency matrix.
Our approach obtains comparable discriminative performance while improving robustness.
arXiv Detail & Related papers (2021-04-27T21:54:30Z)
- Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z)
- Convolutional Kernel Networks for Graph-Structured Data [37.13712126432493]
We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods.
Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps.
Our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks.
arXiv Detail & Related papers (2020-03-11T09:44:03Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Analyzing Neural Networks Based on Random Graphs [77.34726150561087]
We perform a massive evaluation of neural networks with architectures corresponding to random graphs of various types.
We find that none of the classical numerical graph invariants by itself allows us to single out the best networks.
We also find that networks with primarily short-range connections perform better than networks which allow for many long-range connections.
arXiv Detail & Related papers (2020-02-19T11:04:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.