Flow-Attentional Graph Neural Networks
- URL: http://arxiv.org/abs/2506.06127v1
- Date: Fri, 06 Jun 2025 14:37:50 GMT
- Title: Flow-Attentional Graph Neural Networks
- Authors: Pascal Plettenberg, Dominik Köhler, Bernhard Sick, Josephine M. Thomas
- Abstract summary: Graph Neural Networks (GNNs) have become essential for learning from graph-structured data. Existing GNNs do not consider the conservation law inherent in graphs associated with a flow of physical resources. We show that flow attention enhances the performance of attention-based GNNs on both graph-level classification and regression tasks.
- Score: 1.49199020343864
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) have become essential for learning from graph-structured data. However, existing GNNs do not consider the conservation law inherent in graphs associated with a flow of physical resources, such as electrical current in power grids or traffic in transportation networks, which can lead to reduced model performance. To address this, we propose flow attention, which adapts existing graph attention mechanisms to satisfy Kirchhoff's first law. Furthermore, we discuss how this modification influences the expressivity and identify sets of non-isomorphic graphs that can be discriminated by flow attention but not by standard attention. Through extensive experiments on two flow graph datasets (electronic circuits and power grids), we demonstrate that flow attention enhances the performance of attention-based GNNs on both graph-level classification and regression tasks.
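Kirchhoff's first law states that the total flow into a node equals the total flow out of it. Below is a minimal NumPy sketch of one way an attention mechanism can respect such a constraint: attention coefficients are normalized over each node's outgoing edges, so every node distributes exactly its own (transformed) feature mass. The layer, graph, and parameter shapes are illustrative assumptions, not the paper's implementation.
```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def flow_attention_layer(h, edges, W, a):
    """One attention layer whose coefficients are normalized over each
    node's OUTGOING edges, so every node distributes its transformed
    feature vector across its out-edges with weights summing to one --
    one plausible way to respect Kirchhoff's first law (flow in equals
    flow out). Hypothetical sketch, not the paper's exact formulation."""
    z = h @ W                                   # transformed node features
    # GAT-style logit per directed edge (i -> j)
    logits = {(i, j): float(a @ np.concatenate([z[i], z[j]])) for i, j in edges}
    out = np.zeros_like(z)
    for i in range(h.shape[0]):
        out_edges = [(u, v) for (u, v) in edges if u == i]
        if not out_edges:
            continue
        alpha = softmax(np.array([logits[e] for e in out_edges]))
        for (_, j), w in zip(out_edges, alpha):
            out[j] += w * z[i]                  # node i's total outflow equals z[i]
    return out

# toy flow graph: 0 -> 1, 0 -> 2, 1 -> 2
rng = np.random.default_rng(0)
h, W, a = rng.normal(size=(3, 4)), rng.normal(size=(4, 4)), rng.normal(size=8)
print(flow_attention_layer(h, [(0, 1), (0, 2), (1, 2)], W, a))
```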
Related papers
- Topology-aware Neural Flux Prediction Guided by Physics [13.352980442733987]
Graph Neural Networks (GNNs) often struggle to preserve high-frequency components of nodal signals when dealing with directed graphs.
This paper proposes a novel framework that combines 1) explicit difference matrices that model directional gradients and 2) implicit physical constraints that enforce message passing within GNNs to be consistent with natural laws (a sketch of such a difference matrix follows this entry).
arXiv Detail & Related papers (2025-06-06T02:01:50Z)
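The "explicit difference matrices" mentioned above can be illustrated with the standard signed incidence matrix of a directed graph, which maps a nodal signal to per-edge directional gradients. A generic sketch, not necessarily the paper's specific construction:
```python
import numpy as np

# Signed incidence matrix of a directed graph: row e has -1 at the edge's
# source and +1 at its target, so (B @ x)[e] is the directional gradient
# x[target] - x[source] of a nodal signal x.
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
B = np.zeros((len(edges), n))
for e, (u, v) in enumerate(edges):
    B[e, u], B[e, v] = -1.0, 1.0

x = np.array([1.0, 4.0, 6.0])   # nodal signal
print(B @ x)                    # per-edge gradients: [3. 2. 5.]
```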
- Graph Attention for Heterogeneous Graphs with Positional Encoding [0.0]
Graph Neural Networks (GNNs) have emerged as the de facto standard for modeling graph data.
This work benchmarks various GNN architectures to identify the most effective methods for heterogeneous graphs.
Our findings reveal that graph attention networks excel in these tasks.
arXiv Detail & Related papers (2025-04-03T18:00:02Z) - Locality-Aware Graph-Rewiring in GNNs [5.356465360780597]
Graph Neural Networks (GNNs) are popular models for machine learning on graphs.
In this work, we identify three desiderata for graph-rewiring: (i) reduce over-squashing, (ii) respect the locality of the graph, and (iii) preserve the sparsity of the graph.
We propose a novel rewiring framework that satisfies all of (i)-(iii) through a locality-aware sequence of rewiring operations (a toy sketch follows this entry).
arXiv Detail & Related papers (2023-10-02T21:59:44Z)
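As a concrete toy version of the three desiderata, the sketch below adds edges only between nodes at shortest-path distance two (respecting locality) up to a fixed budget (preserving sparsity), which shortens bottleneck paths that cause over-squashing. The function and budget are hypothetical; the paper's actual rewiring sequence is more principled.
```python
def two_hop_rewire(adj, budget):
    """Connect pairs of nodes at shortest-path distance exactly two,
    adding at most `budget` edges. A hypothetical toy step, not the
    paper's algorithm."""
    new_edges = []
    for u in sorted(adj):
        # candidates exactly two hops from u that are not already neighbors
        two_hop = {w for v in adj[u] for w in adj[v]} - adj[u] - {u}
        for w in sorted(two_hop):
            if u < w and len(new_edges) < budget:
                new_edges.append((u, w))
    for u, w in new_edges:          # apply the rewiring
        adj[u].add(w)
        adj[w].add(u)
    return new_edges

# path graph 0 - 1 - 2 - 3 (undirected, as adjacency sets)
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(two_hop_rewire(adj, budget=2))   # [(0, 2), (1, 3)]
```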
- Demystifying Oversmoothing in Attention-Based Graph Neural Networks [23.853636836842604]
Oversmoothing in Graph Neural Networks (GNNs) refers to the phenomenon where increasing network depth leads to homogeneous node representations.
Previous work has established that Graph Convolutional Networks (GCNs) lose expressive power exponentially as depth increases.
It remains controversial whether the graph attention mechanism can mitigate oversmoothing (a toy demo of the phenomenon follows this entry).
arXiv Detail & Related papers (2023-05-25T14:31:59Z)
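The phenomenon itself is easy to reproduce numerically: with a row-normalized propagation matrix, deeper stacks of propagation steps drive all node representations toward a common vector. A toy NumPy demo (illustrative only; the paper's analysis of attention dynamics is far more involved):
```python
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                         # add self-loops
P = A_hat / A_hat.sum(axis=1, keepdims=True)  # random-walk propagation matrix

h = np.random.default_rng(0).normal(size=(4, 2))
for depth in (1, 4, 16, 64):
    smoothed = np.linalg.matrix_power(P, depth) @ h
    # variance across nodes shrinks toward 0 as depth grows
    print(depth, float(np.var(smoothed, axis=0).sum()))
```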
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive (a toy unrolling sketch follows this entry).
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
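To make the unrolling idea concrete, the sketch below runs truncated proximal-gradient iterations for graph deconvolution under an assumed quadratic mixture model; in a trainable GDN the step size and threshold would be learned per layer. The mixture model, constants, and function names are assumptions, not the authors' code.
```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm (encourages sparse edges)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def unrolled_deconvolution(A_obs, layers=200, eta=0.05, lam=0.01):
    """Truncated proximal-gradient unrolling for graph deconvolution,
    ASSUMING the observation is the quadratic mixture A_obs ~ S + 0.5*S@S.
    A sketch of the unrolling idea, not the authors' implementation."""
    S = np.zeros_like(A_obs)
    for _ in range(layers):
        R = S + 0.5 * S @ S - A_obs            # data-fit residual
        grad = R + 0.5 * (R @ S + S @ R)       # gradient of 0.5 * ||R||_F^2
        S = soft_threshold(S - eta * grad, lam)
    return S

rng = np.random.default_rng(1)
S_true = np.triu((rng.random((5, 5)) < 0.4).astype(float), 1)
S_true = S_true + S_true.T                     # sparse symmetric latent graph
A_obs = S_true + 0.5 * S_true @ S_true         # observed "mixture"
S_hat = unrolled_deconvolution(A_obs)
print(np.linalg.norm(S_hat - S_true))          # small when the iterations converge
```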
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets (a single lifting step is sketched after this entry).
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
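A single lifting step on a graph signal illustrates the building block behind lifting-based wavelets: split the nodes into two sets, predict one set from the other (the residuals become detail/wavelet coefficients), then update the remaining set (approximation coefficients). Fixed weights here for illustration; the paper learns adaptive operators.
```python
import numpy as np

# 4-cycle graph as adjacency lists, with a fixed even/odd node split.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
evens, odds = [0, 2], [1, 3]
x = np.array([1.0, 2.0, 3.0, 2.5])   # graph signal

# PREDICT: each odd node from the mean of its even neighbors (details)
detail = {o: x[o] - np.mean([x[e] for e in adj[o] if e in evens]) for o in odds}
# UPDATE: adjust evens with neighboring details (approximation coefficients)
approx = {e: x[e] + 0.5 * np.mean([detail[o] for o in adj[e] if o in odds]) for e in evens}
print(detail, approx)
```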
- Discriminability of Single-Layer Graph Neural Networks [172.5042368548269]
Graph neural networks (GNNs) have exhibited promising performance on a wide range of problems.
We focus on the property of discriminability and establish conditions under which the inclusion of pointwise nonlinearities in a stable graph filter bank leads to an increased discriminative capacity for high-eigenvalue content (a toy numerical illustration follows this entry).
arXiv Detail & Related papers (2020-10-17T18:52:34Z)
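The mechanism can be seen in a small numerical example: a low-pass graph filter annihilates high-eigenvalue content, but a pointwise ReLU applied afterwards re-creates energy at those graph frequencies, which is what a nonlinear filter bank can exploit for discrimination. The graph and seed are arbitrary.
```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A                    # graph Laplacian
_, V = np.linalg.eigh(L)                     # eigenvectors, low to high frequency

x = rng.normal(size=4)
low_pass = V[:, :2] @ V[:, :2].T             # projector onto the two lowest modes
x_lp = low_pass @ x

def high_freq_energy(sig):
    return float(np.sum((V[:, 2:].T @ sig) ** 2))

print(high_freq_energy(x_lp))                # ~0: filtered out
print(high_freq_energy(np.maximum(x_lp, 0)))  # > 0: ReLU re-created it
```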
- Bayesian Spatio-Temporal Graph Convolutional Network for Traffic Forecasting [22.277878492878475]
We propose a Bayesian Spatio-Temporal Graph Convolutional Network (BSTGCN) for traffic prediction.
The graph structure in our network is learned from the physical topology of the road network and traffic data in an end-to-end manner.
We verify the effectiveness of our method on two real-world datasets, and the experimental results demonstrate that BSTGCN attains superior performance compared with state-of-the-art methods.
arXiv Detail & Related papers (2020-10-15T03:41:37Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cuts, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships (a loose heat-kernel-style sketch follows this entry).
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
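As a loose sketch of a low-pass, heat-kernel-style encoder for cluster memberships, the code below propagates node features with a truncated Taylor series of exp(-tL) and projects each node onto the probability simplex. The temperature t, truncation K, and the final projection are assumptions for illustration; this is not the paper's Heatts architecture.
```python
import math
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A
X = rng.normal(size=(4, 3))                     # node features

t, K, n_clusters = 0.5, 6, 2
# truncated Taylor series: exp(-t L) @ X ~ sum_k (-t)^k / k! * L^k @ X
H = sum((-t) ** k / math.factorial(k) * np.linalg.matrix_power(L, k) @ X
        for k in range(K))

W = rng.normal(size=(3, n_clusters))            # linear head to cluster logits
logits = H @ W
memberships = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(memberships)                              # soft memberships; rows sum to 1
```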
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology (a numerical check of equivariance follows this entry).
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
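The permutation-equivariance property of polynomial graph filters is easy to verify numerically: relabeling the nodes and then filtering gives the same result as filtering and then relabeling. A self-contained check with an arbitrary symmetric shift operator and filter taps:
```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.random((5, 5))
S = (S + S.T) / 2                               # symmetric graph shift operator
x = rng.normal(size=5)                          # graph signal
h = [0.5, -0.2, 0.1]                            # filter taps

def graph_filter(S, x, h):
    """Polynomial graph filter H(S) x = sum_k h_k * S^k @ x."""
    return sum(c * np.linalg.matrix_power(S, k) @ x for k, c in enumerate(h))

P = np.eye(5)[rng.permutation(5)]               # random permutation matrix
lhs = graph_filter(P @ S @ P.T, P @ x, h)       # relabel, then filter
rhs = P @ graph_filter(S, x, h)                 # filter, then relabel
print(np.allclose(lhs, rhs))                    # True: permutation equivariance
```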