Power to the Relational Inductive Bias: Graph Neural Networks in
Electrical Power Grids
- URL: http://arxiv.org/abs/2109.03604v1
- Date: Wed, 8 Sep 2021 12:56:00 GMT
- Title: Power to the Relational Inductive Bias: Graph Neural Networks in
Electrical Power Grids
- Authors: Martin Ringsquandl, Houssem Sellami, Marcel Hildebrandt, Dagmar Beyer,
Sylwia Henselmeyer, Sebastian Weber, Mitchell Joblin
- Abstract summary: We argue that there is a gap between benchmark-driven GNN research and power grids, whose graphs differ from common benchmarks in several important aspects.
We address this gap by means of (i) defining power grid graph datasets in inductive settings, (ii) an exploratory analysis of graph properties, and (iii) an empirical study of the concrete learning task of state estimation on real-world power grids.
- Score: 1.732048244723033
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The application of graph neural networks (GNNs) to the domain of electrical
power grids has high potential impact on smart grid monitoring. Even though
there is a natural correspondence of power flow to message-passing in GNNs,
their performance on power grids is not well understood. We argue that there is
a gap between benchmark-driven GNN research and the power grid domain, since
benchmark graphs differ from power grids in several important aspects. Additionally, inductive
learning of GNNs across multiple power grid topologies has not been explored
with real-world data. We address this gap by means of (i) defining power grid
graph datasets in inductive settings, (ii) an exploratory analysis of graph
properties, and (iii) an empirical study of the concrete learning task of state
estimation on real-world power grids. Our results show that GNNs are more
robust to noise with up to 400% lower error compared to baselines. Furthermore,
due to the unique properties of electrical grids, we do not observe the well
known over-smoothing phenomenon of GNNs and find the best performing models to
be exceptionally deep with up to 13 layers. This is in stark contrast to
existing benchmark datasets where the consensus is that 2 to 3 layer GNNs
perform best. Our results demonstrate that a key challenge in this domain is to
effectively handle long-range dependence.
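The abstract's finding that unusually deep GNNs (up to 13 layers) perform best on power grids, in contrast to the 2-3 layer consensus on standard benchmarks, can be illustrated with a minimal message-passing sketch. This is a hypothetical NumPy illustration of a deep GCN-style stack with residual connections, not the authors' actual model or hyperparameters:

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops (GCN-style)."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def deep_gnn_forward(A, X, weights):
    """Stack of GCN-style message-passing layers.
    Residual connections are one common way to keep deep stacks usable;
    the depth here (13 layers) mirrors the paper's reported best depth,
    but the layer design itself is an assumption for illustration."""
    S = normalized_adjacency(A)
    H = X
    for W in weights:
        H_new = np.maximum(S @ H @ W, 0.0)  # propagate + ReLU
        if H_new.shape == H.shape:
            H_new = H_new + H  # residual connection
        H = H_new
    return H

# Toy radial feeder: a path graph, mimicking the long chains found in grids
n, f = 8, 4
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
rng = np.random.default_rng(0)
X = rng.normal(size=(n, f))
weights = [rng.normal(scale=0.5, size=(f, f)) for _ in range(13)]  # 13 layers
H = deep_gnn_forward(A, X, weights)
print(H.shape)  # (8, 4)
```

On a path-like topology such as this toy feeder, a node's receptive field grows by one hop per layer, which is one intuition for why long-range dependence in grids rewards depth.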
Related papers
- SafePowerGraph: Safety-aware Evaluation of Graph Neural Networks for Transmission Power Grids [55.35059657148395]
We present SafePowerGraph, the first simulator-agnostic, safety-oriented framework and benchmark for Graph Neural Networks (GNNs) in power systems (PS) operations.
SafePowerGraph integrates multiple PF and OPF simulators and assesses GNN performance under diverse scenarios, including energy price variations and power line outages.
arXiv Detail & Related papers (2024-07-17T09:01:38Z) - PowerGraph: A power grid benchmark dataset for graph neural networks [7.504044714471332]
We present PowerGraph, which comprises GNN-tailored datasets for power flows, optimal power flows, and cascading failure analyses.
Overall, PowerGraph is a multifaceted GNN dataset for diverse tasks that includes power flow and fault scenarios with real-world explanations.
arXiv Detail & Related papers (2024-02-05T09:24:52Z) - Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z) - The Expressive Power of Graph Neural Networks: A Survey [9.08607528905173]
We conduct a first survey of models that enhance expressive power under different definitions of expressiveness.
The models are reviewed based on three categories, i.e., Graph feature enhancement, Graph topology enhancement, and GNNs architecture enhancement.
arXiv Detail & Related papers (2023-08-16T09:12:21Z) - Graph Neural Networks Provably Benefit from Structural Information: A
Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Spiking Variational Graph Auto-Encoders for Efficient Graph
Representation Learning [10.65760757021534]
We propose an SNN-based deep generative method, namely the Spiking Variational Graph Auto-Encoders (S-VGAE) for efficient graph representation learning.
We conduct link prediction experiments on multiple benchmark graph datasets, and the results demonstrate that our model consumes significantly lower energy with the performances superior or comparable to other ANN- and SNN-based methods for graph representation learning.
arXiv Detail & Related papers (2022-10-24T12:54:41Z) - Power Flow Balancing with Decentralized Graph Neural Networks [4.812718493682454]
We propose an end-to-end framework based on a Graph Neural Network (GNN) to balance the power flows in a generic grid.
The proposed framework is efficient and, compared to other solvers based on deep learning, is robust to perturbations not only to the physical quantities on the grid components, but also to the topology.
arXiv Detail & Related papers (2021-11-03T12:14:56Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning methods.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph
Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
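The permutation-equivariance property of graph convolutional filters mentioned in the last entry can be checked numerically for a simple polynomial graph filter. A minimal sketch; the filter coefficients and random graph are arbitrary choices for illustration, not taken from the paper:

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph filter: y = sum_k h[k] * S^k x (graph signal processing)."""
    y = np.zeros_like(x)
    Sk = np.eye(S.shape[0])
    for hk in h:
        y = y + hk * (Sk @ x)
        Sk = Sk @ S
    return y

rng = np.random.default_rng(1)
n = 6
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.triu(A, 1)
A = A + A.T                          # random undirected adjacency
x = rng.normal(size=n)               # graph signal
h = [0.5, 0.3, 0.2]                  # arbitrary filter taps

P = np.eye(n)[rng.permutation(n)]    # random permutation matrix
# Equivariance: filtering the relabeled graph and signal yields the
# relabeled output, since (P A P^T)^k (P x) = P (A^k x).
lhs = graph_filter(P @ A @ P.T, P @ x, h)
rhs = P @ graph_filter(A, x, h)
print(np.allclose(lhs, rhs))  # True
```

The identity holds for any polynomial in the shift operator, which is why every architecture built from such filters inherits permutation equivariance, as the surveyed paper argues.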
This list is automatically generated from the titles and abstracts of the papers on this site.
This site makes no guarantees about the quality of the information presented and is not responsible for any consequences of its use.