Power Flow Balancing with Decentralized Graph Neural Networks
- URL: http://arxiv.org/abs/2111.02169v1
- Date: Wed, 3 Nov 2021 12:14:56 GMT
- Title: Power Flow Balancing with Decentralized Graph Neural Networks
- Authors: Jonas Berg Hansen, Stian Normann Anfinsen, Filippo Maria Bianchi
- Abstract summary: We propose an end-to-end framework based on a Graph Neural Network (GNN) to balance the power flows in a generic grid.
The proposed framework is efficient and, compared to other solvers based on deep learning, is robust to perturbations not only to the physical quantities on the grid components, but also to the topology.
- Score: 4.812718493682454
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose an end-to-end framework based on a Graph Neural Network (GNN) to
balance the power flows in a generic grid. The optimization is framed as a
supervised vertex regression task, where the GNN is trained to predict the
current and power injections at each grid branch that yield a power flow
balance. By representing the power grid as a line graph with branches as
vertices, we can train a GNN that is more accurate and robust to changes in the
underlying topology. In addition, by using specialized GNN layers, we are able
to build a very deep architecture that accounts for large neighborhoods on the
graph, while implementing only localized operations. We perform three different
experiments to evaluate: i) the benefits of using localized rather than global
operations and the tendency to oversmooth when using deep GNN models; ii) the
resilience to perturbations in the graph topology; and iii) the capability to
train the model simultaneously on multiple grid topologies and the
consequential improvement in generalization to new, unseen grids. The proposed
framework is efficient and, compared to other solvers based on deep learning,
is robust to perturbations not only to the physical quantities on the grid
components, but also to the topology.
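The line-graph representation described in the abstract (branches as vertices, adjacent when they share a bus) can be sketched in plain Python; the toy 4-bus grid below is illustrative and not taken from the paper's code:

```python
from itertools import combinations

# Toy 4-bus grid: each edge (branch) connects two buses.
branches = [(1, 2), (2, 3), (3, 4), (1, 4)]

# Line graph: branches become vertices; two branch-vertices are
# adjacent when the original branches share a bus.
line_nodes = branches
line_edges = [
    (a, b)
    for a, b in combinations(branches, 2)
    if set(a) & set(b)  # branches share at least one bus
]

print(len(line_nodes), len(line_edges))  # 4 nodes, 4 edges (C4 -> C4)
```

Training the GNN on this line graph lets branch-level targets (current and power injections) live directly on the vertices being regressed.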
Related papers
- PowerGraph: A power grid benchmark dataset for graph neural networks [7.504044714471332]
We present PowerGraph, which comprises GNN-tailored datasets for power flows, optimal power flows, and cascading failure analyses.
Overall, PowerGraph is a multifaceted GNN dataset for diverse tasks that includes power flow and fault scenarios with real-world explanations.
arXiv Detail & Related papers (2024-02-05T09:24:52Z) - Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
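The stratification idea above, separate weight matrices for nodes grouped by degree, can be illustrated with a minimal NumPy sketch; the two-bucket split and the mean aggregation are illustrative assumptions, not the paper's exact layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: adjacency matrix and node features.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
X = rng.normal(size=(4, 3))

# Stratify nodes by degree: low-degree vs high-degree group.
deg = A.sum(axis=1)
group = (deg >= 2).astype(int)   # illustrative 2-bucket split

# One weight matrix per group (the modification described above).
W = rng.normal(size=(2, 3, 3))

# Mean aggregation, then a group-specific linear transform per node.
H = (A / np.maximum(deg[:, None], 1)) @ X
out = np.stack([H[i] @ W[group[i]] for i in range(len(H))])
print(out.shape)  # (4, 3)
```

In practice the buckets would be chosen from the degree distribution of the dataset rather than a fixed threshold.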
arXiv Detail & Related papers (2023-12-16T14:09:23Z) - T-GAE: Transferable Graph Autoencoder for Network Alignment [79.89704126746204]
T-GAE is a graph autoencoder framework that leverages transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
arXiv Detail & Related papers (2023-10-05T02:58:29Z) - Re-Think and Re-Design Graph Neural Networks in Spaces of Continuous
Graph Diffusion Functionals [7.6435511285856865]
Graph neural networks (GNNs) are widely used in domains like social networks and biological systems.
The locality assumption of GNNs hampers their ability to capture long-range dependencies and global patterns in graphs.
We propose a new inductive bias based on variational analysis, drawing inspiration from the Brachistochrone problem.
arXiv Detail & Related papers (2023-07-01T04:44:43Z) - Fast and Effective GNN Training with Linearized Random Spanning Trees [20.73637495151938]
We present a new effective and scalable framework for training GNNs in node classification tasks.
Our approach progressively refines the GNN weights on an extensive sequence of random spanning trees.
The sparse nature of these path graphs substantially lightens the computational burden of GNN training.
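Sampling a random spanning tree of the kind used above can be sketched by running Kruskal's algorithm on a randomly shuffled edge list; this is one simple sampler for illustration, not necessarily the paper's exact procedure:

```python
import random

def random_spanning_tree(n, edges, seed=0):
    """Kruskal on a shuffled edge list with union-find;
    returns the n-1 edges of one spanning tree."""
    rng = random.Random(seed)
    edges = edges[:]
    rng.shuffle(edges)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:          # edge joins two components: keep it
            parent[ru] = rv
            tree.append((u, v))
    return tree

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
tree = random_spanning_tree(4, edges)
print(len(tree))  # a spanning tree of a 4-node graph has 3 edges
```

The resulting trees are far sparser than the original graph, which is what lightens the training cost.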
arXiv Detail & Related papers (2023-06-07T23:12:42Z) - EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called Embedding-Induced Graph Refinement Clustering Network (EGRC-Net).
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z) - Leveraging power grid topology in machine learning assisted optimal power flow [0.5076419064097734]
Machine learning assisted optimal power flow (OPF) aims to reduce the computational complexity of non-linear, non-convex constrained power flow problems.
We assess the performance of a variety of FCNN, CNN and GNN models for two fundamental approaches to machine learning assisted OPF.
For several synthetic grids with interconnected utilities, we show that locality properties between feature and target variables are scarce.
arXiv Detail & Related papers (2021-10-01T10:39:53Z) - Fast Power Control Adaptation via Meta-Learning for Random Edge Graph Neural Networks [39.59987601426039]
This paper studies the higher-level problem of enabling fast adaptation of the power control policy to time-varying topologies.
We apply first-order meta-learning on data from multiple topologies with the aim of optimizing for a few-shot adaptation to new network configurations.
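First-order meta-learning of the kind mentioned above can be illustrated with a Reptile-style update on toy per-topology tasks; the quadratic losses and all hyperparameters below are illustrative assumptions, not the paper's setup:

```python
import random

# Each "topology" defines a toy quadratic loss (w - target)^2 with a
# different optimum; meta-learning seeks an initialization that adapts
# to any of them in a few gradient steps.
targets = [1.0, 2.0, 3.0]

def adapt(w, target, steps=5, lr=0.1):
    """Inner loop: a few gradient steps on one task's loss."""
    for _ in range(steps):
        w -= lr * 2 * (w - target)  # gradient of (w - target)^2
    return w

meta_w, meta_lr = 0.0, 0.5
rng = random.Random(0)
for _ in range(200):
    t = rng.choice(targets)
    w_adapted = adapt(meta_w, t)
    # Reptile update: nudge the initialization toward the adapted weights.
    meta_w += meta_lr * (w_adapted - meta_w)

print(meta_w)  # settles between the task optima
```

The initialization ends up between the task optima, so a few inner steps suffice to specialize to whichever topology is drawn.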
arXiv Detail & Related papers (2021-05-02T12:43:10Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
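The denoising view above can be made concrete: gradient descent on the objective ||F - X||^2 + c * tr(F^T L F) produces Laplacian-smoothing updates that resemble GNN aggregation. The unnormalized Laplacian and the toy signal below are illustrative assumptions:

```python
import numpy as np

# Graph signal denoising objective: ||F - X||^2 + c * tr(F^T L F).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # unnormalized graph Laplacian
X = np.array([[1.0], [5.0], [1.0]])     # noisy node signal (one outlier)
c, step = 1.0, 0.1

F = X.copy()
for _ in range(50):
    grad = 2 * (F - X) + 2 * c * (L @ F)   # gradient of the objective
    F -= step * grad

# Smoothing pulls the outlier node toward its neighbours.
print(float(F[1]) < 5.0, float(F[0]) > 1.0)
```

Each gradient step mixes a node's value with its neighbours', which is exactly the aggregation pattern of many GNN layers.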
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
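The permutation equivariance of graph convolutional filters mentioned above can be checked numerically: relabeling the nodes of the graph and signal permutes the filter output in the same way. The random shift operator and filter taps below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Polynomial graph filter: H(S) x = sum_k h_k S^k x, with shift operator S.
def graph_filter(S, x, h):
    y, Sk = np.zeros_like(x), np.eye(len(x))
    for hk in h:
        y += hk * (Sk @ x)
        Sk = S @ Sk
    return y

S = rng.random((5, 5))
S = (S + S.T) / 2                      # symmetric shift (e.g. adjacency)
x = rng.random(5)
h = [0.5, 0.3, 0.2]                    # filter taps

P = np.eye(5)[[2, 0, 4, 1, 3]]        # a permutation matrix

# Equivariance: filtering the relabeled graph and signal equals
# relabeling the filtered output.
lhs = graph_filter(P @ S @ P.T, P @ x, h)
rhs = P @ graph_filter(S, x, h)
print(np.allclose(lhs, rhs))  # True
```

The identity follows because (P S P^T)^k (P x) = P S^k x for any permutation matrix P, so every term of the polynomial commutes with the relabeling.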
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.