Graph Isomorphic Networks for Assessing Reliability of the
Medium-Voltage Grid
- URL: http://arxiv.org/abs/2310.01181v2
- Date: Tue, 3 Oct 2023 09:42:23 GMT
- Title: Graph Isomorphic Networks for Assessing Reliability of the
Medium-Voltage Grid
- Authors: Charlotte Cambier van Nooten, Tom van de Poll, Sonja Füllhase, Jacco
Heres, Tom Heskes, Yuliya Shapovalova
- Abstract summary: This paper proposes using Graph Isomorphic Networks (GINs) for n-1 assessments in medium voltage grids.
The GIN framework is designed to generalise to unseen grids and utilise graph structure and data about stations/cables.
The proposed GIN approach demonstrates faster and more reliable grid assessments than a traditional mathematical optimisation approach.
- Score: 0.5242869847419834
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Ensuring electricity grid reliability becomes increasingly challenging with
the shift towards renewable energy and declining conventional capacities.
Distribution System Operators (DSOs) aim to achieve grid reliability by
verifying the n-1 principle, ensuring continuous operation in case of component
failure. Electricity networks' complex graph-based data holds crucial
information for n-1 assessment: graph structure and data about stations/cables.
Unlike traditional machine learning methods, Graph Neural Networks (GNNs)
directly handle graph-structured data. This paper proposes using Graph
Isomorphic Networks (GINs) for n-1 assessments in medium voltage grids. The GIN
framework is designed to generalise to unseen grids and utilise graph structure
and data about stations/cables. The proposed GIN approach demonstrates faster
and more reliable grid assessments than a traditional mathematical optimisation
approach, reducing prediction times by approximately a factor of 1000. The
findings offer a promising approach to address computational challenges and
enhance the reliability and efficiency of energy grid assessments.
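A GIN-based n-1 classifier of the kind described above can be prototyped with standard GNN tooling. The following is a minimal sketch, assuming PyTorch Geometric; the station features (e.g. load, capacity, voltage level), the two-layer architecture, and the graph-level sigmoid head are illustrative assumptions rather than the authors' exact configuration.

    import torch
    import torch.nn as nn
    from torch_geometric.nn import GINConv, global_mean_pool

    class GridGIN(nn.Module):
        """Graph-level classifier: does this medium-voltage grid satisfy n-1?"""
        def __init__(self, num_node_features: int, hidden: int = 64):
            super().__init__()
            mlp1 = nn.Sequential(nn.Linear(num_node_features, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
            mlp2 = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
            self.conv1 = GINConv(mlp1)          # aggregate neighbouring station features
            self.conv2 = GINConv(mlp2)
            self.head = nn.Linear(hidden, 1)    # n-1 feasibility score per grid

        def forward(self, x, edge_index, batch):
            h = self.conv1(x, edge_index).relu()
            h = self.conv2(h, edge_index).relu()
            h = global_mean_pool(h, batch)      # one embedding per grid
            return torch.sigmoid(self.head(h)).squeeze(-1)

    # Toy grid: 4 stations with 3 hypothetical features each, linked by cables
    # encoded as an undirected edge_index (both directions listed).
    x = torch.rand(4, 3)
    edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                               [1, 0, 2, 1, 3, 2]])
    batch = torch.zeros(4, dtype=torch.long)    # all 4 stations belong to one grid
    print(GridGIN(num_node_features=3)(x, edge_index, batch))

Because message passing does not depend on node ordering or graph size, such a model can be applied to grids of different sizes and topologies, which is what enables generalisation to unseen grids.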
Related papers
- PowerGraph: A power grid benchmark dataset for graph neural networks [7.504044714471332]
We present PowerGraph, which comprises GNN-tailored datasets for power flows, optimal power flows, and cascading failure analyses.
Overall, PowerGraph is a multifaceted GNN dataset for diverse tasks that includes power flow and fault scenarios with real-world explanations.
arXiv Detail & Related papers (2024-02-05T09:24:52Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Graph Transformer GANs for Graph-Constrained House Generation [223.739067413952]
We present a novel graph Transformer generative adversarial network (GTGAN) for the challenging graph-constrained house generation task.
The GTGAN learns effective graph node relations in an end-to-end fashion.
arXiv Detail & Related papers (2023-03-14T20:35:45Z) - EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called the Embedding-Induced Graph Refinement Clustering Network (EGRC-Net).
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Graph-based Algorithm Unfolding for Energy-aware Power Allocation in
Wireless Networks [27.600081147252155]
We develop a novel graph-based framework to maximize energy efficiency in wireless communication networks.
We show that the proposed model is permutation equivariant, a desirable property for models of wireless network data; a small illustrative check appears after this list.
Results demonstrate its generalizability across different network topologies.
arXiv Detail & Related papers (2022-01-27T20:23:24Z) - Power Flow Balancing with Decentralized Graph Neural Networks [4.812718493682454]
We propose an end-to-end framework based on a Graph Neural Network (GNN) to balance the power flows in a generic grid.
The proposed framework is efficient and, compared to other solvers based on deep learning, is robust to perturbations not only to the physical quantities on the grid components, but also to the topology.
arXiv Detail & Related papers (2021-11-03T12:14:56Z) - Predicting Dynamic Stability of Power Grids using Graph Neural Networks [0.1539132101969243]
We investigate the feasibility of applying graph neural networks (GNNs) to predict the dynamic stability of synchronisation in complex power grids.
We generate two synthetic datasets for grids with 20 and 100 nodes respectively and estimate single-node basin stability (SNBS) using Monte-Carlo sampling.
We show that SNBS can be predicted in general and that performance varies significantly across different GNN models.
arXiv Detail & Related papers (2021-08-18T16:43:06Z) - Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can significantly improve the performance of GNNs, with larger gains on noisier datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
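To make the permutation-equivariance property mentioned in the algorithm-unfolding entry concrete, here is a minimal plain-PyTorch check under assumed toy dimensions and a single linear message-passing layer (none of this comes from the cited paper): relabelling the nodes of the input graph permutes the layer output in exactly the same way.

    import torch

    torch.manual_seed(0)

    def gnn_layer(adj, x, weight):
        # One message-passing step: sum neighbour features, then transform and activate.
        return torch.relu(adj @ x @ weight)

    n, d = 5, 4
    adj = (torch.rand(n, n) < 0.4).float()
    adj = ((adj + adj.t()) > 0).float()        # symmetrise: undirected toy topology
    x = torch.rand(n, d)                       # per-node features
    w = torch.rand(d, d)
    P = torch.eye(n)[torch.randperm(n)]        # random permutation matrix

    out = gnn_layer(adj, x, w)
    out_permuted = gnn_layer(P @ adj @ P.t(), P @ x, w)
    print(torch.allclose(P @ out, out_permuted))   # True: the layer is permutation equivariant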
This list is automatically generated from the titles and abstracts of the papers on this site.