Predicting Dynamic Stability of Power Grids using Graph Neural Networks
- URL: http://arxiv.org/abs/2108.08230v1
- Date: Wed, 18 Aug 2021 16:43:06 GMT
- Title: Predicting Dynamic Stability of Power Grids using Graph Neural Networks
- Authors: Christian Nauck, Michael Lindner, Konstantin Schürholt, Haoming
  Zhang, Paul Schultz, Jürgen Kurths, Ingrid Isenhardt and Frank Hellmann
- Abstract summary: We investigate the feasibility of applying graph neural networks (GNN) to predict dynamic stability of synchronisation in complex power grids.
We generate two synthetic datasets for grids with 20 and 100 nodes respectively and estimate single-node basin stability (SNBS) using Monte-Carlo sampling.
We show that SNBS can be predicted in general and that performance varies significantly across GNN models.
- Score: 0.1539132101969243
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prediction of dynamical stability of power grids becomes more important
and challenging with increasing shares of renewable energy sources due to their
decentralized structure, reduced inertia and volatility. We investigate the
feasibility of applying graph neural networks (GNN) to predict dynamic
stability of synchronisation in complex power grids using the single-node basin
stability (SNBS) as a measure. To do so, we generate two synthetic datasets for
grids with 20 and 100 nodes respectively and estimate SNBS using Monte-Carlo
sampling. These datasets are used to train and evaluate the performance of
eight different GNN models. All models use the full graph without
simplifications as input and predict SNBS in a nodal regression setup. We show
that SNBS can be predicted in general and that performance varies significantly
across GNN models. Furthermore, we observe interesting transfer
capabilities of our approach: GNN models trained on smaller grids can directly
be applied to larger grids without the need for retraining.
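A minimal sketch (plain PyTorch, not the authors' code or architecture) of what such a nodal regression setup could look like: a small graph-convolutional model that maps per-node features to a per-node SNBS value in [0, 1]. The ring topology, feature dimension, hidden size and random labels below are illustrative placeholders; in the paper the SNBS targets are estimated beforehand by Monte-Carlo sampling of the grid dynamics.

```python
# Illustrative sketch: GCN-style nodal regression of single-node basin stability (SNBS).
# Topology, feature dimension and labels are placeholders, not the paper's datasets.
import torch
import torch.nn as nn

def normalized_adjacency(edge_list, num_nodes):
    """Symmetrically normalized adjacency D^{-1/2} (A + I) D^{-1/2}."""
    A = torch.zeros(num_nodes, num_nodes)
    for i, j in edge_list:
        A[i, j] = A[j, i] = 1.0
    A += torch.eye(num_nodes)                       # add self-loops
    d_inv_sqrt = A.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

class SNBSRegressor(nn.Module):
    """Two graph-convolution layers followed by a per-node regression head."""
    def __init__(self, in_dim, hidden_dim=32):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hidden_dim)
        self.lin2 = nn.Linear(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x, A_norm):
        x = torch.relu(A_norm @ self.lin1(x))        # message-passing step 1
        x = torch.relu(A_norm @ self.lin2(x))        # message-passing step 2
        return torch.sigmoid(self.head(x)).squeeze(-1)  # SNBS in [0, 1] per node

# Toy usage on a hypothetical 20-node grid
num_nodes = 20
edges = [(i, (i + 1) % num_nodes) for i in range(num_nodes)]  # placeholder ring topology
A_norm = normalized_adjacency(edges, num_nodes)
x = torch.randn(num_nodes, 4)      # nodal features, e.g. injected power (placeholder)
y = torch.rand(num_nodes)          # SNBS labels from Monte-Carlo sampling (placeholder)

model = SNBSRegressor(in_dim=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x, A_norm), y)
    loss.backward()
    opt.step()
```

Because the graph-convolution weights are shared across nodes and the normalized adjacency is rebuilt per grid, a model of this form can be evaluated on grids of a different size than it was trained on, which is consistent with the transfer behaviour reported in the abstract.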
Related papers
- Graph Isomorphic Networks for Assessing Reliability of the
Medium-Voltage Grid [0.5242869847419834]
This paper proposes using Graph Isomorphic Networks (GINs) for n-1 assessments in medium voltage grids.
The GIN framework is designed to generalise to unseen grids and utilise graph structure and data about stations/cables.
The proposed GIN approach demonstrates faster and more reliable grid assessments than a traditional mathematical optimisation approach.
arXiv Detail & Related papers (2023-10-02T13:19:35Z) - Graph Embedding Dynamic Feature-based Supervised Contrastive Learning of
Transient Stability for Changing Power Grid Topologies [4.344709230906635]
The GEDF-SCL model uses supervised contrastive learning to predict transient stability with graph embedding dynamic features (GEDFs).
Test results demonstrate that the GEDF-SCL model can achieve high accuracy in transient stability prediction.
arXiv Detail & Related papers (2023-08-01T13:30:36Z) - Toward Dynamic Stability Assessment of Power Grid Topologies using Graph
Neural Networks [0.0]
Renewables introduce new challenges to power grids regarding dynamic stability due to decentralization, reduced inertia, and volatility in production.
Graph neural networks (GNNs) are a promising method to reduce the computational effort of analyzing the dynamic stability of power grids.
GNNs are surprisingly effective at predicting the highly non-linear targets from topological information only.
arXiv Detail & Related papers (2022-06-10T07:23:22Z) - Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling
and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on the prediction of output node voltages encourages learning representations that can be adapted to new unseen topologies or to the prediction of new circuit-level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z) - Power Flow Balancing with Decentralized Graph Neural Networks [4.812718493682454]
We propose an end-to-end framework based on a Graph Neural Network (GNN) to balance the power flows in a generic grid.
The proposed framework is efficient and, compared to other solvers based on deep learning, is robust to perturbations not only to the physical quantities on the grid components, but also to the topology.
arXiv Detail & Related papers (2021-11-03T12:14:56Z) - Training Stable Graph Neural Networks Through Constrained Learning [116.03137405192356]
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
arXiv Detail & Related papers (2021-10-07T15:54:42Z) - Dynamic Graph Convolutional Recurrent Network for Traffic Prediction:
Benchmark and Solution [18.309299822858243]
We propose a novel traffic prediction framework, named Dynamic Graph Convolutional Recurrent Network (DGCRN).
In DGCRN, hyper-networks are designed to leverage and extract dynamic characteristics from node attributes.
We are the first to employ a generation method to model the fine iteration of the dynamic graph at each time step.
arXiv Detail & Related papers (2021-04-30T11:25:43Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Permutation-equivariant and Proximity-aware Graph Neural Networks with
Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.