A Review of Graph Neural Networks and Their Applications in Power
Systems
- URL: http://arxiv.org/abs/2101.10025v1
- Date: Mon, 25 Jan 2021 11:50:45 GMT
- Title: A Review of Graph Neural Networks and Their Applications in Power
Systems
- Authors: Wenlong Liao, Birgitte Bak-Jensen, Jayakrishnan Radhakrishna Pillai,
Yuelong Wang, and Yusen Wang
- Abstract summary: Deep neural networks have revolutionized many machine learning tasks in power systems.
The data in these tasks is typically represented in Euclidean domains.
The complexity of graph-structured data has brought challenges to the existing deep neural networks.
- Score: 0.6990493129893112
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep neural networks have revolutionized many machine learning tasks in power
systems, ranging from pattern recognition to signal processing. The data in
these tasks is typically represented in Euclidean domains. Nevertheless, a
growing number of applications in power systems collect data from
non-Euclidean domains and represent them as graph-structured data with
high-dimensional features and interdependencies among nodes. The
complexity of graph-structured data has brought significant challenges to the
existing deep neural networks defined in Euclidean domains. Recently, many
studies on extending deep neural networks for graph-structured data in power
systems have emerged. In this paper, a comprehensive overview of graph neural
networks (GNNs) in power systems is presented. Specifically, several classical
paradigms of GNN structures (e.g., graph convolutional networks, graph
recurrent neural networks, graph attention networks, graph generative networks,
spatial-temporal graph convolutional networks, and hybrid forms of GNNs) are
summarized, and key applications in power systems such as fault diagnosis,
power prediction, power flow calculation, and data generation are reviewed in
detail. Furthermore, the main issues and research trends concerning the
applications of GNNs in power systems are discussed.
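As a point of reference for the first paradigm named above, the sketch below implements a single graph convolutional (GCN) layer with the standard symmetric normalization, applied to a hypothetical 4-bus toy grid; the bus topology, node features, and weight matrix are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of one GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} X W).
# The 4-bus ring topology, features, and weights below are hypothetical.
import numpy as np

# Adjacency of a toy 4-bus ring network: 0-1, 1-2, 2-3, 3-0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# One feature row per bus, e.g. [voltage magnitude, active power injection].
X = np.array([[1.02,  0.5],
              [0.99, -0.3],
              [1.01,  0.2],
              [0.98, -0.4]])

def gcn_layer(A, X, W):
    """One graph convolution: normalized neighborhood aggregation plus ReLU."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt   # D^{-1/2} (A + I) D^{-1/2}
    return np.maximum(A_norm @ X @ W, 0.0)     # ReLU activation

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 4))   # learnable weights (random here for illustration)
H = gcn_layer(A, X, W)        # hidden representation, one row per bus
print(H.shape)                # (4, 4)
```

Stacking a few such layers with a task-specific readout is the basic pattern behind the power-system applications (fault diagnosis, power prediction, and so on) that the review surveys.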
Related papers
- A Systematic Review of Deep Graph Neural Networks: Challenges,
Classification, Architectures, Applications & Potential Utility in
Bioinformatics [0.0]
Graph neural networks (GNNs) use message passing between graph nodes to capture graph dependencies.
GNNs have the potential to be an excellent tool for solving a wide range of biological challenges in bioinformatics research.
arXiv Detail & Related papers (2023-11-03T10:25:47Z)
- A Survey on Graph Classification and Link Prediction based on GNN [11.614366568937761]
This review article delves into the world of graph convolutional neural networks.
It elaborates on the fundamentals of graph convolutional neural networks.
It elucidates the graph neural network models based on attention mechanisms and autoencoders.
arXiv Detail & Related papers (2023-07-03T09:08:01Z)
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- The Evolution of Distributed Systems for Graph Neural Networks and their Origin in Graph Processing and Deep Learning: A Survey [17.746899445454048]
Graph Neural Networks (GNNs) are an emerging research field.
GNNs can be applied to various domains including recommendation systems, computer vision, natural language processing, biology and chemistry.
We aim to fill this gap by summarizing and categorizing important methods and techniques for large-scale GNN solutions.
arXiv Detail & Related papers (2023-05-23T09:22:33Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still limited by the representational capacity of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Graph Neural Networks for Communication Networks: Context, Use Cases and Opportunities [4.4568884144849985]
Graph neural networks (GNNs) have demonstrated outstanding performance in many fields where data is fundamentally represented as graphs.
GNNs represent a new generation of data-driven models that can accurately learn and reproduce the complex behaviors behind real networks.
This article comprises a brief tutorial on GNNs and their possible applications to communication networks.
arXiv Detail & Related papers (2021-12-29T19:09:42Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structures surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey [11.18312489268624]
We establish a foundation of dynamic networks with consistent, detailed terminology and notation.
We present a comprehensive survey of dynamic graph neural network models using the proposed terminology.
arXiv Detail & Related papers (2020-05-13T23:56:38Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
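To make the preceding entry's notion of a graph convolutional filter concrete, the sketch below applies a polynomial filter y = h_0 x + h_1 S x + h_2 S^2 x on a toy 4-node graph; the shift operator, signal, and filter taps are illustrative assumptions, not material from that paper. Relabeling the nodes permutes S and x consistently and permutes y in the same way, which is the permutation equivariance property the entry refers to.

```python
# Minimal sketch of a graph convolutional filter: y = sum_k h_k S^k x,
# where S is a graph shift operator (here the adjacency matrix of a toy graph).
import numpy as np

S = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # toy 4-node graph
x = np.array([1.0, -2.0, 0.5, 3.0])         # graph signal, one value per node
h = [0.5, 0.3, 0.1]                         # filter taps h_0, h_1, h_2

def graph_filter(S, x, h):
    """Apply y = sum_k h_k S^k x by repeatedly shifting the signal."""
    y = np.zeros_like(x)
    shifted = x.copy()
    for hk in h:
        y += hk * shifted      # accumulate h_k * S^k x
        shifted = S @ shifted  # advance to the next power of S
    return y

print(graph_filter(S, x, h))
```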
This list is automatically generated from the titles and abstracts of the papers on this site.