Complex-Value Spatio-temporal Graph Convolutional Neural Networks and
its Applications to Electric Power Systems AI
- URL: http://arxiv.org/abs/2208.08485v1
- Date: Wed, 17 Aug 2022 18:56:48 GMT
- Title: Complex-Value Spatio-temporal Graph Convolutional Neural Networks and
its Applications to Electric Power Systems AI
- Authors: Tong Wu, Anna Scaglione, Daniel Arnold
- Abstract summary: We generalize graph convolutional neural networks (GCN) to the complex domain.
We prove that complex-valued GCNs are stable with respect to perturbations of the underlying graph support.
We apply complex-valued GCNs to power grid state forecasting, power grid cyber-attack detection and localization.
- Score: 24.914412344973996
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The effective representation, processing, analysis, and visualization of
large-scale structured data over graphs are gaining a lot of attention. So far,
most of the literature has focused on real-valued signals. However, signals are
often sparse in the Fourier domain, and more informative and compact
representations for them can be obtained using the complex envelope of their
spectral components, as opposed to the original real-valued signals. Motivated
by this fact, in this work we generalize graph convolutional neural networks
(GCNs) to the complex domain, deriving the theory that allows us to incorporate
complex-valued graph shift operators (GSOs) in the definition of graph filters
(GFs) and to process complex-valued graph signals (GSs). The theory developed can
handle spatio-temporal complex network processes. We prove that complex-valued
GCNs are stable with respect to perturbations of the underlying graph support,
and we bound the transfer error and the error propagation through multiple
layers. We then apply complex-valued GCNs to power grid state forecasting and
power grid cyber-attack detection and localization.
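To make the core idea concrete, the sketch below applies a polynomial graph filter H(S) = Σ_k h_k S^k to a complex-valued graph signal, where S is a complex-valued GSO. This is an illustrative assumption of mine, not code from the paper: the toy 3-node graph, the filter coefficients, and the function name are all hypothetical.

```python
import numpy as np

def complex_graph_filter(S, x, h):
    """Apply the polynomial graph filter H(S) = sum_k h[k] * S^k
    to a complex graph signal x, for a complex-valued GSO S.
    Illustrative sketch only; coefficients h may themselves be complex."""
    N = S.shape[0]
    y = np.zeros(N, dtype=complex)
    Sk = np.eye(N, dtype=complex)  # running power S^0, S^1, ...
    for hk in h:
        y += hk * (Sk @ x)  # accumulate the k-th filter tap
        Sk = Sk @ S         # advance to the next power of the GSO
    return y

# Toy example: a 3-node directed cycle with purely imaginary edge weights,
# filtered with a 3-tap filter applied to an impulse at node 0.
S = np.array([[0, 1j, 0],
              [0, 0, 1j],
              [1j, 0, 0]], dtype=complex)
x = np.array([1 + 0j, 0, 0])
h = [1.0, 0.5, 0.25]
y = complex_graph_filter(S, x, h)  # complex-valued output signal
```

In a full complex-valued GCN layer, such a filter bank would be followed by a nonlinearity suited to complex inputs (e.g. one acting on the modulus), but the filtering step above is where the complex GSO enters.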
Related papers
- What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding [67.59552859593985]
Graph Transformers, which incorporate self-attention and positional encoding, have emerged as a powerful architecture for various graph learning tasks.
This paper introduces the first theoretical investigation of a shallow Graph Transformer for semi-supervised classification.
arXiv Detail & Related papers (2024-06-04T05:30:16Z) - Demystifying Oversmoothing in Attention-Based Graph Neural Networks [23.853636836842604]
Oversmoothing in Graph Neural Networks (GNNs) refers to the phenomenon where increasing network depth leads to homogeneous node representations.
Previous work has established that Graph Convolutional Networks (GCNs) exponentially lose expressive power.
It remains controversial whether the graph attention mechanism can mitigate oversmoothing.
arXiv Detail & Related papers (2023-05-25T14:31:59Z) - Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z) - MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaption on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid
Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z) - Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph
Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z) - Grid-to-Graph: Flexible Spatial Relational Inductive Biases for
Reinforcement Learning [8.169818701603313]
We show that we can incorporate relational inductive biases, encoded in the form of relational graphs, into agents.
We propose Grid-to-Graph (GTG), a mapping from grid structures to relational graphs that carry useful inductive biases.
We show that GTG produces agents that can jointly reason over observations and environment encoded dynamics in knowledge bases.
arXiv Detail & Related papers (2021-02-08T14:15:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.