Adaptive Least Mean Squares Graph Neural Networks and Online Graph
Signal Estimation
- URL: http://arxiv.org/abs/2401.15304v1
- Date: Sat, 27 Jan 2024 05:47:12 GMT
- Title: Adaptive Least Mean Squares Graph Neural Networks and Online Graph
Signal Estimation
- Authors: Yi Yan, Changran Peng, Ercan Engin Kuruoglu
- Abstract summary: We propose an efficient Neural Network architecture for the online estimation of time-varying graph signals.
The Adaptive Least Mean Squares Graph Neural Network (LMS-GNN) is a combination of adaptive graph filters and Graph Neural Networks (GNNs).
Experimenting on real-world temperature data reveals that our LMS-GNN achieves more accurate online predictions compared to graph-based methods.
- Score: 3.6448362316632115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The online prediction of multivariate signals, existing simultaneously in
space and time, from noisy partial observations is a fundamental task in
numerous applications. We propose an efficient Neural Network architecture for
the online estimation of time-varying graph signals named the Adaptive Least
Mean Squares Graph Neural Networks (LMS-GNN). LMS-GNN aims to capture the time
variation and bridge cross-space-time interactions when the signals are corrupted
by noise and missing values. The LMS-GNN is a combination
of adaptive graph filters and Graph Neural Networks (GNN). At each time step,
the forward propagation of LMS-GNN resembles that of an adaptive graph filter: the
output is driven by the error between the observation and the prediction, while the
filter coefficients are updated via backpropagation as in a GNN. Experimenting on
real-world temperature data reveals that our LMS-GNN
achieves more accurate online predictions compared to graph-based methods like
adaptive graph filters and graph convolutional neural networks.
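To make the error-driven update described above concrete, the following is a minimal PyTorch sketch of an online estimation loop in that spirit. The layer structure, the names (GraphFilterLayer, online_estimation), the polynomial order, and the initialization are illustrative assumptions, not the authors' implementation.

```python
import torch

class GraphFilterLayer(torch.nn.Module):
    """Learnable polynomial graph filter: e -> sum_k h_k * S^k e (illustrative)."""
    def __init__(self, S: torch.Tensor, order: int = 2):
        super().__init__()
        self.S = S                                   # graph shift operator (adjacency or Laplacian)
        self.h = torch.nn.Parameter(torch.zeros(order + 1))
        with torch.no_grad():
            self.h[0] = 0.5                          # start with a small step on the raw error

    def forward(self, e: torch.Tensor) -> torch.Tensor:
        out, s_e = torch.zeros_like(e), e
        for k in range(self.h.numel()):
            out = out + self.h[k] * s_e              # accumulate h_k * S^k e
            s_e = self.S @ s_e
        return out

def online_estimation(model, optimizer, x_hat, observations, masks):
    """At each time step: form the masked observation error, filter it through the
    learnable graph layer, correct the estimate, and update the coefficients by backprop."""
    for y_t, d_t in zip(observations, masks):        # d_t: 0/1 mask of observed nodes
        e_t = d_t * (y_t - x_hat)                    # prediction error on observed nodes only
        x_new = x_hat + model(e_t)                   # LMS-style error-driven correction
        loss = (d_t * (y_t - x_new)).pow(2).mean()   # online squared-error objective
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        x_hat = x_new.detach()                       # carry the estimate to the next step
    return x_hat
```

In use, S would be a (normalized) adjacency or Laplacian of the sensor graph, observations a sequence of noisy node-signal vectors, masks the corresponding 0/1 sampling patterns, and the optimizer plain SGD over model.parameters().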
Related papers
- Adaptive Least Mean pth Power Graph Neural Networks [5.4004917284050835]
We propose a universal framework combining adaptive filter and graph neural network for online graph signal estimation.
LMP-GNN retains the advantage of adaptive filtering in handling noise and missing observations as well as the online update capability.
Experimental results on two real-world datasets, a temperature graph and a traffic graph, under four different noise distributions demonstrate the effectiveness and robustness of the proposed LMP-GNN.
arXiv Detail & Related papers (2024-05-07T08:28:51Z)
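For the pth-power entry above, a natural reading (an assumption here, not the paper's stated implementation) is that the squared-error objective of an LMS-style update is replaced by an l_p error, which downweights impulsive outliers when p < 2:

```python
import torch

def lp_error(residual: torch.Tensor, p: float = 1.5) -> torch.Tensor:
    """Hypothetical l_p objective: mean of |residual|^p; choosing p < 2 reduces the
    influence of heavy-tailed or impulsive noise compared with the squared error."""
    return residual.abs().pow(p).mean()
```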
- Learning Stable Graph Neural Networks via Spectral Regularization [18.32587282139282]
Stability of graph neural networks (GNNs) characterizes how GNNs react to graph perturbations and provides guarantees for architecture performance in noisy scenarios.
This paper develops a self-regularized graph neural network (SR-GNN) that improves the architecture stability by regularizing the filter frequency responses in the graph spectral domain.
arXiv Detail & Related papers (2022-11-13T17:27:21Z)
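One generic way to picture regularizing a polynomial filter's frequency response in the graph spectral domain, assuming a response r(lambda) = sum_k h_k lambda^k evaluated at the sorted Laplacian eigenvalues (the penalty actually used by SR-GNN may differ), is sketched below.

```python
import torch

def spectral_response_penalty(h: torch.Tensor, eigvals: torch.Tensor) -> torch.Tensor:
    """Penalize variation of the filter response r(lambda) = sum_k h_k * lambda^k across
    the (sorted) Laplacian eigenvalues; flatter responses are less sensitive to
    perturbations of the graph spectrum."""
    powers = torch.stack([eigvals ** k for k in range(h.numel())])  # shape (K+1, N)
    response = h @ powers                                           # r(lambda_i), shape (N,)
    return (response[1:] - response[:-1]).pow(2).sum()
```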
- Edge Graph Neural Networks for Massive MIMO Detection [15.970981766599035]
Massive Multiple-Input Multiple-Output (MIMO) detection is an important problem in modern wireless communication systems.
While traditional Belief Propagation (BP) detectors perform poorly on loopy graphs, recent Graph Neural Network (GNN)-based methods can overcome the drawbacks of BP and achieve superior performance.
arXiv Detail & Related papers (2022-05-22T08:01:47Z)
- Sparsification and Filtering for Spatial-temporal GNN in Multivariate Time-series [0.0]
We propose an end-to-end architecture for multivariate time-series prediction that integrates a spatial-temporal graph neural network with a matrix filtering module.
This module generates filtered (inverse) correlation graphs from multivariate time series before inputting them into a GNN.
In contrast with existing sparsification methods adopted in graph neural networks, our model explicitly leverages time-series filtering to overcome the low signal-to-noise ratio typical of complex-systems data.
arXiv Detail & Related papers (2022-03-08T10:44:30Z)
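A minimal sketch of the kind of pre-processing step described in the entry above: build a correlation matrix from the multivariate series, apply a simple filter, and derive a sparse inverse-correlation graph for the GNN. The shrinkage filter and threshold below are illustrative choices, not the paper's exact matrix filtering module.

```python
import torch

def filtered_inverse_correlation_graph(X: torch.Tensor,
                                       shrinkage: float = 0.1,
                                       threshold: float = 0.05) -> torch.Tensor:
    """X: (num_series, num_timesteps). Returns a weighted adjacency built from a
    shrunk correlation matrix and its inverse (a precision-style graph)."""
    C = torch.corrcoef(X)                                          # empirical correlations
    C = (1.0 - shrinkage) * C + shrinkage * torch.eye(C.shape[0])  # simple shrinkage filter
    P = torch.linalg.inv(C)                                        # inverse correlation matrix
    A = P.abs()
    A.fill_diagonal_(0.0)                                          # drop self-loops
    A[A < threshold] = 0.0                                         # sparsify weak links
    return A
```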
- Space-Time Graph Neural Networks [104.55175325870195]
We introduce the space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
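The correspondence mentioned in the entry above is usually expressed through a graph signal denoising objective; a standard form (notation assumed here: X the input features, L a normalized graph Laplacian, c > 0 a smoothness weight) is

```latex
\min_{F}\;\|F - X\|_F^2 + c\,\mathrm{tr}\!\left(F^\top L F\right),
\qquad
\nabla_F = 2\,(F - X) + 2c\,L F .
```

One gradient step from F = X with step size 1/(2c) gives F = (I - L) X, i.e. a GCN-style neighborhood aggregation with the normalized adjacency I - L.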
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Stochastic Graph Neural Networks [123.39024384275054]
Graph neural networks (GNNs) model nonlinear representations in graph data with applications in distributed agent coordination, control, and planning.
Current GNN architectures assume ideal scenarios and ignore link fluctuations that occur due to environment, human factors, or external attacks.
In these situations, the GNN fails to address its distributed task if the topological randomness is not considered accordingly.
arXiv Detail & Related papers (2020-06-04T08:00:00Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
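The permutation-equivariance property referred to in the entry above has a compact standard statement; writing Phi(S; x) for a graph-filter-based GNN applied to graph shift operator S and signal x, and P for any permutation matrix (notation assumed, not taken from the listing):

```latex
\Phi\!\left(P^\top S P;\; P^\top x\right) \;=\; P^\top\,\Phi\!\left(S;\; x\right)
```

Relabeling the nodes of the graph simply relabels the output in the same way, which is the sense in which such architectures are permutation equivariant.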
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.