coVariance Neural Networks
- URL: http://arxiv.org/abs/2205.15856v1
- Date: Tue, 31 May 2022 15:04:43 GMT
- Title: coVariance Neural Networks
- Authors: Saurabh Sihag, Gonzalo Mateos, Corey McMillan, Alejandro Ribeiro
- Abstract summary: Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning.
We propose a GNN architecture, called coVariance neural network (VNN), that operates on sample covariance matrices as graphs.
We show that VNN performance is indeed more stable than PCA-based statistical approaches.
- Score: 119.45320143101381
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning. Principal
component analysis (PCA) involves the projection of data on the eigenspace of
the covariance matrix and draws similarities with the graph convolutional
filters in GNNs. Motivated by this observation, we propose a GNN architecture,
called coVariance neural network (VNN), that operates on sample covariance
matrices as graphs. We theoretically establish the stability of VNNs to
perturbations in the covariance matrix, thus implying an advantage over
standard PCA-based data analysis approaches that are prone to instability due
to principal components associated with close eigenvalues. Our experiments on
real-world datasets validate our theoretical results and show that VNN
performance is indeed more stable than PCA-based statistical approaches.
Moreover, our experiments on multi-resolution datasets also demonstrate that
VNNs are amenable to transferability of performance over covariance matrices of
different dimensions, a feature that is infeasible for PCA-based approaches.
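To make the PCA connection concrete, here is a minimal sketch of the construction the abstract describes, not the authors' released code: a coVariance filter is a polynomial in the sample covariance matrix applied to the data, and a VNN layer follows it with a pointwise nonlinearity. The function names and the choice of tanh are illustrative.

```python
import numpy as np

def covariance_filter(X, taps):
    """Apply the coVariance filter H(C) = sum_k h_k C^k to each row of X.

    X    : (n_samples, n_features) zero-mean data matrix.
    taps : filter coefficients [h_0, h_1, ..., h_K].
    """
    C = (X.T @ X) / X.shape[0]        # sample covariance matrix, used as the graph
    out = np.zeros_like(X, dtype=float)
    power = X.astype(float)           # rows times C^0
    for h in taps:
        out += h * power
        power = power @ C             # advance each row to the next power of C
    return out

def vnn_layer(X, taps):
    """One VNN layer: a coVariance filter followed by a pointwise nonlinearity."""
    return np.tanh(covariance_filter(X, taps))
```

In the eigenbasis of C this filter rescales the i-th principal component score by sum_k h_k lambda_i^k, so it processes exactly the subspaces PCA projects onto; because that response varies smoothly with lambda_i, close eigenvalues receive close weights, which is the intuition behind the stability advantage over PCA's hard selection of individual eigenvectors.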
Related papers
- Sparse Covariance Neural Networks [15.616852692528594]
We show that Sparse coVariance Neural Networks (S-VNNs) are more stable than nominal VNNs.
We show an improved task performance, stability, and computational efficiency of S-VNNs compared with nominal VNNs.
arXiv Detail & Related papers (2024-10-02T15:37:12Z)
- Spatiotemporal Covariance Neural Networks [10.855602842179621]
We introduce a relational learning model that operates on the sample covariance matrix of time series and leverages joint spatiotemporal convolutions to model the data.
We prove the STVNN is stable and improves over temporal PCA-based methods.
arXiv Detail & Related papers (2024-09-16T08:05:58Z)
- A Metadata-Driven Approach to Understand Graph Neural Networks [17.240017543449735]
We propose a metadata-driven approach to analyze the sensitivity of GNNs to graph data properties.
Our theoretical findings reveal that datasets with a more balanced degree distribution exhibit better linear separability of node representations.
arXiv Detail & Related papers (2023-10-30T04:25:02Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
- Transferability of coVariance Neural Networks and Application to Interpretable Brain Age Prediction using Anatomical Features [119.45320143101381]
Graph convolutional networks (GCNs) leverage topology-driven graph convolutional operations to combine information across the graph for inference tasks.
We have studied GCNs with covariance matrices as graphs in the form of coVariance neural networks (VNNs).
VNNs inherit the scale-free data processing architecture from GCNs, and here we show that VNNs exhibit transferability of performance over datasets whose covariance matrices converge to a limit object (see the sketch after this entry).
arXiv Detail & Related papers (2023-05-02T22:15:54Z)
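The transferability across dimensions noted in the entry above follows from the architecture: the learnable filter taps are scalars whose count is fixed by the filter order, not by the number of features, so the same taps can be applied to covariance matrices of any size. A minimal sketch of that property, with illustrative names and synthetic data:

```python
import numpy as np

def covariance_filter(X, taps):
    """H(C) = sum_k h_k C^k applied row-wise; the taps are dimension-agnostic."""
    C = (X.T @ X) / X.shape[0]
    out, power = np.zeros_like(X, dtype=float), X.astype(float)
    for h in taps:
        out += h * power
        power = power @ C
    return out

rng = np.random.default_rng(0)
taps = [0.5, 0.3, 0.1]   # three parameters, regardless of the data dimension

# The same taps apply unchanged to datasets with different feature counts,
# e.g. coarse- and fine-resolution versions of the same measurements.
out_coarse = covariance_filter(rng.standard_normal((200, 50)), taps)   # 50 features
out_fine = covariance_filter(rng.standard_normal((200, 120)), taps)    # 120 features
```

A PCA-based pipeline, by contrast, learns projections tied to the eigenvectors of one fixed-size covariance matrix, so its downstream model cannot be reused when the dimension changes.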
- Batch-Ensemble Stochastic Neural Networks for Out-of-Distribution Detection [55.028065567756066]
Out-of-distribution (OOD) detection has recently received much attention from the machine learning community due to its importance in deploying machine learning models in real-world applications.
In this paper we propose an uncertainty quantification approach by modelling the distribution of features.
We incorporate an efficient ensemble mechanism, namely batch-ensemble (sketched after this entry), to construct batch-ensemble stochastic neural networks (BE-SNNs) and overcome the feature collapse problem.
We show that BE-SNNs yield superior performance on several OOD benchmarks, such as the Two-Moons dataset, the FashionMNIST vs MNIST dataset, and the FashionMNIST vs NotMNIST dataset.
arXiv Detail & Related papers (2022-06-26T16:00:22Z)
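As context for the batch-ensemble mechanism named in the entry above: in the original BatchEnsemble scheme of Wen et al., each ensemble member shares a single weight matrix and adds only a rank-one multiplicative correction, so an M-member ensemble costs little more than one network. A minimal sketch under that reading, with illustrative names; this is not the BE-SNN authors' implementation:

```python
import numpy as np

def batch_ensemble_linear(x, W, r, s, b):
    """Linear layer of ensemble member i with effective weight W * (r s^T).

    The elementwise rank-one modulation is computed cheaply as
    ((x * s) @ W.T) * r instead of materializing a weight matrix per member.
    """
    return ((x * s) @ W.T) * r + b

rng = np.random.default_rng(0)
d_in, d_out, members = 8, 4, 3
W = rng.standard_normal((d_out, d_in))      # one weight matrix, shared by all
R = rng.standard_normal((members, d_out))   # per-member output scaling vectors
S = rng.standard_normal((members, d_in))    # per-member input scaling vectors
b = np.zeros(d_out)

x = rng.standard_normal((16, d_in))         # a batch of 16 inputs
outputs = [batch_ensemble_linear(x, W, R[i], S[i], b) for i in range(members)]
```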
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- A Coupled CP Decomposition for Principal Components Analysis of Symmetric Networks [11.988825533369686]
We propose a principal components analysis (PCA) framework for sequence network data.
We derive efficient algorithms for computing our proposed "Coupled CP" decomposition.
We demonstrate the effectiveness of our proposal on simulated data and on examples from political science and financial economics.
arXiv Detail & Related papers (2022-02-09T20:52:19Z)
- Stability of Neural Networks on Manifolds to Relative Perturbations [118.84154142918214]
Graph Neural Networks (GNNs) show impressive performance in many practical scenarios.
GNNs are observed to scale well on large graphs, yet existing stability bounds grow with the number of nodes, which is at odds with that scalability.
arXiv Detail & Related papers (2021-10-10T04:37:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.