Graph Polynomial Convolution Models for Node Classification of
Non-Homophilous Graphs
- URL: http://arxiv.org/abs/2209.05020v1
- Date: Mon, 12 Sep 2022 04:46:55 GMT
- Title: Graph Polynomial Convolution Models for Node Classification of
Non-Homophilous Graphs
- Authors: Kishan Wimalawarne and Taiji Suzuki
- Abstract summary: We investigate efficient learning from higher-order graph convolution and learning directly from adjacency matrices for node classification.
We show that the resulting model leads to new graph convolution models expressed as a polynomial of the normalized adjacency matrix, the residual weight matrix, and the residual scaling parameter.
We demonstrate that the proposed methods obtain improved accuracy for node classification of non-homophilous graphs.
- Score: 52.52570805621925
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate efficient learning from higher-order graph convolution and
learning directly from adjacency matrices for node classification. We revisit
the scaled graph residual network, remove the ReLU activation from the residual
layers, and apply a single weight matrix at each residual layer. We show that
the resulting model leads to new graph convolution models expressed as a polynomial of the
normalized adjacency matrix, the residual weight matrix, and the residual
scaling parameter. Additionally, we propose adaptive learning between
graph polynomial convolution models and learning directly from the adjacency
matrix. Furthermore, we propose fully adaptive models that learn the scaling
parameters at each residual layer. We show that the generalization bounds of the
proposed methods are bounded by a polynomial of the eigenvalue spectrum, the scaling
parameters, and upper bounds on the residual weights. Based on this theoretical analysis, we
argue that the proposed models can obtain improved generalization bounds by
limiting the higher orders of convolution and by learning directly from the
adjacency matrix. Using a wide range of real-world datasets, we demonstrate that the
proposed methods obtain improved accuracy for node classification of
non-homophilous graphs.
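To make the polynomial form concrete, here is a minimal numerical sketch (not the authors' implementation). It assumes a ReLU-free scaled residual update H^(l+1) = alpha * H^(l) + A_hat H^(l) W with a single shared weight matrix W and a fixed scaling parameter alpha; the exact layer definition and coefficients in the paper may differ. The sketch checks that unrolling L such layers equals a polynomial in the normalized adjacency matrix A_hat, the residual weight matrix W, and alpha.

```python
import numpy as np
from math import comb

# Toy graph: undirected adjacency with self-loops and GCN-style normalization.
rng = np.random.default_rng(0)
n, d = 6, 4                                   # nodes, feature dimension (illustrative)
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T + np.eye(n)                       # symmetrize and add self-loops
d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = d_inv_sqrt @ A @ d_inv_sqrt           # normalized adjacency matrix

X = rng.standard_normal((n, d))               # node features
W = 0.1 * rng.standard_normal((d, d))         # single shared residual weight matrix
alpha, L = 0.5, 3                             # residual scaling parameter, number of layers

# Iterative form: ReLU-free scaled residual layers (assumed update rule)
#   H^(l+1) = alpha * H^(l) + A_hat @ H^(l) @ W
H = X.copy()
for _ in range(L):
    H = alpha * H + A_hat @ H @ W

# Closed form obtained by unrolling the recursion: a polynomial of A_hat, W, and alpha,
#   H^(L) = sum_{k=0}^{L} C(L, k) * alpha^(L-k) * A_hat^k @ X @ W^k
H_poly = sum(
    comb(L, k) * alpha ** (L - k)
    * np.linalg.matrix_power(A_hat, k) @ X @ np.linalg.matrix_power(W, k)
    for k in range(L + 1)
)

print(np.allclose(H, H_poly))                 # True: both computations agree
```

Because left-multiplication by A_hat and right-multiplication by W commute, the binomial expansion holds exactly once the nonlinearity is removed, which is what turns the residual network into a graph polynomial convolution model.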
Related papers
- Polynomial Graphical Lasso: Learning Edges from Gaussian Graph-Stationary Signals [18.45931641798935]
This paper introduces Polynomial Graphical Lasso (PGL), a new approach to learning graph structures from nodal signals.
Our key contribution lies in modeling the signals as Gaussian and stationary on the graph, enabling the development of a graph-learning lasso-type formulation.
arXiv Detail & Related papers (2024-04-03T10:19:53Z) - OrthoReg: Improving Graph-regularized MLPs via Orthogonality
Regularization [66.30021126251725]
Graph Neural Networks (GNNs) are currently dominating in modeling graph-structured data.
Graph-regularized MLPs (GR-MLPs) implicitly inject the graph structure information into model weights, but their performance can hardly match that of GNNs in most tasks.
We show that GR-MLPs suffer from dimensional collapse, a phenomenon in which the largest few eigenvalues dominate the embedding space.
We propose OrthoReg, a novel GR-MLP model to mitigate the dimensional collapse issue (a generic sketch of such an orthogonality penalty appears after this list).
arXiv Detail & Related papers (2023-01-31T21:20:48Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z) - T-LoHo: A Bayesian Regularization Model for Structured Sparsity and
Smoothness on Graphs [0.0]
In graph-structured data, structured sparsity and smoothness tend to cluster together.
We propose a new prior for high dimensional parameters with graphical relations.
We use it to detect structured sparsity and smoothness simultaneously.
arXiv Detail & Related papers (2021-07-06T10:10:03Z) - Regularization of Mixture Models for Robust Principal Graph Learning [0.0]
A regularized version of Mixture Models is proposed to learn a principal graph from a distribution of $D$-dimensional data points.
Parameters of the model are iteratively estimated through an Expectation-Maximization procedure.
arXiv Detail & Related papers (2021-06-16T18:00:02Z) - Nonparametric Trace Regression in High Dimensions via Sign Series
Representation [13.37650464374017]
We develop a framework for nonparametric trace regression models via structured sign series representations of high dimensional functions.
In the context of matrix completion, our framework leads to a substantially richer model based on what we coin as the "sign rank" of a matrix.
arXiv Detail & Related papers (2021-05-04T22:20:00Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to a well-clustered representative graph.
We also derive a clustering algorithm for solving clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
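Referring back to the OrthoReg entry above, the following is a generic sketch of an orthogonality penalty on node embeddings, illustrating one common way to discourage dimensional collapse by pushing the embedding second-moment matrix toward the identity. The function name and exact formulation are illustrative assumptions, not necessarily the regularizer used in the OrthoReg paper.

```python
import numpy as np

def orthogonality_penalty(Z: np.ndarray) -> float:
    """Generic orthogonality regularizer on node embeddings Z of shape (n, d).

    Penalizes deviation of the embedding second-moment matrix from the identity,
    discouraging a few directions from dominating the embedding space
    (i.e., dimensional collapse). Illustrative sketch only.
    """
    n, d = Z.shape
    C = Z.T @ Z / n                           # (d, d) second-moment matrix
    return float(np.linalg.norm(C - np.eye(d), "fro") ** 2)

# Example usage: add the penalty to a task loss with a small weight.
Z = np.random.randn(100, 16)                  # hypothetical node embeddings
reg = 1e-3 * orthogonality_penalty(Z)
```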