Graph Convolutional Network with Generalized Factorized Bilinear
Aggregation
- URL: http://arxiv.org/abs/2107.11666v1
- Date: Sat, 24 Jul 2021 17:57:06 GMT
- Title: Graph Convolutional Network with Generalized Factorized Bilinear
Aggregation
- Authors: Hao Zhu, Piotr Koniusz
- Abstract summary: We propose a novel generalization of the Factorized Bilinear (FB) layer to model feature interactions in Graph Convolutional Networks (GCNs).
Our experimental results on multiple datasets demonstrate that the GFB-GCN is competitive with other methods for text classification.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although Graph Convolutional Networks (GCNs) have demonstrated their power in
various applications, the graph convolutional layers, as the most important
component of GCN, are still using linear transformations and a simple pooling
step. In this paper, we propose a novel generalization of the Factorized
Bilinear (FB) layer to model feature interactions in GCNs. An FB layer
performs two matrix-vector multiplications: the weight matrix is multiplied
on both sides by the vector of hidden features, i.e., it weights the outer
product of that vector with itself. However, the FB layer suffers from a
quadratic number of coefficients, from overfitting, and from spurious
correlations caused by correlations between channels of the hidden
representations, which violate the i.i.d. assumption. Thus, we propose a
compact FB layer by defining a family of summarizing operators applied over
the quadratic term. We analyze the proposed pooling operators and motivate
their use.
Our experimental results on multiple datasets demonstrate that the GFB-GCN is
competitive with other methods for text classification.
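As a rough illustration of the idea, here is a minimal NumPy sketch of a factorized bilinear term with a pluggable summarizing operator. The function name, shapes, and choice of operators are assumptions for illustration, not the paper's exact GFB-GCN formulation.

```python
import numpy as np

def gfb_term(x, W, F, b, summarize=np.sum):
    """Hedged sketch of a generalized factorized bilinear output for one node.

    x : (d,)      hidden feature vector of a node
    W : (c, d)    linear weights (c output channels)
    F : (c, k, d) rank-k factor matrices, one per output channel
    b : (c,)      bias
    summarize     operator applied over the k-dimensional quadratic term;
                  np.sum recovers the plain FB quadratic form, since
                  x^T F_j^T F_j x = sum((F_j @ x) ** 2)
    """
    z = F @ x                        # (c, k) factor projections of x
    quad = summarize(z * z, axis=1)  # summarizing operator over the quadratic term
    return W @ x + quad + b          # linear part + (generalized) bilinear part
```

With `summarize=np.sum` this reduces to the standard FB quadratic term; swapping in `np.max` or `np.mean` illustrates the "family of summarizing operators" idea, though the operators actually analyzed in the paper may differ.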
Related papers
- Understanding the Effect of GCN Convolutions in Regression Tasks [8.299692647308323]
Graph Convolutional Networks (GCNs) have become a pivotal method in machine learning for modeling functions over graphs.
This paper provides a formal analysis of the impact of convolution operators on regression tasks over homophilic networks.
arXiv Detail & Related papers (2024-10-26T04:19:52Z)
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Graph Edge Representation via Tensor Product Graph Convolutional Representation [23.021660625582854]
This paper defines an effective convolution operator on graphs with edge features, named Tensor Product Graph Convolution (TPGC).
It provides a complementary model to traditional graph convolutions (GCs) to address the more general graph data analysis with both node and edge features.
Experimental results on several graph learning tasks demonstrate the effectiveness of the proposed TPGC.
arXiv Detail & Related papers (2024-06-21T03:21:26Z)
- Binary Graph Convolutional Network with Capacity Exploration [58.99478502486377]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node attributes.
Our Bi-GCN can reduce the memory consumption by an average of 31x for both the network parameters and input data, and accelerate the inference speed by an average of 51x.
arXiv Detail & Related papers (2022-10-24T12:05:17Z)
- Graph Spectral Embedding using the Geodesic Betweeness Centrality [76.27138343125985]
We introduce the Graph Sylvester Embedding (GSE), an unsupervised graph representation of local similarity, connectivity, and global structure.
GSE uses the solution of the Sylvester equation to capture both network structure and neighborhood proximity in a single representation.
arXiv Detail & Related papers (2022-05-07T04:11:23Z)
- Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce FactorGCN, a novel graph convolutional network (GCN) that explicitly disentangles intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input, and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on the synthetic and real-world datasets.
arXiv Detail & Related papers (2020-10-12T03:01:40Z)
- Dual-constrained Deep Semi-Supervised Coupled Factorization Network with
Enriched Prior [80.5637175255349]
We propose a new enriched prior based Dual-constrained Deep Semi-Supervised Coupled Factorization Network, called DS2CF-Net.
To extract hidden deep features, DS2CF-Net is modeled as a deep-structure and geometrical structure-constrained neural network.
Our network can obtain state-of-the-art performance for representation learning and clustering.
arXiv Detail & Related papers (2020-09-08T13:10:21Z)
- DiffGCN: Graph Convolutional Networks via Differential Operators and
Algebraic Multigrid Pooling [7.23389716633927]
Graph Convolutional Networks (GCNs) have been shown to be effective in handling unordered data like point clouds and meshes.
We propose novel approaches for graph convolution, pooling and unpooling, inspired from finite differences and algebraic multigrid frameworks.
arXiv Detail & Related papers (2020-06-07T11:08:37Z)
- Revisiting Graph based Collaborative Filtering: A Linear Residual Graph
Convolutional Network Approach [55.44107800525776]
Graph Convolutional Networks (GCNs) are state-of-the-art graph based representation learning models.
In this paper, we revisit GCN-based Collaborative Filtering (CF) Recommender Systems (RS).
We show that removing non-linearities enhances recommendation performance, consistent with the theory of simple graph convolutional networks.
We propose a residual network structure that is specifically designed for CF with user-item interaction modeling.
arXiv Detail & Related papers (2020-01-28T04:41:25Z)
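The linearization idea in the last entry can be sketched in a few lines: propagate features through a symmetrically normalized adjacency matrix with no activation functions, then aggregate all propagation depths with a residual-style mean. This is a hedged NumPy sketch of the general idea, not the paper's exact model; the normalization scheme and the mean-over-depths aggregation are assumptions.

```python
import numpy as np

def linear_residual_propagate(adj, X, K=3):
    """Linear (non-linearity-free) graph propagation with residual aggregation.

    adj : (n, n) symmetric adjacency matrix
    X   : (n, d) node features
    K   : number of propagation steps
    Returns the mean of the embeddings from every depth 0..K.
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    A_hat = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]  # D^-1/2 A D^-1/2
    H, layers = X, [X]
    for _ in range(K):
        H = A_hat @ H        # one linear propagation step, no activation
        layers.append(H)
    return np.mean(layers, axis=0)  # residual-style mean over all depths
```

Averaging over depths keeps the contribution of shallow (including raw) embeddings in the final representation, which is one simple way to realize a residual structure without any non-linear layers.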
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.