Equivariance-bridged SO(2)-Invariant Representation Learning using Graph
Convolutional Network
- URL: http://arxiv.org/abs/2106.09996v1
- Date: Fri, 18 Jun 2021 08:37:45 GMT
- Title: Equivariance-bridged SO(2)-Invariant Representation Learning using Graph
Convolutional Network
- Authors: Sungwon Hwang, Hyungtae Lim and Hyun Myung
- Abstract summary: Training a Convolutional Neural Network (CNN) to be robust against rotation has mostly been done with data augmentation.
This paper instead encourages less dependence on data augmentation by achieving structural rotational invariance of the network.
Our method achieves state-of-the-art image classification performance on rotated MNIST and CIFAR-10 images.
- Score: 0.1657441317977376
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Training a Convolutional Neural Network (CNN) to be robust against rotation
has mostly been done with data augmentation. In this paper, an alternative
research direction is highlighted to encourage less dependence on data
augmentation by achieving structural rotational invariance of a network. The
deep equivariance-bridged SO(2)-invariant network is proposed to realize this
vision. First, the Self-Weighted Nearest Neighbors Graph Convolutional Network
(SWN-GCN) is proposed, which applies a Graph Convolutional Network (GCN) to the
graph representation of an image to acquire a rotationally equivariant
representation, as GCNs are more suitable for constructing deeper networks than
spectral graph convolution-based approaches. Then, an invariant representation is
eventually obtained with Global Average Pooling (GAP), a permutation-invariant
operation suitable for aggregating high-dimensional representations, over the
equivariant set of vertices retrieved from SWN-GCN. Our method achieves
state-of-the-art image classification performance on rotated MNIST and
CIFAR-10 images, with models trained only on non-augmented datasets.
Quantitative evaluations also demonstrate strong rotational invariance of the
deep representations produced by SWN-GCN.
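The equivariance-to-invariance bridge described in the abstract can be illustrated with a minimal sketch. This uses a generic symmetrically normalized GCN layer in plain NumPy, not the authors' actual SWN-GCN: a permutation of the graph's vertices (such as one induced by rotating the image grid) permutes the equivariant node features, and Global Average Pooling over vertices then removes that permutation, yielding an identical invariant embedding.

```python
import numpy as np

def gcn_layer(A, X, W):
    # Symmetric-normalized message passing: a generic GCN step,
    # not the paper's exact SWN-GCN formulation.
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

def invariant_embedding(A, X, W):
    # Global Average Pooling over vertices collapses the
    # permutation-equivariant node features into one vector.
    H = gcn_layer(A, X, W)
    return H.mean(axis=0)

rng = np.random.default_rng(0)
n, f_in, f_out = 6, 4, 3
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T                  # undirected adjacency
X = rng.standard_normal((n, f_in))
W = rng.standard_normal((f_in, f_out))

P = np.eye(n)[rng.permutation(n)]               # node relabelling
z1 = invariant_embedding(A, X, W)
z2 = invariant_embedding(P @ A @ P.T, P @ X, W)
assert np.allclose(z1, z2)                      # GAP erases the permutation
```

The GCN layer is equivariant (permuting the input permutes the rows of its output), and the mean over rows is permutation-invariant, so their composition is invariant.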
Related papers
- Integrating Graph Neural Networks with Scattering Transform for Anomaly Detection [0.0]
We present two novel methods for Network Intrusion Detection Systems (NIDS) using Graph Neural Networks (GNNs).
The first approach, Scattering Transform with E-GraphSAGE (STEG), utilizes the scattering transform to conduct multi-resolution analysis of edge feature vectors.
The second approach improves node representation by initializing with Node2Vec, diverging from standard methods that use uniform values.
arXiv Detail & Related papers (2024-04-16T00:02:12Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Revisiting Transformation Invariant Geometric Deep Learning: Are Initial Representations All You Need? [80.86819657126041]
We show that transformation-invariant and distance-preserving initial representations are sufficient to achieve transformation invariance.
Specifically, we realize transformation-invariant and distance-preserving initial point representations by modifying multi-dimensional scaling.
We prove that TinvNN can strictly guarantee transformation invariance, and that it is general and flexible enough to be combined with existing neural networks.
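The idea of distance-preserving initial representations can be sketched with classical multi-dimensional scaling. This is a minimal NumPy illustration, not the paper's modified MDS: pairwise distances are unchanged by rotation and translation, and classical MDS recovers coordinates that preserve those distances.

```python
import numpy as np

def pairwise_dist(X):
    # Euclidean distance matrix of a point set.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.sqrt(np.maximum(d2, 0.0))

def classical_mds(D, k):
    # Classical MDS: recover k-dimensional coordinates
    # (up to a rigid motion) from a Euclidean distance matrix.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J             # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]           # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

rng = np.random.default_rng(1)
P = rng.standard_normal((8, 3))             # a toy point cloud
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
Q = P @ R.T + np.array([5.0, -2.0, 1.0])    # rotated + translated copy

D_p, D_q = pairwise_dist(P), pairwise_dist(Q)
assert np.allclose(D_p, D_q)                # distances are transformation-invariant
Z = classical_mds(D_p, 3)
assert np.allclose(pairwise_dist(Z), D_p)   # the MDS embedding preserves them
```

Because the distance matrix is identical for any rotated or translated copy of the input, any representation computed from it alone is transformation-invariant by construction.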
arXiv Detail & Related papers (2021-12-23T03:52:33Z)
- Orthogonal Graph Neural Networks [53.466187667936026]
Graph neural networks (GNNs) have received tremendous attention due to their superiority in learning node representations.
However, stacking more convolutional layers significantly decreases the performance of GNNs.
We propose a novel Ortho-GConv, which can generally augment existing GNN backbones to stabilize model training and improve the model's generalization performance.
arXiv Detail & Related papers (2021-09-23T12:39:01Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- SLGCN: Structure Learning Graph Convolutional Networks for Graphs under Heterophily [5.619890178124606]
We propose structure learning graph convolutional networks (SLGCNs) to alleviate the issue from two aspects.
Specifically, we design an efficient-spectral-clustering-with-anchors (ESC-ANCH) approach to efficiently aggregate feature representations from all similar nodes.
Experimental results on a wide range of benchmark datasets illustrate that the proposed SLGCNs outperform the state-of-the-art GNN counterparts.
arXiv Detail & Related papers (2021-05-28T13:00:38Z)
- Self-Supervised Graph Representation Learning via Topology Transformations [61.870882736758624]
We present the Topology Transformation Equivariant Representation learning, a general paradigm of self-supervised learning for node representations of graph data.
In experiments, we apply the proposed model to the downstream node and graph classification tasks, and results show that the proposed method outperforms the state-of-the-art unsupervised approaches.
arXiv Detail & Related papers (2021-05-25T06:11:03Z)
- Action Recognition with Kernel-based Graph Convolutional Networks [14.924672048447338]
Learning graph convolutional networks (GCNs) aims at generalizing deep learning to arbitrary non-regular domains.
We introduce a novel GCN framework that achieves spatial graph convolution in a reproducing kernel Hilbert space (RKHS).
The particularity of our GCN model also resides in its ability to achieve convolutions without explicitly realigning nodes in the receptive fields of the learned graph filters with those of the input graphs.
arXiv Detail & Related papers (2020-12-28T11:02:51Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
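A toy version of this denoising view, under the commonly used formulation with the symmetric-normalized Laplacian (not necessarily the paper's exact objective): one gradient step on min_F ||F - X||^2 + c * tr(F^T L F), started from F = X, reproduces a single normalized-adjacency aggregation.

```python
import numpy as np

rng = np.random.default_rng(2)
n, f = 5, 3
A = (rng.random((n, n)) < 0.6).astype(float)
A = np.triu(A, 1); A = A + A.T + np.eye(n)    # adjacency with self-loops
d = A.sum(1)
A_norm = np.diag(d ** -0.5) @ A @ np.diag(d ** -0.5)
L = np.eye(n) - A_norm                        # symmetric-normalized Laplacian
X = rng.standard_normal((n, f))               # noisy node signal

# Denoising objective: min_F ||F - X||_F^2 + c * tr(F^T L F).
# Gradient w.r.t. F is 2(F - X) + 2c * L @ F; at F = X it is 2c * L @ X.
c, step = 1.0, 0.5
grad = 2 * (X - X) + 2 * c * (L @ X)
F = X - step * grad

# With 2 * step * c = 1, the update equals one GCN-style aggregation.
assert np.allclose(F, A_norm @ X)
```

The step size and smoothness weight here are chosen so the algebra closes exactly; other choices interpolate between the raw signal and its graph-smoothed version.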
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Group Equivariant Generative Adversarial Networks [7.734726150561089]
In this work, we explicitly incorporate inductive symmetry priors into the network architectures via group-equivariant convolutional networks.
Group-equivariant convolutions have higher expressive power with fewer samples and lead to better gradient feedback between the generator and discriminator.
arXiv Detail & Related papers (2020-05-04T17:38:49Z)
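The benefit of baking in such symmetry priors can be made concrete with a small C4 (90-degree rotation) lifting-convolution check in NumPy. This is a generic illustration of group-equivariant convolution, not the paper's GAN architecture: correlating an image with all four rotations of one filter produces a feature stack, and rotating the input merely rotates each response plane and cyclically shifts the stack.

```python
import numpy as np

def corr2d(x, k):
    # Valid 2-D cross-correlation.
    H, W = x.shape; h, w = k.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + h, j:j + w] * k).sum()
    return out

def c4_lift(x, k):
    # Correlate with all four 90-degree rotations of the filter.
    return np.stack([corr2d(x, np.rot90(k, r)) for r in range(4)])

rng = np.random.default_rng(3)
x = rng.standard_normal((6, 6))     # square input so rot90 keeps shapes
k = rng.standard_normal((3, 3))

y = c4_lift(x, k)
y_rot = c4_lift(np.rot90(x), k)
for r in range(4):
    # Rotating the input rotates each plane and shifts channels by one:
    # the defining equivariance property of a C4-lifted convolution.
    assert np.allclose(y_rot[r], np.rot90(y[(r - 1) % 4]))
```

No rotated training samples are needed: the filter's response to every 90-degree pose is computed by construction, which is exactly the sample-efficiency argument made above.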
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.