Spatio-Temporal driven Attention Graph Neural Network with Block
Adjacency matrix (STAG-NN-BA)
- URL: http://arxiv.org/abs/2303.14322v1
- Date: Sat, 25 Mar 2023 01:26:50 GMT
- Title: Spatio-Temporal driven Attention Graph Neural Network with Block
Adjacency matrix (STAG-NN-BA)
- Authors: U. Nazir, W. Islam, M. Taj
- Abstract summary: We propose a Graph Neural Network architecture for spatial and spatio-temporal classification using satellite imagery.
Instead of classifying each pixel, we propose a method based on Simple Linear Iterative Clustering (SLIC) image segmentation and Graph Attention Networks (GAT).
The code and dataset will be made public via our GitHub repository.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite the recent advances in deep neural networks, standard convolutional
kernels limit the applications of these networks to the Euclidean domain only.
Considering the geodesic nature of the measurement of the earth's surface,
remote sensing is one such area that can benefit from non-Euclidean and
spherical domains. For this purpose, we propose a novel Graph Neural Network
architecture for spatial and spatio-temporal classification using satellite
imagery. We propose a hybrid attention method to learn the relative importance
of irregular neighbors in remote sensing data. Instead of classifying each
pixel, we propose a method based on Simple Linear Iterative Clustering (SLIC)
image segmentation and Graph Attention Networks (GAT). The superpixels obtained from SLIC
become the nodes of our Graph Convolution Network (GCN). We then construct a
region adjacency graph (RAG) where each superpixel is connected to every other
adjacent superpixel in the image, enabling information to propagate globally.
Finally, we propose a Spatially driven Attention Graph Neural Network (SAG-NN)
to classify each RAG. We also propose an extension to our SAG-NN for
spatio-temporal data. Unlike regular grids of pixels in images, superpixels are
irregular in nature and cannot be used to create spatio-temporal graphs. We
introduce temporal bias by combining unconnected RAGs from each image into one
supergraph. This is achieved by introducing block adjacency matrices resulting
in a novel Spatio-Temporal driven Attention Graph Neural Network with Block
Adjacency matrix (STAG-NN-BA). We evaluate our proposed methods on two remote
sensing datasets, namely Asia14 and C2D2. In comparison with both non-graph and
graph-based approaches, our SAG-NN and STAG-NN-BA achieved superior accuracy on
all the datasets while incurring less computation cost. The code and dataset
will be made public via our GitHub repository.
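As a rough illustration of the pipeline the abstract describes (the authors' code is only announced, not reproduced here), the sketch below builds SLIC superpixels, links adjacent superpixels into a region adjacency graph, and stacks per-image RAGs into one block-adjacency supergraph. All function names, feature choices, and parameters are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch: SLIC superpixels -> region adjacency graph (RAG) per
# image, then a block-diagonal "supergraph" adjacency over an image sequence.
import numpy as np
from scipy.sparse import coo_matrix, block_diag
from skimage.segmentation import slic


def image_to_rag(image, n_segments=100):
    """Return (node_features, adjacency) for a single H x W x 3 image."""
    labels = slic(image, n_segments=n_segments, compactness=10)
    ids = np.unique(labels)
    remap = {lab: i for i, lab in enumerate(ids)}  # superpixel label -> node id
    n = len(ids)

    # Node features: mean colour per superpixel (a simple stand-in for whatever
    # per-region features the model actually consumes).
    feats = np.stack([image[labels == lab].mean(axis=0) for lab in ids])

    # Edges: superpixels whose pixels touch horizontally or vertically.
    rows, cols = [], []
    for a, b in ((labels[:, :-1], labels[:, 1:]), (labels[:-1, :], labels[1:, :])):
        boundary = a != b
        for u, v in zip(a[boundary], b[boundary]):
            rows.append(remap[u])
            cols.append(remap[v])
    adj = coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
    adj = ((adj + adj.T) > 0).astype(float)  # symmetric, unweighted RAG
    return feats, adj


def temporal_supergraph(images):
    """Stack per-image RAGs into one block-adjacency supergraph."""
    feats, adjs = zip(*(image_to_rag(img) for img in images))
    return np.vstack(feats), block_diag(adjs, format="csr")
```

The attention networks themselves (SAG-NN, STAG-NN-BA) would then operate on the stacked features and the block adjacency; how attention is computed over this supergraph is specific to the paper and not sketched here.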
Related papers
- Representation Learning on Heterophilic Graph with Directional Neighborhood Attention [8.493802098034255]
The Graph Attention Network (GAT) is one of the most popular Graph Neural Network (GNN) architectures.
GAT lacks the ability to capture long-range and global graph information, leading to unsatisfactory performance on some datasets.
We propose Directional Graph Attention Network (DGAT) to combine the feature-based attention with the global directional information extracted from the graph topology.
arXiv Detail & Related papers (2024-03-03T10:59:16Z)
- EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called Embedding-Induced Graph Refinement Clustering Network (EGRC-Net).
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z)
- Unsupervised Image Semantic Segmentation through Superpixels and Graph Neural Networks [6.123324869194195]
Unsupervised image segmentation is an important task in many real-world scenarios where labelled data is scarce.
We propose a novel approach that harnesses recent advances in unsupervised learning using a combination of Mutual Information Maximization (MIM), Neural Superpixel and Graph Neural Networks (GNNs) in an end-to-end manner.
arXiv Detail & Related papers (2022-10-21T08:35:18Z)
- Multi-Level Graph Convolutional Network with Automatic Graph Learning for Hyperspectral Image Classification [63.56018768401328]
We propose a Multi-level Graph Convolutional Network (GCN) with Automatic Graph Learning method (MGCN-AGL) for HSI classification.
By employing an attention mechanism to characterize the importance of spatially neighboring regions, the most relevant information can be adaptively incorporated into decisions.
Our MGCN-AGL encodes long-range dependencies among image regions based on the expressive representations produced at the local level.
arXiv Detail & Related papers (2020-09-19T09:26:20Z)
- TreeRNN: Topology-Preserving Deep Graph Embedding and Learning [24.04035265351755]
We study methods to transfer graphs into trees so that explicit orders can be learned to direct feature integration from local to global.
To best learn the patterns from the graph-tree-images, we propose TreeRNN, a 2D RNN architecture that recurrently integrates the image pixels by rows and columns to help classify the graph categories.
arXiv Detail & Related papers (2020-06-21T15:22:24Z)
- Isometric Graph Neural Networks [5.306334746787569]
We propose a technique to learn Isometric Graph Neural Networks (IGNN).
IGNN requires changing the input representation space and loss function to enable any GNN algorithm to generate representations that reflect distances between nodes.
We observe a consistent and substantial improvement, as high as 400%, in Kendall's Tau (KT).
arXiv Detail & Related papers (2020-06-16T22:51:13Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- High-Order Information Matters: Learning Relation and Topology for Occluded Person Re-Identification [84.43394420267794]
We propose a novel framework by learning high-order relation and topology information for discriminative features and robust alignment.
Our framework significantly outperforms the state-of-the-art by 6.5% mAP on the Occluded-Duke dataset.
arXiv Detail & Related papers (2020-03-18T12:18:35Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Superpixel Image Classification with Graph Attention Networks [4.714325419968082]
This paper presents a methodology for image classification using Graph Neural Network (GNN) models.
We transform the input images into region adjacency graphs (RAGs), in which regions are superpixels and edges connect neighboring superpixels.
Experiments suggest that Graph Attention Networks (GATs), which combine graph convolutions with self-attention mechanisms, outperform other GNN models (a minimal GAT-on-RAG sketch follows this list).
arXiv Detail & Related papers (2020-02-13T14:52:32Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
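Complementing the construction sketched after the abstract, and the "Superpixel Image Classification with Graph Attention Networks" entry above, here is a minimal, hypothetical graph-level classifier over a superpixel RAG using PyTorch Geometric's GATConv, which implements the standard GAT attention coefficient alpha_ij = softmax_j(LeakyReLU(a^T [W h_i || W h_j])). It is not SAG-NN or the cited paper's model; layer widths, pooling, and the class count are placeholder choices.

```python
# Hypothetical graph-level classifier over a superpixel RAG using graph
# attention layers (PyTorch Geometric). Not the SAG-NN architecture itself.
import torch
from torch import nn
from torch_geometric.nn import GATConv, global_mean_pool


class RAGClassifier(nn.Module):
    def __init__(self, in_dim, num_classes, hidden=64, heads=4):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden, heads=heads)     # multi-head attention
        self.gat2 = GATConv(hidden * heads, hidden, heads=1)  # collapse heads
        self.head = nn.Linear(hidden, num_classes)            # graph-level logits

    def forward(self, x, edge_index, batch):
        # x:          [num_nodes, in_dim]  superpixel features (e.g. mean colour)
        # edge_index: [2, num_edges]       RAG connectivity
        # batch:      [num_nodes]          graph id of each node (one RAG per image)
        h = torch.relu(self.gat1(x, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        h = global_mean_pool(h, batch)                         # one vector per RAG
        return self.head(h)
```

The edge_index tensor can be derived from the sparse adjacency in the earlier sketch, e.g. torch.as_tensor(np.vstack(adj.nonzero()), dtype=torch.long), and batch is all zeros when classifying a single RAG.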