Time Gated Convolutional Neural Networks for Crop Classification
- URL: http://arxiv.org/abs/2206.09756v1
- Date: Mon, 20 Jun 2022 13:05:29 GMT
- Title: Time Gated Convolutional Neural Networks for Crop Classification
- Authors: Longlong Weng, Yashu Kang, Kezhao Jiang, Chunlei Chen
- Abstract summary: The paper proposes a state-of-the-art framework, the Time Gated Convolutional Neural Network (TGCNN).
TGCNN takes advantage of temporal information and gating mechanisms for the crop classification problem.
Our experiments demonstrate that TGCNN is advantageous in this earth observation time series classification task.
- Score: 0.9176056742068814
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a state-of-the-art framework, the Time Gated
Convolutional Neural Network (TGCNN), which takes advantage of temporal
information and gating mechanisms for the crop classification problem. In
addition, several vegetation indices were constructed to expand the dimensions
of the input data and exploit spectral information. Both spatial (channel-wise)
and temporal (step-wise) correlations are considered in TGCNN. Specifically,
our preliminary analysis indicates that step-wise information is of greater
importance in this data set. Lastly, the gating mechanism helps capture
high-order relationships. Our TGCNN solution achieves a $0.973$ F1 score,
$0.977$ AUC ROC and $0.948$ IoU. In addition, it outperforms three other
benchmarks on different local tasks (Kenya, Brazil and Togo). Overall, our
experiments demonstrate that TGCNN is advantageous in this earth observation
time series classification task.
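The abstract describes two ingredients: expanding the input with vegetation indices, and a convolutional block that gates information along both the temporal (step-wise) and spectral (channel-wise) axes. The sketch below illustrates these ideas only; the layer sizes, the NDVI choice, and the exact gating layout are assumptions for illustration, not the authors' published architecture.

```python
# Hypothetical sketch of (1) appending a vegetation index (NDVI) as an extra
# spectral channel and (2) a gated temporal convolution block with a
# channel-wise gate. Shapes and layout are illustrative assumptions.
import torch
import torch.nn as nn


def append_ndvi(x: torch.Tensor, red_idx: int, nir_idx: int, eps: float = 1e-6) -> torch.Tensor:
    """x: (batch, channels, time) multispectral series; returns x with NDVI appended."""
    red, nir = x[:, red_idx], x[:, nir_idx]
    ndvi = (nir - red) / (nir + red + eps)           # standard NDVI definition
    return torch.cat([x, ndvi.unsqueeze(1)], dim=1)  # expand the channel dimension


class TimeGatedConvBlock(nn.Module):
    """Gated 1D convolution over the time axis plus a channel-wise gate."""

    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # step-wise (temporal) convolution and its gate
        self.feature = nn.Conv1d(in_channels, out_channels, kernel_size, padding=pad)
        self.time_gate = nn.Conv1d(in_channels, out_channels, kernel_size, padding=pad)
        # channel-wise gate computed from a temporal average (squeeze-and-excite style)
        self.channel_gate = nn.Sequential(
            nn.Linear(out_channels, out_channels), nn.Sigmoid()
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.feature(x) * torch.sigmoid(self.time_gate(x))  # gated temporal conv
        w = self.channel_gate(h.mean(dim=-1))                   # (batch, out_channels)
        return h * w.unsqueeze(-1)                              # re-weight channels


if __name__ == "__main__":
    series = torch.randn(2, 4, 12)               # random data, only to check shapes
    series = append_ndvi(series, red_idx=0, nir_idx=3)
    block = TimeGatedConvBlock(in_channels=5, out_channels=16)
    print(block(series).shape)                   # torch.Size([2, 16, 12])
```

Multiplying the convolution output by a sigmoid gate is what lets the block learn multiplicative (high-order) interactions between time steps, which is the role the abstract attributes to the gating mechanism.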
Related papers
- AttDiCNN: Attentive Dilated Convolutional Neural Network for Automatic Sleep Staging using Visibility Graph and Force-directed Layout [0.0]
We present an automated sleep stage classifier termed the Attentive Dilated Convolutional Neural Network (AttDiCNN)
We employ a force-directed layout based on the visibility graph to capture the most significant information from the EEG signals.
The network consists of three compositors: the Localized Spatial Feature Extraction Network (LSFE), the Spatio-Temporal-Temporal Long Retention Network (S2TLR), and the Global Averaging Attention Network (G2A)
arXiv Detail & Related papers (2024-08-21T06:35:50Z) - Time Elastic Neural Networks [2.1756081703276]
We introduce and detail an atypical neural network architecture, called time elastic neural network (teNN)
The novelty compared to classical neural network architecture is that it explicitly incorporates time warping ability.
We demonstrate that, during the training process, the teNN succeeds in reducing the number of neurons required within each cell.
arXiv Detail & Related papers (2024-05-27T09:01:30Z) - DCNN: Dual Cross-current Neural Networks Realized Using An Interactive Deep Learning Discriminator for Fine-grained Objects [48.65846477275723]
This study proposes novel dual-current neural networks (DCNN) to improve the accuracy of fine-grained image classification.
The main novel design features for constructing a weakly supervised learning backbone model DCNN include (a) extracting heterogeneous data, (b) keeping the feature map resolution unchanged, (c) expanding the receptive field, and (d) fusing global representations and local features.
arXiv Detail & Related papers (2024-05-07T07:51:28Z) - Graph Neural Networks Provably Benefit from Structural Information: A
Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN)
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Towards Efficient Graph Convolutional Networks for Point Cloud Handling [181.59146413326056]
We aim at improving the computational efficiency of graph convolutional networks (GCNs) for learning on point clouds.
A series of experiments show that optimized networks have reduced computational complexity, decreased memory consumption, and accelerated inference speed.
arXiv Detail & Related papers (2021-04-12T17:59:16Z) - Spatio-Temporal Inception Graph Convolutional Networks for
Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z) - AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose an adaptive multi-channel graph convolutional networks for semi-supervised classification (AM-GCN)
Our experiments show that AM-GCN effectively extracts the most correlated information from both node features and topological structures.
arXiv Detail & Related papers (2020-07-05T08:16:03Z)