Spatio-temporal Crop Classification On Volumetric Data
- URL: http://arxiv.org/abs/2103.10050v1
- Date: Thu, 18 Mar 2021 07:13:53 GMT
- Title: Spatio-temporal Crop Classification On Volumetric Data
- Authors: Muhammad Usman Qadeer, Salar Saeed, Murtaza Taj and Abubakr Muhammad
- Abstract summary: Large-area crop classification using multi-spectral imagery has been a widely studied problem for several decades.
Deep convolutional neural networks (DCNNs) have been proposed, but they have only achieved results comparable to Random Forest.
In this work, we present a novel CNN-based architecture for large-area crop classification.
- Score: 3.2880869992413246
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Large-area crop classification using multi-spectral imagery has been a
widely studied problem for several decades and is generally addressed using the
classical Random Forest classifier. Recently, deep convolutional neural networks
(DCNNs) have been proposed. However, these methods have only achieved results
comparable to Random Forest. In this work, we present a novel CNN-based
architecture for large-area crop classification. Our methodology combines
spatio-temporal analysis via a 3D CNN with temporal analysis via a 1D CNN. We
evaluated the efficacy of our approach on the Yolo and Imperial County benchmark
datasets. Our combined strategy outperforms both classical and recent DCNN-based
methods in classification accuracy by 2% while maintaining a minimal number of
parameters and the lowest inference time.
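The two-branch idea in the abstract — spatio-temporal features from a 3D convolution over a (time, height, width) volume, fused with temporal features from a 1D convolution over a pixel's time series — can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's architecture: the kernel sizes, the single-band input, the random weights, and the mean-pooling fusion are all assumptions for demonstration.

```python
import numpy as np

def conv3d_valid(vol, kernel):
    """Naive 'valid' 3-D cross-correlation over a (time, height, width) volume."""
    T, H, W = vol.shape
    t, h, w = kernel.shape
    out = np.zeros((T - t + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(vol[i:i + t, j:j + h, k:k + w] * kernel)
    return out

def conv1d_valid(series, kernel):
    """Naive 'valid' 1-D cross-correlation over the time axis."""
    t = len(kernel)
    return np.array([np.dot(series[i:i + t], kernel)
                     for i in range(len(series) - t + 1)])

rng = np.random.default_rng(0)
volume = rng.standard_normal((8, 9, 9))   # hypothetical (time, H, W) stack, one band

# Spatio-temporal branch: 3-D conv over the volume, globally pooled to one feature
st_feat = conv3d_valid(volume, rng.standard_normal((3, 3, 3))).mean()

# Temporal branch: 1-D conv on the centre pixel's time series, also pooled
t_feat = conv1d_valid(volume[:, 4, 4], rng.standard_normal(3)).mean()

# Fusion: concatenate the branch features before a per-pixel classifier head
fused = np.array([st_feat, t_feat])
print(fused.shape)
```

A real implementation would use a deep-learning framework's batched 3D/1D convolution layers and learn the kernels; the sketch only shows how the two analyses operate on the same volumetric input before fusion.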
Related papers
- Tree species classification at the pixel-level using deep learning and multispectral time series in an imbalanced context [0.0]
This paper investigates tree species classification using Sentinel-2 multispectral satellite image time-series.
It shows that the use of deep learning models can lead to a significant improvement in classification results.
arXiv Detail & Related papers (2024-08-05T13:44:42Z) - Time Elastic Neural Networks [2.1756081703276]
We introduce and detail an atypical neural network architecture called the time elastic neural network (teNN).
The novelty compared to classical neural network architectures is that it explicitly incorporates a time warping ability.
We demonstrate that, during the training process, the teNN succeeds in reducing the number of neurons required within each cell.
arXiv Detail & Related papers (2024-05-27T09:01:30Z) - On the rates of convergence for learning with convolutional neural networks [9.772773527230134]
We study approximation and learning capacities of convolutional neural networks (CNNs) with one-side zero-padding and multiple channels.
We derive convergence rates for estimators based on CNNs in many learning problems.
It is also shown that the obtained rates for classification are minimax optimal in some common settings.
arXiv Detail & Related papers (2024-03-25T06:42:02Z) - A Proximal Algorithm for Network Slimming [2.8148957592979427]
A popular channel pruning method for convolutional neural networks (CNNs) uses subgradient descent to train CNNs.
We develop an alternative algorithm called proximal NS to train CNNs towards sparse, accurate structures.
Our experiments demonstrate that after one round of training, proximal NS yields a CNN with competitive accuracy and compression.
arXiv Detail & Related papers (2023-07-02T23:34:12Z) - Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z) - Rethinking Nearest Neighbors for Visual Classification [56.00783095670361]
k-NN is a lazy learning method that aggregates the distance between the test image and top-k neighbors in a training set.
We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps.
Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration.
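The k-NN procedure this summary describes — comparing a test image's pre-trained feature vector against a training set and aggregating the top-k nearest neighbors — can be sketched as a majority vote over Euclidean distances. This is a generic illustrative sketch under assumed inputs (feature matrices as numpy arrays), not the paper's two-step integration.

```python
import numpy as np

def knn_predict(train_feats, train_labels, query_feat, k=3):
    """Classify a query feature vector by majority vote among its k nearest
    training features (Euclidean distance)."""
    dists = np.linalg.norm(train_feats - query_feat, axis=1)  # distance to each training example
    topk = np.argsort(dists)[:k]                              # indices of the k nearest neighbours
    votes = train_labels[topk]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[np.argmax(counts)]                            # most common label wins
```

In practice the features would come from a frozen supervised or self-supervised backbone; the voting step itself is training-free, which is what makes k-NN a "lazy" learner.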
arXiv Detail & Related papers (2021-12-15T20:15:01Z) - An Uncertainty-Driven GCN Refinement Strategy for Organ Segmentation [53.425900196763756]
We propose a segmentation refinement method based on uncertainty analysis and graph convolutional networks.
We employ the uncertainty levels of the convolutional network in a particular input volume to formulate a semi-supervised graph learning problem.
We show that our method outperforms the state-of-the-art CRF refinement method, improving the Dice score by 1% for the pancreas and 2% for the spleen.
arXiv Detail & Related papers (2020-12-06T18:55:07Z) - Classification of Polarimetric SAR Images Using Compact Convolutional Neural Networks [24.553598498985796]
A novel and systematic classification framework is proposed for the classification of PolSAR images.
It is based on a compact and adaptive implementation of CNNs using a sliding-window classification approach.
The proposed approach can perform classification using smaller window sizes than deep CNNs.
arXiv Detail & Related papers (2020-11-10T17:09:11Z) - Forest R-CNN: Large-Vocabulary Long-Tailed Object Detection and Instance Segmentation [75.93960390191262]
We exploit prior knowledge of the relations among object categories to cluster fine-grained classes into coarser parent classes.
We propose a simple yet effective resampling method, NMS Resampling, to re-balance the data distribution.
Our method, termed as Forest R-CNN, can serve as a plug-and-play module being applied to most object recognition models.
arXiv Detail & Related papers (2020-08-13T03:52:37Z) - Equalization Loss for Long-Tailed Object Recognition [109.91045951333835]
State-of-the-art object detection methods still perform poorly on large vocabulary and long-tailed datasets.
We propose a simple but effective loss, named equalization loss, to tackle the problem of long-tailed rare categories.
Our method achieves AP gains of 4.1% and 4.8% for the rare and common categories on the challenging LVIS benchmark.
arXiv Detail & Related papers (2020-03-11T09:14:53Z) - Approximation and Non-parametric Estimation of ResNet-type Convolutional Neural Networks [52.972605601174955]
We show a ResNet-type CNN can attain the minimax optimal error rates in important function classes.
We derive approximation and estimation error rates of the aforementioned type of CNNs for the Barron and Hölder classes.
arXiv Detail & Related papers (2019-03-24T19:42:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.