Patch Based Classification of Remote Sensing Data: A Comparison of
2D-CNN, SVM and NN Classifiers
- URL: http://arxiv.org/abs/2006.11767v1
- Date: Sun, 21 Jun 2020 11:07:37 GMT
- Title: Patch Based Classification of Remote Sensing Data: A Comparison of
2D-CNN, SVM and NN Classifiers
- Authors: Mahesh Pal, Akshay, Himanshu Rohilla and B. Charan Teja
- Abstract summary: We compare the performance of patch based SVM and NN with that of a deep learning algorithm comprising a 2D-CNN and fully connected layers.
Results with both datasets suggest the effectiveness of patch based SVM and NN.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Pixel based algorithms including back propagation neural networks (NN) and
support vector machines (SVM) have been widely used for remotely sensed image
classification. Within the last few years, deep learning based image classifiers
such as convolutional neural networks (2D-CNN) have become popular alternatives
to these classifiers. In this paper, we compare the performance of patch based
SVM and NN with that of a deep learning algorithm comprising a 2D-CNN and fully
connected layers. Similar to CNNs, which utilise image patches to derive features
for further classification, we propose to use patches as input in place of
individual pixels with both the SVM and NN classifiers. Two datasets, one
multispectral and one hyperspectral, were used to compare the performance
of the different classifiers. Results with both datasets suggest the effectiveness
of the patch based SVM and NN classifiers in comparison to a state-of-the-art
2D-CNN classifier.
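The patch-based idea in the abstract can be sketched in a few lines: instead of feeding a single pixel's spectral values to the classifier, flatten a small neighbourhood around each labelled pixel into one feature vector. The sketch below uses scikit-learn and synthetic data; the patch size, band count, and class count are illustrative placeholders, not the authors' exact experimental setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

def extract_patches(image, pixels, size=5):
    """Flatten a size x size neighbourhood around each labelled pixel.

    image: (H, W, bands) array; pixels: list of (row, col) coordinates.
    """
    pad = size // 2
    # Reflect-pad so patches at the image border are well defined.
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    patches = [
        padded[r:r + size, c:c + size, :].ravel()  # patch -> 1-D feature vector
        for r, c in pixels
    ]
    return np.asarray(patches)

# Synthetic stand-in for a multispectral scene (the paper's actual datasets
# are not bundled here).
rng = np.random.default_rng(0)
image = rng.random((64, 64, 4))                # 4 spectral bands
pixels = [(r, c) for r in range(8, 56, 4) for c in range(8, 56, 4)]
labels = rng.integers(0, 3, size=len(pixels))  # 3 hypothetical land-cover classes

X = extract_patches(image, pixels, size=5)     # each row: 5*5*4 = 100 features
svm = SVC(kernel="rbf").fit(X, labels)         # patch based SVM
nn = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X, labels)
```

The same `X` feeds both classifiers, which is the point of the comparison: the patch supplies the spatial context that a 2D-CNN would otherwise derive through its convolutional layers.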
Related papers
- Fuzzy Convolution Neural Networks for Tabular Data Classification [0.0]
Convolutional neural networks (CNNs) have attracted a great deal of attention due to their remarkable performance in various domains.
In this paper, we propose a novel framework fuzzy convolution neural network (FCNN) tailored specifically for tabular data.
arXiv Detail & Related papers (2024-06-04T20:33:35Z)
- NOAH: Learning Pairwise Object Category Attentions for Image Classification [26.077836657775403]
Non-glObal Attentive Head (NOAH) is a new form of dot-product attention called pairwise object category attention (POCA).
As a drop-in design, NOAH can be easily used to replace the existing heads of various types of DNNs.
arXiv Detail & Related papers (2024-02-04T07:19:40Z)
- RankDNN: Learning to Rank for Few-shot Learning [70.49494297554537]
This paper introduces a new few-shot learning pipeline that casts relevance ranking for image retrieval as binary ranking relation classification.
It provides a new perspective on few-shot learning and is complementary to state-of-the-art methods.
arXiv Detail & Related papers (2022-11-28T13:59:31Z)
- Large-Margin Representation Learning for Texture Classification [67.94823375350433]
This paper presents a novel approach combining convolutional layers (CLs) and large-margin metric learning for training supervised models on small datasets for texture classification.
The experimental results on texture and histopathologic image datasets have shown that the proposed approach achieves competitive accuracy with lower computational cost and faster convergence when compared to equivalent CNNs.
arXiv Detail & Related papers (2022-06-17T04:07:45Z)
- Classification of Hyperspectral Images by Using Spectral Data and Fully Connected Neural Network [0.0]
Classification success of over 90% has been achieved for hyperspectral images.
In this study, hyperspectral images of Indian pines, Salinas, Pavia centre, Pavia university and Botswana are classified.
An average accuracy of 97.5% is achieved for the test sets of all hyperspectral images.
arXiv Detail & Related papers (2022-01-08T12:45:48Z)
- Deep ensembles in bioimage segmentation [74.01883650587321]
In this work, we propose an ensemble of convolutional neural networks (CNNs).
In ensemble methods, many different models are trained and then used for classification; the ensemble aggregates the outputs of the individual classifiers.
The proposed ensemble is implemented by combining different backbone networks using the DeepLabV3+ and HarDNet environment.
arXiv Detail & Related papers (2021-12-24T05:54:21Z)
- Rethinking Nearest Neighbors for Visual Classification [56.00783095670361]
k-NN is a lazy learning method that aggregates the distance between the test image and top-k neighbors in a training set.
We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps.
Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration.
arXiv Detail & Related papers (2021-12-15T20:15:01Z)
- Overhead-MNIST: Machine Learning Baselines for Image Classification [0.0]
Twenty-three machine learning algorithms were trained and then scored to establish baseline comparison metrics.
The Overhead-MNIST dataset is a collection of satellite images similar in style to the ubiquitous MNIST hand-written digits.
We present results for the overall best performing algorithm as a baseline for edge deployability and future performance improvement.
arXiv Detail & Related papers (2021-07-01T13:30:39Z)
- Deep Features for training Support Vector Machine [16.795405355504077]
This paper develops a generic computer vision system based on features extracted from trained CNNs.
Multiple learned features are combined into a single structure to work on different image classification tasks.
arXiv Detail & Related papers (2021-04-08T03:13:09Z)
- Spatial Dependency Networks: Neural Layers for Improved Generative Image Modeling [79.15521784128102]
We introduce a novel neural network for building image generators (decoders) and apply it to variational autoencoders (VAEs).
In our spatial dependency networks (SDNs), feature maps at each level of a deep neural net are computed in a spatially coherent way.
We show that augmenting the decoder of a hierarchical VAE by spatial dependency layers considerably improves density estimation.
arXiv Detail & Related papers (2021-03-16T07:01:08Z)
- A Systematic Evaluation: Fine-Grained CNN vs. Traditional CNN Classifiers [54.996358399108566]
We investigate the performance of landmark general CNN classifiers, which presented top-notch results on large-scale classification datasets.
We compare them against state-of-the-art fine-grained classifiers.
We show an extensive evaluation on six datasets to determine whether the fine-grained classifier is able to elevate the baseline in their experiments.
arXiv Detail & Related papers (2020-03-24T23:49:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.