Convolutional neural networks for classification and regression analysis
of one-dimensional spectral data
- URL: http://arxiv.org/abs/2005.07530v1
- Date: Fri, 15 May 2020 13:20:05 GMT
- Title: Convolutional neural networks for classification and regression analysis
of one-dimensional spectral data
- Authors: Ine L. Jernelv, Dag Roar Hjelme, Yuji Matsuura, Astrid Aksnes
- Abstract summary: Convolutional neural networks (CNNs) are widely used for image recognition and text analysis.
The performance of a CNN was investigated for classification and regression analysis of spectral data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Convolutional neural networks (CNNs) are widely used for image recognition
and text analysis, and have been suggested for application on one-dimensional
data as a way to reduce the need for pre-processing steps. Pre-processing is an
integral part of multivariate analysis, but determination of the optimal
pre-processing methods can be time-consuming due to the large number of
available methods. In this work, the performance of a CNN was investigated for
classification and regression analysis of spectral data. The CNN was compared
with various other chemometric methods, including support vector machines
(SVMs) for classification and partial least squares regression (PLSR) for
regression analysis. The comparisons were made both on raw data, and on data
that had gone through pre-processing and/or feature selection methods. The
models were used on spectral data acquired with methods based on near-infrared,
mid-infrared, and Raman spectroscopy. For the classification datasets the
models were evaluated based on the percentage of correctly classified
observations, while for regression analysis the models were assessed based on
the coefficient of determination (R$^2$). Our results show that CNNs can
outperform standard chemometric methods, especially for classification tasks
where no pre-processing is used. However, both the CNN and the standard chemometric
methods see improved performance when proper pre-processing and feature
selection methods are used. These results demonstrate some of the capabilities
and limitations of CNNs used on one-dimensional data.
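
As a rough illustration of the pipeline the abstract describes, the following is a minimal sketch of a chemometric baseline: standard normal variate (SNV) and Savitzky-Golay pre-processing, an SVM classifier scored by classification accuracy, and a PLSR model scored by R². The array names (X, y_class, y_reg), the specific pre-processing choices, and all hyperparameters are illustrative assumptions, not the authors' exact setup.

```python
# Hypothetical chemometric baseline for 1D spectra (sketch, not the authors' code).
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, r2_score

def snv(X):
    """Standard normal variate: centre and scale each spectrum individually."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Placeholder data: 200 spectra with 800 wavelength channels (assumed shapes).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 800))          # spectra (n_samples, n_wavelengths)
y_class = rng.integers(0, 3, size=200)   # class labels for classification
y_reg = rng.normal(size=200)             # analyte concentration for regression

# Pre-processing: SNV followed by a Savitzky-Golay first-derivative filter.
X_pp = savgol_filter(snv(X), window_length=11, polyorder=2, deriv=1, axis=1)

# Classification baseline: SVM evaluated by percentage of correct predictions.
Xtr, Xte, ytr, yte = train_test_split(X_pp, y_class, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(Xtr, ytr)
print("SVM accuracy:", accuracy_score(yte, clf.predict(Xte)))

# Regression baseline: PLSR evaluated by the coefficient of determination (R^2).
Xtr, Xte, ytr, yte = train_test_split(X_pp, y_reg, random_state=0)
pls = PLSRegression(n_components=10).fit(Xtr, ytr)
print("PLSR R^2:", r2_score(yte, pls.predict(Xte)))
```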
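The abstract does not give the CNN architecture itself, so the following PyTorch model is only a generic sketch of the conv-pool-dense pattern typically applied to one-dimensional spectra; all layer sizes are arbitrary assumptions.

```python
# Generic 1D CNN for spectra (illustrative sketch; layer sizes are assumptions).
import torch
import torch.nn as nn

class SpectralCNN(nn.Module):
    def __init__(self, n_outputs=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),  # learn local spectral filters
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),                      # fixed-length summary
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8, 64),
            nn.ReLU(),
            nn.Linear(64, n_outputs),  # n_classes for classification, 1 for regression
        )

    def forward(self, x):
        # x: (batch, n_wavelengths) raw or pre-processed spectra
        return self.head(self.features(x.unsqueeze(1)))

# Example forward pass on a batch of 4 spectra of 800 channels.
model = SpectralCNN()
logits = model(torch.randn(4, 800))
print(logits.shape)  # torch.Size([4, 3])
```

For classification the model would be trained with a cross-entropy loss over the class logits; for regression, n_outputs would be set to 1 and a mean-squared-error loss used.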
Related papers
- Assessing Neural Network Representations During Training Using
Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Post-training Model Quantization Using GANs for Synthetic Data Generation [57.40733249681334]
We investigate the use of synthetic data as a substitute for real calibration data in post-training quantization.
We compare the performance of models quantized using data generated by StyleGAN2-ADA and our pre-trained DiStyleGAN, with quantization using real data and an alternative data generation method based on fractal images.
arXiv Detail & Related papers (2023-05-10T11:10:09Z)
- Learning Partial Correlation based Deep Visual Representation for Image Classification [61.0532370259644]
We formulate sparse inverse covariance estimation (SICE) as a novel structured layer of a CNN.
Our work obtains a partial correlation based deep visual representation and mitigates the small sample problem.
Experiments show the efficacy and superior classification performance of our model.
arXiv Detail & Related papers (2023-04-23T10:09:01Z)
- Large-Margin Representation Learning for Texture Classification [67.94823375350433]
This paper presents a novel approach combining convolutional layers (CLs) and large-margin metric learning for training supervised models on small datasets for texture classification.
The experimental results on texture and histopathologic image datasets have shown that the proposed approach achieves competitive accuracy with lower computational cost and faster convergence when compared to equivalent CNNs.
arXiv Detail & Related papers (2022-06-17T04:07:45Z)
- Lost Vibration Test Data Recovery Using Convolutional Neural Network: A Case Study [0.0]
This paper proposes a CNN-based algorithm for recovering lost vibration data, using the Alamosa Canyon Bridge as a real-structure case study.
Three different CNN models were considered to predict the readings of one or two malfunctioning sensors.
The accuracy of the model was increased by adding a convolutional layer.
arXiv Detail & Related papers (2022-04-11T23:24:03Z)
- Dynamically-Scaled Deep Canonical Correlation Analysis [77.34726150561087]
Canonical Correlation Analysis (CCA) is a method for extracting features from two views by finding maximally correlated linear projections of them.
We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model.
arXiv Detail & Related papers (2022-03-23T12:52:49Z)
- Photometric Redshift Estimation with Convolutional Neural Networks and Galaxy Images: A Case Study of Resolving Biases in Data-Driven Methods [0.0]
We investigate two major forms of biases, i.e., class-dependent residuals and mode collapse, in a case study of estimating photometric redshifts.
We propose a set of consecutive steps for resolving the two biases based on CNN models.
Experiments show that our methods are better at controlling biases than benchmark methods.
arXiv Detail & Related papers (2022-02-21T02:59:33Z)
- Model Doctor: A Simple Gradient Aggregation Strategy for Diagnosing and Treating CNN Classifiers [33.82339346293966]
Convolutional neural networks (CNNs) have achieved excellent performance in classification tasks.
However, CNNs are widely regarded as 'black boxes' whose prediction mechanisms are hard to interpret.
We propose the first fully automatic model diagnosing and treating tool, termed Model Doctor.
arXiv Detail & Related papers (2021-12-09T14:05:00Z)
- Examining and Mitigating Kernel Saturation in Convolutional Neural Networks using Negative Images [0.8594140167290097]
We analyze the effect of convolutional kernel saturation in CNNs.
We propose a simple data augmentation technique to mitigate saturation and increase classification accuracy, by supplementing negative images to the training dataset.
Our results show that CNNs are indeed susceptible to convolutional kernel saturation, and that supplementing the training dataset with negative images offers a statistically significant increase in classification accuracy (a minimal augmentation sketch is given after this list).
arXiv Detail & Related papers (2021-05-10T06:06:49Z)
- Transfer Learning with Convolutional Networks for Atmospheric Parameter Retrieval [14.131127382785973]
The Infrared Atmospheric Sounding Interferometer (IASI) on board the MetOp satellite series provides important measurements for Numerical Weather Prediction (NWP).
Retrieving accurate atmospheric parameters from the raw data provided by IASI is a major challenge, but necessary in order to use the data in NWP models.
We show how features extracted from the IASI data by a CNN trained to predict one physical variable can be used as inputs to another statistical method designed to predict a different physical variable at low altitude (a minimal sketch of this feature-transfer idea follows after this list).
arXiv Detail & Related papers (2020-12-09T09:28:42Z)
- ACDC: Weight Sharing in Atom-Coefficient Decomposed Convolution [57.635467829558664]
We introduce a structural regularization across convolutional kernels in a CNN.
We show that CNNs maintain performance with a dramatic reduction in parameters and computation.
arXiv Detail & Related papers (2020-09-04T20:41:47Z)
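
The kernel-saturation entry above describes supplementing the training set with negative images. The following is a minimal sketch of that idea for 8-bit images, assuming the negatives are simple pixel inversions (255 - pixel), which is the usual definition but is not spelled out in the summary.

```python
# Negative-image augmentation sketch (assumed pixel-inversion definition).
import numpy as np

def add_negative_images(images, labels):
    """Return the training set supplemented with pixel-inverted copies.

    images: uint8 array of shape (n, H, W) or (n, H, W, C)
    labels: array of shape (n,) -- negatives keep the original labels
    """
    negatives = 255 - images
    return np.concatenate([images, negatives]), np.concatenate([labels, labels])

# Example: a toy training set of four 28x28 images doubles to eight.
X = np.random.randint(0, 256, size=(4, 28, 28), dtype=np.uint8)
y = np.array([0, 1, 0, 1])
X_aug, y_aug = add_negative_images(X, y)
print(X_aug.shape, y_aug.shape)  # (8, 28, 28) (8,)
```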
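The IASI transfer-learning entry describes feeding features extracted by a CNN trained on one variable into a second statistical model that predicts a different variable. Below is a minimal sketch of that pattern with entirely made-up data, a toy feature extractor, and ridge regression standing in for the second method; none of these choices come from the paper itself.

```python
# Sketch of reusing CNN features for a second regression target (hypothetical data).
import torch
import torch.nn as nn
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

# Assume `cnn` was already trained to predict variable A from 1D input profiles.
cnn = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(4), nn.Flatten(),   # feature-extractor part
    nn.Linear(32, 1),                        # head predicting variable A
)
feature_extractor = cnn[:-1]                 # drop the task-A head

# Made-up profiles and a different target variable B.
X = torch.randn(300, 1, 120)
y_b = torch.randn(300).numpy()

with torch.no_grad():
    feats = feature_extractor(X).numpy()     # (300, 32) learned features

# Second statistical method trained on the CNN features to predict variable B.
reg_b = Ridge(alpha=1.0).fit(feats[:200], y_b[:200])
print("R^2 on held-out samples:", r2_score(y_b[200:], reg_b.predict(feats[200:])))
```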