Detecting Aedes Aegypti Mosquitoes through Audio Classification with
Convolutional Neural Networks
- URL: http://arxiv.org/abs/2008.09024v1
- Date: Wed, 19 Aug 2020 00:26:21 GMT
- Title: Detecting Aedes Aegypti Mosquitoes through Audio Classification with
Convolutional Neural Networks
- Authors: Marcelo Schreiber Fernandes, Weverton Cordeiro, Mariana
Recamonde-Mendoza
- Abstract summary: A strategy to raise community awareness regarding mosquito proliferation is building a live map of mosquito incidences using smartphone apps and crowdsourcing.
In this paper, we explore the possibility of identifying Aedes aegypti mosquitoes using machine learning techniques and audio analysis captured from commercially available smartphones.
- Score: 1.0323063834827415
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The incidence of mosquito-borne diseases is significant in under-developed
regions, mostly due to the lack of resources to implement aggressive control
measures against mosquito proliferation. A potential strategy to raise
community awareness regarding mosquito proliferation is building a live map of
mosquito incidences using smartphone apps and crowdsourcing. In this paper, we
explore the possibility of identifying Aedes aegypti mosquitoes using machine
learning techniques and audio analysis captured from commercially available
smartphones. In summary, we downsampled Aedes aegypti wingbeat recordings and
used them to train a convolutional neural network (CNN) through supervised
learning. As a feature, we used the recording's spectrogram, which visually
represents the mosquito wingbeat frequency over time. We trained and compared three
classifiers: a binary, a multiclass, and an ensemble of binary classifiers. In
our evaluation, the binary and ensemble models achieved accuracy of 97.65%
($\pm$ 0.55) and 94.56% ($\pm$ 0.77), respectively, whereas the multiclass had
an accuracy of 78.12% ($\pm$ 2.09). The best sensitivity was observed in the
ensemble approach (96.82% $\pm$ 1.62), followed by the multiclass for the
particular case of Aedes aegypti (90.23% $\pm$ 3.83) and the binary (88.49%
$\pm$ 6.68). The binary classifier and the multiclass classifier presented the
best balance between precision and recall, with F1-measure close to 90%.
Although the ensemble classifier achieved the lowest precision, thus impairing
its F1-measure (79.95% $\pm$ 2.13), it was the most effective classifier for
detecting Aedes aegypti in our dataset.
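The feature pipeline described in the abstract (downsample the wingbeat recording, then compute a spectrogram as the CNN input) can be sketched as below. This is a minimal illustration, not the paper's exact implementation: the target sample rate, window length, and the synthetic 600 Hz tone (a plausible wingbeat frequency) are assumptions chosen for the example.

```python
import numpy as np
from scipy import signal

def wingbeat_spectrogram(audio, sr, target_sr=8000, nperseg=256):
    """Downsample a recording and return a log-magnitude spectrogram.

    target_sr and nperseg are illustrative choices, not the paper's
    reported parameters.
    """
    if sr != target_sr:
        n_out = int(len(audio) * target_sr / sr)
        audio = signal.resample(audio, n_out)  # FFT-based resampling
    freqs, times, sxx = signal.spectrogram(audio, fs=target_sr,
                                           nperseg=nperseg)
    return freqs, times, np.log(sxx + 1e-10)  # log scale compresses dynamics

# Synthetic 1-second "recording": a 600 Hz tone plus mild noise.
sr = 44100
t = np.linspace(0, 1.0, sr, endpoint=False)
recording = np.sin(2 * np.pi * 600 * t) + 0.1 * np.random.randn(sr)

freqs, times, spec = wingbeat_spectrogram(recording, sr)
peak_freq = freqs[spec.mean(axis=1).argmax()]
print(f"dominant frequency: {peak_freq:.1f} Hz")  # close to 600 Hz here
```

The resulting 2-D log-spectrogram is what would be fed to the CNN as an image-like input; the dominant-frequency check above simply confirms the wingbeat tone survives downsampling.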
Related papers
- Automated detection of Zika and dengue in Aedes aegypti using neural
spiking analysis [8.034395623865906]
Aedes aegypti mosquitoes are primary vectors for numerous medically important viruses.
No open-source neural spike classification method is currently available for mosquitoes.
We present an innovative artificial intelligence-based method to classify the neural spikes in uninfected, dengue-infected, and Zika-infected mosquitoes.
arXiv Detail & Related papers (2023-12-14T04:52:54Z)
- Acoustic Identification of Ae. aegypti Mosquitoes using Smartphone Apps and Residual Convolutional Neural Networks [0.8808021343665321]
We advocate for smartphone apps as a low-cost, easy-to-deploy solution for raising awareness among the population about the proliferation of Aedes aegypti mosquitoes.
Devising such a smartphone app is challenging for many reasons, including the required maturity level of techniques for identifying mosquitoes based on features that can be captured with smartphone resources.
arXiv Detail & Related papers (2023-06-16T13:41:01Z)
- Attention-based Saliency Maps Improve Interpretability of Pneumothorax Classification [52.77024349608834]
This work investigates the chest radiograph (CXR) classification performance of vision transformers (ViTs) and the interpretability of attention-based saliency maps.
ViTs were fine-tuned for lung disease classification using four public data sets: CheXpert, Chest X-Ray 14, MIMIC CXR, and VinBigData.
ViTs had comparable CXR classification AUCs compared with state-of-the-art CNNs.
arXiv Detail & Related papers (2023-03-03T12:05:41Z)
- Application of Machine Learning to Sleep Stage Classification [0.7196441171503458]
Sleep studies are imperative to recapitulate phenotypes associated with sleep loss and uncover mechanisms contributing to psychopathology.
Most often, investigators manually classify the polysomnography into vigilance states, which is time-consuming and prone to inter-scorer variability.
We aim to produce an automated and open-access classifier that can reliably predict vigilance state based on a single EEG reading.
arXiv Detail & Related papers (2021-11-04T18:00:50Z)
- A deep convolutional neural network for classification of Aedes albopictus mosquitoes [1.6758573326215689]
We introduce the application of two Deep Convolutional Neural Networks in a comparative study to automate the classification task.
We use the transfer learning principle to train two state-of-the-art architectures on the data provided by the Mosquito Alert project.
In addition, we applied explainable models based on the Grad-CAM algorithm to visualise the most discriminant regions of the classified images.
arXiv Detail & Related papers (2021-10-29T17:58:32Z)
- High performing ensemble of convolutional neural networks for insect pest image detection [124.23179560022761]
Pest infestation is a major cause of crop damage and lost revenues worldwide.
We generate ensembles of CNNs based on different topologies.
Two new Adam algorithms for deep network optimization are proposed.
arXiv Detail & Related papers (2021-08-28T00:49:11Z)
- An Efficient Insect Pest Classification Using Multiple Convolutional Neural Network Based Models [0.3222802562733786]
Insect pest classification is a difficult task because of various kinds, scales, shapes, complex backgrounds in the field, and high appearance similarity among insect species.
We present different convolutional neural network-based models in this work, including attention, feature pyramid, and fine-grained models.
The experimental results show that combining these convolutional neural network-based models performs better than the state-of-the-art methods on these two datasets.
arXiv Detail & Related papers (2021-07-26T12:53:28Z)
- Ensemble of Convolution Neural Networks on Heterogeneous Signals for Sleep Stage Scoring [63.30661835412352]
This paper explores and compares the usefulness of additional signals beyond electroencephalograms.
The best overall model, an ensemble of depth-wise separable convolutional neural networks, achieved an accuracy of 86.06%.
arXiv Detail & Related papers (2021-07-23T06:37:38Z)
- Detecting COVID-19 from Breathing and Coughing Sounds using Deep Neural Networks [68.8204255655161]
We adapt an ensemble of Convolutional Neural Networks to classify if a speaker is infected with COVID-19 or not.
Ultimately, it achieves an Unweighted Average Recall (UAR) of 74.9%, or an Area Under ROC Curve (AUC) of 80.7% by ensembling neural networks.
arXiv Detail & Related papers (2020-12-29T01:14:17Z)
- Automatic sleep stage classification with deep residual networks in a mixed-cohort setting [63.52264764099532]
We developed a novel deep neural network model to assess the generalizability of several large-scale cohorts.
Overall classification accuracy improved with increasing fractions of training data.
arXiv Detail & Related papers (2020-08-21T10:48:35Z)
- Device-Robust Acoustic Scene Classification Based on Two-Stage Categorization and Data Augmentation [63.98724740606457]
We present a joint effort of four groups, namely GT, USTC, Tencent, and UKE, to tackle Task 1 - Acoustic Scene Classification (ASC) in the DCASE 2020 Challenge.
Task 1a focuses on ASC of audio signals recorded with multiple (real and simulated) devices into ten different fine-grained classes.
Task 1b concerns the classification of data into three higher-level classes using low-complexity solutions.
arXiv Detail & Related papers (2020-07-16T15:07:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.