Deep learning powered real-time identification of insects using citizen
science data
- URL: http://arxiv.org/abs/2306.02507v1
- Date: Sun, 4 Jun 2023 23:56:53 GMT
- Title: Deep learning powered real-time identification of insects using citizen
science data
- Authors: Shivani Chiranjeevi, Mojdeh Sadaati, Zi K Deng, Jayanth Koushik,
Talukder Z Jubery, Daren Mueller, Matthew E O'Neal, Nirav Merchant, Aarti
Singh, Asheesh K Singh, Soumik Sarkar, Arti Singh, Baskar
Ganapathysubramanian
- Abstract summary: InsectNet can identify invasive species, provide fine-grained insect species identification, and work effectively in challenging backgrounds.
It can also abstain from making predictions when uncertain, facilitating seamless human intervention and making it a practical and trustworthy tool.
- Score: 17.13608307250744
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Insect-pests significantly impact global agricultural productivity and
quality. Effective management involves identifying the full insect community,
including beneficial insects and harmful pests, to develop and implement
integrated pest management strategies. Automated identification of insects
under real-world conditions presents several challenges, including
differentiating similar-looking species, intra-species dissimilarity and
inter-species similarity, several life cycle stages, camouflage, diverse
imaging conditions, and variability in insect orientation. A deep-learning
model, InsectNet, is proposed to address these challenges. InsectNet is endowed
with five key features: (a) utilization of a large dataset of insect images
collected through citizen science; (b) label-free self-supervised learning for
large models; (c) improving prediction accuracy for species with a small sample
size; (d) enhancing model trustworthiness; and (e) democratizing access through
streamlined MLOps. This approach allows accurate identification (>96% accuracy)
of over 2500 insect species, including pollinator (e.g., butterflies, bees),
parasitoid (e.g., some wasps and flies), predator species (e.g., lady beetles,
mantises, dragonflies) and harmful pest species (e.g., armyworms, cutworms,
grasshoppers, stink bugs). InsectNet can identify invasive species, provide
fine-grained insect species identification, and work effectively in challenging
backgrounds. It can also abstain from making predictions when uncertain,
facilitating seamless human intervention and making it a practical and
trustworthy tool. InsectNet can guide citizen science data collection,
especially for invasive species where early detection is crucial. Similar
approaches may transform other agricultural challenges like disease detection
and underscore the importance of data collection, particularly through citizen
science efforts.
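The abstract does not specify how InsectNet abstains when uncertain; a common and minimal mechanism is softmax confidence thresholding, where the model defers to a human expert if its top-class probability falls below a cutoff. The sketch below uses hypothetical species names and a threshold of 0.9 purely for illustration, not the paper's actual method or values.

```python
import numpy as np

def softmax(logits):
    """Convert raw model scores to a probability distribution."""
    z = logits - logits.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify_or_abstain(logits, class_names, threshold=0.9):
    """Return the predicted species if the model is confident enough,
    otherwise abstain so a human can intervene."""
    probs = softmax(np.asarray(logits, dtype=float))
    top = int(probs.argmax())
    if probs[top] >= threshold:
        return class_names[top], float(probs[top])
    return "ABSTAIN", float(probs[top])

species = ["lady beetle", "stink bug", "cutworm"]

# A sharply peaked score vector clears the threshold ...
label, p = classify_or_abstain([9.0, 1.0, 0.5], species)
# ... while a near-tie triggers abstention for human review.
label2, p2 = classify_or_abstain([2.0, 1.9, 1.8], species)
```

The threshold trades coverage for reliability: raising it hands more images to human reviewers but makes the automated predictions that remain more trustworthy.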
Related papers
- Deep-Wide Learning Assistance for Insect Pest Classification [1.9912919001438378]
We present DeWi, a novel learning-assistance approach for insect pest classification.
With a one-stage and alternating training strategy, DeWi simultaneously improves several Convolutional Neural Networks.
Experimental results show that DeWi achieves the highest performance on two insect pest classification benchmarks.
arXiv Detail & Related papers (2024-09-16T16:29:41Z) - Artificial Immune System of Secure Face Recognition Against Adversarial Attacks [67.31542713498627]
Optimisation is required for insect production to realise its full potential.
This can be achieved by targeted improvement of traits of interest through selective breeding.
This review combines knowledge from diverse disciplines, bridging the gap between animal breeding, quantitative genetics, evolutionary biology, and entomology.
arXiv Detail & Related papers (2024-06-26T07:50:58Z) - A machine learning pipeline for automated insect monitoring [17.034158815607128]
Camera traps, conventionally used for monitoring terrestrial vertebrates, are now being modified for insects, especially moths.
We describe a complete, open-source machine learning-based software pipeline for automated monitoring of moths via camera traps.
arXiv Detail & Related papers (2024-06-18T19:51:16Z) - Unleashing the Power of Transfer Learning Model for Sophisticated Insect Detection: Revolutionizing Insect Classification [0.520707246175575]
This study evaluates several models, including MobileNetV2, ResNet152V2, Xception, and a custom CNN.
A Convolutional Neural Network (CNN) based on the ResNet152V2 architecture is constructed and evaluated in this work.
The results highlight its potential for real-world applications in insect classification and entomology studies.
arXiv Detail & Related papers (2024-06-11T20:52:42Z) - InsectMamba: Insect Pest Classification with State Space Model [8.470757741028661]
InsectMamba is a novel approach that integrates State Space Models (SSMs), Convolutional Neural Networks (CNNs), Multi-Head Self-Attention mechanism (MSA) and Multilayer Perceptrons (MLPs) within Mix-SSM blocks.
It was evaluated against strong competitors across five insect pest classification datasets.
arXiv Detail & Related papers (2024-04-04T17:34:21Z) - Insect-Foundation: A Foundation Model and Large-scale 1M Dataset for Visual Insect Understanding [15.383106771910274]
Current machine vision models require a large volume of data to achieve high performance.
We introduce a novel "Insect-1M" dataset, a game-changing resource poised to revolutionize insect-related foundation model training.
Covering a vast spectrum of insect species, our dataset, including 1 million images with dense identification labels of taxonomy hierarchy and insect descriptions, offers a panoramic view of entomology.
arXiv Detail & Related papers (2023-11-26T06:17:29Z) - SatBird: Bird Species Distribution Modeling with Remote Sensing and
Citizen Science Data [68.2366021016172]
We present SatBird, a satellite dataset of locations in the USA with labels derived from presence-absence observation data from the citizen science database eBird.
We also provide a dataset in Kenya representing low-data regimes.
We benchmark a set of baselines on our dataset, including SOTA models for remote sensing tasks.
arXiv Detail & Related papers (2023-11-02T02:00:27Z) - Spatial Implicit Neural Representations for Global-Scale Species Mapping [72.92028508757281]
Given a set of locations where a species has been observed, the goal is to build a model to predict whether the species is present or absent at any location.
Traditional methods struggle to take advantage of emerging large-scale crowdsourced datasets.
We use Spatial Implicit Neural Representations (SINRs) to jointly estimate the geographical range of 47k species simultaneously.
arXiv Detail & Related papers (2023-06-05T03:36:01Z) - Intestinal Parasites Classification Using Deep Belief Networks [53.20999552522241]
4 billion people are infected by intestinal parasites worldwide.
Human visual inspection is still in charge of the vast majority of clinical diagnoses.
We introduce Deep Belief Networks to the context of automatic intestinal parasites classification.
arXiv Detail & Related papers (2021-01-17T18:47:02Z) - One-Shot Learning with Triplet Loss for Vegetation Classification Tasks [45.82374977939355]
Triplet loss function is one of the options that can significantly improve the accuracy of the One-shot Learning tasks.
Starting from 2015, many projects use Siamese networks and this kind of loss for face recognition and object classification.
arXiv Detail & Related papers (2020-12-14T10:44:22Z) - Automatic image-based identification and biomass estimation of
invertebrates [70.08255822611812]
Time-consuming sorting and identification of taxa pose strong limitations on how many insect samples can be processed.
We propose to replace the standard manual approach of human expert-based sorting and identification with an automatic image-based technology.
We use state-of-the-art ResNet-50 and InceptionV3 CNNs for the classification task.
arXiv Detail & Related papers (2020-02-05T21:38:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.