Weed Density and Distribution Estimation for Precision Agriculture using
Semi-Supervised Learning
- URL: http://arxiv.org/abs/2011.02193v2
- Date: Thu, 18 Feb 2021 14:05:01 GMT
- Title: Weed Density and Distribution Estimation for Precision Agriculture using
Semi-Supervised Learning
- Authors: Shantam Shorewala, Armaan Ashfaque, Sidharth R and Ujjwal Verma
- Abstract summary: We propose a deep learning-based semi-supervised approach for robust estimation of weed density and distribution.
In this work, the foreground vegetation pixels containing crops and weeds are first identified using a Convolutional Neural Network (CNN) based unsupervised segmentation.
The weed infected regions are identified using a fine-tuned CNN, eliminating the need for designing hand-crafted features.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Uncontrolled growth of weeds can severely affect crop yield and quality.
Unrestricted use of herbicides for weed removal alters biodiversity and causes
environmental pollution. Instead, identifying weed-infested regions can aid
selective chemical treatment of these regions. Advances in analyzing farm
images have resulted in solutions to identify weed plants. However, a majority
of these approaches are based on supervised learning methods, which require a
huge amount of manually annotated images. As a result, these supervised
approaches are economically infeasible for the individual farmer because of the
wide variety of plant species being cultivated. In this paper, we propose a
deep learning-based semi-supervised approach for robust estimation of weed
density and distribution across farmlands using only limited color images
acquired from autonomous robots. This weed density and distribution can be
useful in a site-specific weed management system for selective treatment of
infested areas using autonomous robots. In this work, the foreground vegetation
pixels containing crops and weeds are first identified using a Convolutional
Neural Network (CNN) based unsupervised segmentation. Subsequently, the
weed-infested regions are identified using a fine-tuned CNN, eliminating the
need for designing hand-crafted features. The approach is validated on two
datasets of different crop/weed species: (1) the Crop Weed Field Image Dataset
(CWFID), which consists of carrot plant images, and (2) the Sugar Beets
dataset. The proposed method is able to localize weed-infested regions with a
maximum recall of 0.99 and estimate weed density with a maximum accuracy of
82.13%. Hence, the proposed approach is shown to generalize to different plant
species without the need for extensive labeled data.
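The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the excess-green (ExG) index stands in for the paper's CNN-based unsupervised vegetation segmentation, and the per-pixel weed probability map is assumed to come from a separately fine-tuned classifier. The function names, block size, and thresholds are all hypothetical choices for the sketch.

```python
import numpy as np

def vegetation_mask(rgb, threshold=0.1):
    """Stage 1 stand-in: separate vegetation from soil.

    Uses the excess-green index (2G - R - B) as a simple proxy for the
    CNN-based unsupervised segmentation in the paper.
    """
    rgb = rgb.astype(float) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b
    return exg > threshold

def weed_density_map(veg_mask, weed_prob, block=32, prob_threshold=0.5):
    """Stage 2 stand-in: per-block weed density.

    `weed_prob` is a per-pixel weed probability map, assumed to be the
    output of a fine-tuned CNN classifier (not implemented here).
    Density of a block = weed pixels / vegetation pixels in that block.
    """
    h, w = veg_mask.shape
    densities = {}
    for i in range(0, h, block):
        for j in range(0, w, block):
            m = veg_mask[i:i + block, j:j + block]
            p = weed_prob[i:i + block, j:j + block]
            veg_pixels = m.sum()
            if veg_pixels == 0:
                densities[(i, j)] = 0.0  # no vegetation in this block
            else:
                weed_pixels = np.logical_and(m, p > prob_threshold).sum()
                densities[(i, j)] = weed_pixels / veg_pixels
    return densities
```

A site-specific weed management system could then direct a robot only to blocks whose density exceeds a treatment threshold, which is the selective-treatment use case the abstract describes.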
Related papers
- HarvestNet: A Dataset for Detecting Smallholder Farming Activity Using
Harvest Piles and Remote Sensing [50.4506590177605]
HarvestNet is a dataset for mapping the presence of farms in the Ethiopian regions of Tigray and Amhara during 2020-2023.
We introduce a new approach based on the detection of harvest piles characteristic of many smallholder systems.
We conclude that remote sensing of harvest piles can contribute to more timely and accurate cropland assessments in food insecure regions.
arXiv Detail & Related papers (2023-08-23T11:03:28Z) - Semantic Image Segmentation with Deep Learning for Vine Leaf Phenotyping [59.0626764544669]
In this study, we use Deep Learning methods to semantically segment grapevine leaves images in order to develop an automated object detection system for leaf phenotyping.
Our work contributes to plant lifecycle monitoring through which dynamic traits such as growth and development can be captured and quantified.
arXiv Detail & Related papers (2022-10-24T14:37:09Z) - Transferring learned patterns from ground-based field imagery to predict
UAV-based imagery for crop and weed semantic segmentation in precision crop
farming [3.95486899327898]
We have developed a deep convolutional network that enables weed segmentation to be predicted on both ground-based field images and aerial images from UAVs.
The network learning process is visualized by feature maps at shallow and deep layers.
The study shows that the developed deep convolutional neural network could be used to classify weeds from both field and aerial images.
arXiv Detail & Related papers (2022-10-20T19:25:06Z) - End-to-end deep learning for directly estimating grape yield from
ground-based imagery [53.086864957064876]
This study demonstrates the application of proximal imaging combined with deep learning for yield estimation in vineyards.
Three model architectures were tested: object detection, CNN regression, and transformer models.
The study showed the applicability of proximal imaging and deep learning for prediction of grapevine yield on a large scale.
arXiv Detail & Related papers (2022-08-04T01:34:46Z) - 4Weed Dataset: Annotated Imagery Weeds Dataset [1.5484595752241122]
The dataset consists of 159 Cocklebur images, 139 Foxtail images, 170 Redroot Pigweed images and 150 Giant Ragweed images.
Bounding box annotations were created for each image to prepare the dataset for training both image classification and object detection deep learning networks.
arXiv Detail & Related papers (2022-03-29T03:10:54Z) - Supervised learning for crop/weed classification based on color and
texture features [0.0]
This paper investigates the use of color and texture features for discrimination of Soybean crops and weeds.
Experiments were carried out on an image dataset of soybean crop obtained from an unmanned aerial vehicle (UAV).
arXiv Detail & Related papers (2021-06-19T22:31:54Z) - Potato Crop Stress Identification in Aerial Images using Deep
Learning-based Object Detection [60.83360138070649]
The paper presents an approach for analyzing aerial images of a potato crop using deep neural networks.
The main objective is to demonstrate automated spatial recognition of a healthy versus stressed crop at a plant level.
Experimental validation demonstrated the ability for distinguishing healthy and stressed plants in field images, achieving an average Dice coefficient of 0.74.
arXiv Detail & Related papers (2021-06-14T21:57:40Z) - A Survey of Deep Learning Techniques for Weed Detection from Images [4.96981595868944]
We review existing deep learning-based weed detection and classification techniques.
We find that most studies applied supervised learning techniques and achieved high classification accuracy.
Past experiments have already achieved high accuracy when a large amount of labelled data is available.
arXiv Detail & Related papers (2021-03-02T02:02:24Z) - A CNN Approach to Simultaneously Count Plants and Detect Plantation-Rows
from UAV Imagery [56.10033255997329]
We propose a novel deep learning method based on a Convolutional Neural Network (CNN)
It simultaneously detects and geolocates plantation-rows while counting their plants, accounting for highly dense plantation configurations.
The proposed method achieved state-of-the-art performance for counting and geolocating plants and plant-rows in UAV images from different types of crops.
arXiv Detail & Related papers (2020-12-31T18:51:17Z) - Two-View Fine-grained Classification of Plant Species [66.75915278733197]
We propose a novel method based on a two-view leaf image representation and a hierarchical classification strategy for fine-grained recognition of plant species.
A deep metric based on Siamese convolutional neural networks is used to reduce the dependence on a large number of training samples and make the method scalable to new plant species.
arXiv Detail & Related papers (2020-05-18T21:57:47Z) - Deep Transfer Learning For Plant Center Localization [19.322420819302263]
This paper investigates methods that estimate plant locations for a field-based crop using RGB aerial images captured using Unmanned Aerial Vehicles (UAVs)
Deep learning approaches provide promising capability for locating plants observed in RGB images, but they require large quantities of labeled data (ground truth) for training.
We propose a method for estimating plant centers by transferring an existing model to a new scenario using limited ground truth data.
arXiv Detail & Related papers (2020-04-29T06:29:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.