Weed Density and Distribution Estimation for Precision Agriculture using
Semi-Supervised Learning
- URL: http://arxiv.org/abs/2011.02193v2
- Date: Thu, 18 Feb 2021 14:05:01 GMT
- Title: Weed Density and Distribution Estimation for Precision Agriculture using
Semi-Supervised Learning
- Authors: Shantam Shorewala, Armaan Ashfaque, Sidharth R and Ujjwal Verma
- Abstract summary: We propose a deep learning-based semi-supervised approach for robust estimation of weed density and distribution.
In this work, the foreground vegetation pixels containing crops and weeds are first identified using a Convolutional Neural Network (CNN)-based unsupervised segmentation.
The weed-infected regions are then identified using a fine-tuned CNN, eliminating the need for designing hand-crafted features.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Uncontrolled growth of weeds can severely affect the crop yield and quality.
Unrestricted use of herbicides for weed removal alters biodiversity and causes
environmental pollution. Instead, identifying weed-infested regions can aid
selective chemical treatment of these regions. Advances in analyzing farm
images have resulted in solutions to identify weed plants. However, a majority
of these approaches are based on supervised learning methods, which require a
huge amount of manually annotated images. As a result, these supervised
approaches are economically infeasible for the individual farmer because of the
wide variety of plant species being cultivated. In this paper, we propose a
deep learning-based semi-supervised approach for robust estimation of weed
density and distribution across farmlands using only limited color images
acquired from autonomous robots. This weed density and distribution can be
useful in a site-specific weed management system for selective treatment of
infected areas using autonomous robots. In this work, the foreground vegetation
pixels containing crops and weeds are first identified using a Convolutional
Neural Network (CNN)-based unsupervised segmentation. Subsequently, the
weed-infected regions are identified using a fine-tuned CNN, eliminating the need
for designing hand-crafted features. The approach is validated on two datasets
of different crop/weed species: (1) the Crop Weed Field Image Dataset (CWFID),
which consists of carrot plant images, and (2) the Sugar Beets dataset. The
proposed method is able to localize weed-infested regions with a maximum recall
of 0.99 and estimate weed density with a maximum accuracy of 82.13%. Hence, the proposed
approach is shown to generalize to different plant species without the need for
extensive labeled data.
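The pipeline described above (unsupervised segmentation of vegetation pixels, followed by a fine-tuned CNN that labels vegetation regions as crop or weed, from which a weed density map is derived) can be sketched in a few lines of Python. The sketch below is not the authors' implementation: the unsupervised CNN segmentation stage is approximated by a simple excess-green (ExG) threshold, the fine-tuned classifier is a plain ResNet-18 patch classifier, and the patch size, threshold, and weed class index are illustrative assumptions.

```python
# Minimal sketch of the two-stage pipeline described in the abstract.
# NOT the authors' code: the unsupervised CNN segmentation is approximated
# here by an excess-green (ExG) threshold, and the fine-tuned CNN is a plain
# ResNet-18 patch classifier. PATCH, EXG_THRESHOLD, and the weed class index
# are illustrative assumptions.
import numpy as np
import torch
import torchvision

PATCH = 64           # assumed patch size in pixels
EXG_THRESHOLD = 0.1  # assumed vegetation threshold on the ExG index

def vegetation_mask(rgb: np.ndarray) -> np.ndarray:
    """Stage 1 stand-in: separate vegetation pixels from soil/background."""
    rgb = rgb.astype(np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b                 # excess-green index
    return exg > EXG_THRESHOLD            # boolean vegetation mask

def build_weed_classifier(num_classes: int = 2) -> torch.nn.Module:
    """Stage 2: CNN to be fine-tuned to label vegetation patches as crop or weed."""
    model = torchvision.models.resnet18(weights=None)  # load pre-trained weights in practice
    model.fc = torch.nn.Linear(model.fc.in_features, num_classes)
    return model

@torch.no_grad()
def weed_density_map(rgb: np.ndarray, model: torch.nn.Module) -> np.ndarray:
    """Classify vegetation patches and return a per-cell weed density map,
    i.e. the fraction of weed pixels in each PATCH x PATCH grid cell."""
    model.eval()
    mask = vegetation_mask(rgb)
    rows, cols = mask.shape[0] // PATCH, mask.shape[1] // PATCH
    density = np.zeros((rows, cols), dtype=np.float32)
    for i in range(rows):
        for j in range(cols):
            sl = np.s_[i * PATCH:(i + 1) * PATCH, j * PATCH:(j + 1) * PATCH]
            veg_pixels = int(mask[sl].sum())
            if veg_pixels == 0:
                continue  # no vegetation in this cell
            patch = torch.from_numpy(rgb[sl]).permute(2, 0, 1).float() / 255.0
            is_weed = model(patch.unsqueeze(0)).argmax(dim=1).item() == 1  # class 1 = weed (assumed)
            if is_weed:  # treat all vegetation in a weed-labelled patch as weed
                density[i, j] = veg_pixels / float(PATCH * PATCH)
    return density
```

In the paper both stages are CNN-based and only the classifier needs a small amount of labelled data; swapping the ExG threshold for a learned segmentation network and fine-tuning the classifier on a handful of annotated patches would recover the semi-supervised setting summarized above.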
Related papers
- Weed Detection using Convolutional Neural Network [0.0]
We use convolutional neural networks (CNNs) for weed detection in agricultural land.
We specifically investigate the application of two CNN layer types, Conv2d and dilated Conv2d, for weed detection in crop fields.
The suggested method extracts features from the input images using pre-trained models, which are subsequently fine-tuned for weed detection.
arXiv Detail & Related papers (2025-02-20T08:37:23Z) - WeedsGalore: A Multispectral and Multitemporal UAV-based Dataset for Crop and Weed Segmentation in Agricultural Maize Fields [0.7421845364041001]
Weeds are one of the major reasons for crop yield loss but current weeding practices fail to manage weeds in an efficient and targeted manner.
We present a novel dataset for semantic and instance segmentation of crops and weeds in agricultural maize fields.
arXiv Detail & Related papers (2025-02-18T18:13:19Z) - Towards Efficient and Intelligent Laser Weeding: Method and Dataset for Weed Stem Detection [51.65457287518379]
This study is the first empirical investigation of weed recognition for laser weeding.
We integrate the detection of crop and weed with the localization of weed stem into one end-to-end system.
The proposed system improves weeding accuracy by 6.7% and reduces energy cost by 32.3% compared to existing weed recognition systems.
arXiv Detail & Related papers (2025-02-10T08:42:46Z) - Exploiting Boundary Loss for the Hierarchical Panoptic Segmentation of Plants and Leaves [0.3659498819753633]
We propose a hierarchical panoptic segmentation method that simultaneously determines leaf count and locates weeds within an image.
Not only does this result in competitive performance, achieving a PQ+ of 81.89 on the standard training set, but we also demonstrate we can improve leaf-counting accuracy with our method.
arXiv Detail & Related papers (2024-12-31T16:23:58Z) - HarvestNet: A Dataset for Detecting Smallholder Farming Activity Using
Harvest Piles and Remote Sensing [50.4506590177605]
HarvestNet is a dataset for mapping the presence of farms in the Ethiopian regions of Tigray and Amhara during 2020-2023.
We introduce a new approach based on the detection of harvest piles characteristic of many smallholder systems.
We conclude that remote sensing of harvest piles can contribute to more timely and accurate cropland assessments in food insecure regions.
arXiv Detail & Related papers (2023-08-23T11:03:28Z) - End-to-end deep learning for directly estimating grape yield from
ground-based imagery [53.086864957064876]
This study demonstrates the application of proximal imaging combined with deep learning for yield estimation in vineyards.
Three model architectures were tested: object detection, CNN regression, and transformer models.
The study showed the applicability of proximal imaging and deep learning for prediction of grapevine yield on a large scale.
arXiv Detail & Related papers (2022-08-04T01:34:46Z) - Supervised learning for crop/weed classification based on color and
texture features [0.0]
This paper investigates the use of color and texture features for discrimination of Soybean crops and weeds.
Experiments were carried out on an image dataset of soybean crop obtained from an unmanned aerial vehicle (UAV).
arXiv Detail & Related papers (2021-06-19T22:31:54Z) - Potato Crop Stress Identification in Aerial Images using Deep
Learning-based Object Detection [60.83360138070649]
The paper presents an approach for analyzing aerial images of a potato crop using deep neural networks.
The main objective is to demonstrate automated spatial recognition of a healthy versus stressed crop at a plant level.
Experimental validation demonstrated the ability for distinguishing healthy and stressed plants in field images, achieving an average Dice coefficient of 0.74.
arXiv Detail & Related papers (2021-06-14T21:57:40Z) - A CNN Approach to Simultaneously Count Plants and Detect Plantation-Rows
from UAV Imagery [56.10033255997329]
We propose a novel deep learning method based on a Convolutional Neural Network (CNN)
It simultaneously detects and geolocates plantation-rows while counting their plants, considering highly dense plantation configurations.
The proposed method achieved state-of-the-art performance for counting and geolocating plants and plant-rows in UAV images from different types of crops.
arXiv Detail & Related papers (2020-12-31T18:51:17Z) - Two-View Fine-grained Classification of Plant Species [66.75915278733197]
We propose a novel method based on a two-view leaf image representation and a hierarchical classification strategy for fine-grained recognition of plant species.
A deep metric based on Siamese convolutional neural networks is used to reduce the dependence on a large number of training samples and make the method scalable to new plant species.
arXiv Detail & Related papers (2020-05-18T21:57:47Z) - Deep Transfer Learning For Plant Center Localization [19.322420819302263]
This paper investigates methods that estimate plant locations for a field-based crop using RGB aerial images captured by Unmanned Aerial Vehicles (UAVs).
Deep learning approaches provide promising capability for locating plants observed in RGB images, but they require large quantities of labeled data (ground truth) for training.
We propose a method for estimating plant centers by transferring an existing model to a new scenario using limited ground truth data.
arXiv Detail & Related papers (2020-04-29T06:29:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.