Robotic System with AI for Real Time Weed Detection, Canopy Aware Spraying, and Droplet Pattern Evaluation
- URL: http://arxiv.org/abs/2507.05432v1
- Date: Mon, 07 Jul 2025 19:27:29 GMT
- Title: Robotic System with AI for Real Time Weed Detection, Canopy Aware Spraying, and Droplet Pattern Evaluation
- Authors: Inayat Rasool, Pappu Kumar Yadav, Amee Parmar, Hasan Mirzakhaninafchi, Rikesh Budhathoki, Zain Ul Abideen Usmani, Supriya Paudel, Ivan Perez Olivera, Eric Jone,
- Abstract summary: We develop a vision-guided, AI-driven variable-rate sprayer system capable of detecting weed presence, estimating canopy size, and dynamically adjusting nozzle activation in real time. The system integrates lightweight YOLO11n and YOLO11n-seg deep learning models, deployed on an NVIDIA Jetson Orin Nano for onboard inference. Future work will focus on expanding the detection capabilities to include three common weed species in South Dakota.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Uniform and excessive herbicide application in modern agriculture contributes to increased input costs, environmental pollution, and the emergence of herbicide-resistant weeds. To address these challenges, we developed a vision-guided, AI-driven variable-rate sprayer system capable of detecting weed presence, estimating canopy size, and dynamically adjusting nozzle activation in real time. The system integrates lightweight YOLO11n and YOLO11n-seg deep learning models, deployed on an NVIDIA Jetson Orin Nano for onboard inference, and uses an Arduino Uno-based relay interface to control solenoid-actuated nozzles based on canopy segmentation results. Indoor trials were conducted using 15 potted Hibiscus rosa-sinensis plants of varying canopy sizes to simulate a range of weed patch scenarios. The YOLO11n model achieved a mean average precision (mAP@50) of 0.98, with a precision of 0.99 and a recall close to 1.0. The YOLO11n-seg segmentation model achieved a mAP@50 of 0.48, a precision of 0.55, and a recall of 0.52. System performance was validated using water-sensitive paper, which showed an average spray coverage of 24.22% in zones where canopy was present. An upward trend in mean spray coverage, from 16.22% for small canopies to 21.46% and 21.65% for medium and large canopies, respectively, demonstrated the system's capability to adjust spray output based on canopy size in real time. These results highlight the potential of combining real-time deep learning with low-cost embedded hardware for selective herbicide application. Future work will focus on expanding the detection capabilities to include three common weed species in South Dakota: water hemp (Amaranthus tuberculatus), kochia (Bassia scoparia), and foxtail (Setaria spp.), followed by further validation in both indoor and field trials within soybean and corn production systems.
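To make the described pipeline concrete, below is a minimal sketch of the detect-segment-actuate loop: a YOLO11n-seg model estimates canopy coverage in each camera frame and a command is sent to the Arduino relay interface accordingly. It assumes the Ultralytics Python API and the pyserial package, and uses placeholder canopy-size thresholds, command bytes, and serial port; it illustrates the architecture rather than reproducing the authors' implementation.

```python
# Minimal sketch of the detect -> segment -> actuate loop (illustrative only).
# Assumptions not taken from the paper: Ultralytics Python API, pyserial for the
# Arduino Uno relay board, and placeholder thresholds, command bytes, and port.
import cv2
import serial
from ultralytics import YOLO

seg_model = YOLO("yolo11n-seg.pt")              # canopy segmentation model
arduino = serial.Serial("/dev/ttyACM0", 9600)   # relay interface for solenoid nozzles
cap = cv2.VideoCapture(0)

# Hypothetical canopy-size bands (fraction of frame covered) -> nozzle command byte
BANDS = [(0.02, b"1"), (0.10, b"2"), (1.01, b"3")]  # small, medium, large canopy

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = seg_model(frame, verbose=False)[0]
    if result.masks is None:                    # no canopy detected in this frame
        arduino.write(b"0")                     # keep nozzles off
        continue
    # Fraction of the image covered by the union of all predicted canopy masks
    canopy_frac = float(result.masks.data.any(dim=0).float().mean())
    for upper_bound, command in BANDS:
        if canopy_frac < upper_bound:
            arduino.write(command)              # activate nozzles for this size band
            break
```

On the Arduino side, the firmware would map each received byte to a relay pattern for the solenoid-actuated nozzles; the exact protocol and duty cycles used in the study are not specified here.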
Related papers
- SBP-YOLO: A Lightweight Real-Time Model for Detecting Speed Bumps and Potholes [30.847931539927753]
This paper proposes SBP-YOLO, a lightweight detection framework based on YOLOv11 and optimized for embedded deployment. The model integrates GhostConv for efficient computation, VoVGSCSPC for multi-scale feature enhancement, and an Efficiency Detection Head (LEDH) to reduce early-stage feature processing costs.
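For context, GhostConv (referenced above) reduces computation by generating part of the output channels with a full convolution and the rest with cheap depthwise operations. Below is a rough PyTorch sketch of that idea; the kernel sizes, activation, and ratio are illustrative and not taken from the SBP-YOLO paper.

```python
import torch
import torch.nn as nn

class GhostConv(nn.Module):
    """Sketch of a GhostConv block: a primary convolution produces half of the
    output channels; cheap depthwise convolutions generate the 'ghost' half."""
    def __init__(self, c_in, c_out, k=1, ratio=2):
        super().__init__()
        c_primary = c_out // ratio
        self.primary = nn.Sequential(
            nn.Conv2d(c_in, c_primary, k, padding=k // 2, bias=False),
            nn.BatchNorm2d(c_primary),
            nn.SiLU(),
        )
        self.cheap = nn.Sequential(  # depthwise "cheap operation"
            nn.Conv2d(c_primary, c_out - c_primary, 5, padding=2,
                      groups=c_primary, bias=False),
            nn.BatchNorm2d(c_out - c_primary),
            nn.SiLU(),
        )

    def forward(self, x):
        primary = self.primary(x)
        return torch.cat([primary, self.cheap(primary)], dim=1)

# Example: stand in for a standard 3x3 convolution with the same output shape
block = GhostConv(64, 128, k=3)
out = block(torch.randn(1, 64, 80, 80))   # -> torch.Size([1, 128, 80, 80])
```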
arXiv Detail & Related papers (2025-08-02T12:15:08Z)
- An Improved YOLOv8 Approach for Small Target Detection of Rice Spikelet Flowering in Field Environments [1.0288898584996287]
This study proposes a rice spikelet flowering recognition method based on an improved YOLOv8 object detection model. BiFPN replaces the original PANet structure to enhance feature fusion and improve multi-scale feature utilization. Given the lack of publicly available datasets for rice spikelet flowering in field conditions, a high-resolution RGB camera and data augmentation techniques are used.
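The BiFPN fusion mentioned above replaces plain concatenation or addition with learned, fast-normalized weights over the incoming feature maps. A minimal PyTorch sketch of one such fusion node follows; it shows only the weighted-sum step, not the full bidirectional pyramid, and the epsilon value is illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightedFusion(nn.Module):
    """One BiFPN-style fusion node: out = sum_i(w_i * x_i) / (sum_j w_j + eps),
    with non-negative learnable weights (ReLU), i.e. 'fast normalized fusion'."""
    def __init__(self, n_inputs, eps=1e-4):
        super().__init__()
        self.weights = nn.Parameter(torch.ones(n_inputs))
        self.eps = eps

    def forward(self, features):            # features: list of same-shape tensors
        w = F.relu(self.weights)
        w = w / (w.sum() + self.eps)
        return sum(wi * f for wi, f in zip(w, features))

# Example: fuse a top-down feature with a lateral feature of the same shape
fuse = WeightedFusion(n_inputs=2)
p4 = fuse([torch.randn(1, 256, 40, 40), torch.randn(1, 256, 40, 40)])
```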
arXiv Detail & Related papers (2025-07-28T04:01:29Z)
- Improving Lightweight Weed Detection via Knowledge Distillation [0.0]
We investigate Channel-wise Knowledge Distillation (CWD) and Masked Generative Distillation (MGD) to enhance the performance of lightweight models for real-time smart spraying systems. CWD and MGD effectively transfer knowledge from the teacher to the student model. We validate real-time deployment feasibility by evaluating the student YOLO11n model on Jetson Orin Nano and Raspberry Pi 5 embedded devices.
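As a rough illustration of channel-wise distillation, the sketch below turns each channel of the teacher and student feature maps into a spatial probability distribution and penalizes their KL divergence. The temperature and normalization choices are illustrative, and this is not claimed to be the exact loss used in the paper.

```python
import torch
import torch.nn.functional as F

def cwd_loss(student_feat, teacher_feat, tau=4.0):
    """Channel-wise distillation sketch: each channel is softmax-normalized over
    its spatial locations, then the student's per-channel distribution is pushed
    toward the teacher's via KL divergence (scaled by tau^2, as in soft-label KD)."""
    n, c, h, w = student_feat.shape
    s = F.log_softmax(student_feat.reshape(n, c, -1) / tau, dim=-1)
    t = F.softmax(teacher_feat.reshape(n, c, -1) / tau, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * tau * tau

# Example with dummy neck features of matching shape (a 1x1 conv adapter would be
# needed if the student's channel count differed from the teacher's):
loss = cwd_loss(torch.randn(2, 128, 20, 20), torch.randn(2, 128, 20, 20))
```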
arXiv Detail & Related papers (2025-07-16T15:38:07Z)
- Exploring Model Quantization in GenAI-based Image Inpainting and Detection of Arable Plants [0.0]
We propose a framework that leverages Stable Diffusion-based inpainting to augment training data progressively in 10% increments, up to an additional 200%. Our approach is evaluated on two state-of-the-art object detection models, YOLO11(l) and RT-DETR(l), using the mAP50 metric to assess detection performance. Deployment of the downstream models on the Jetson Orin Nano demonstrates the practical viability of our framework in resource-constrained environments.
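A minimal sketch of how Stable Diffusion inpainting can synthesize extra training plants is shown below, using the Hugging Face diffusers API. The model id, prompt, and mask convention are illustrative assumptions; the paper's exact generation settings and 10%-increment scheduling are not reproduced here.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Illustrative model id and prompt; not the paper's exact setup.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting", torch_dtype=torch.float16
).to("cuda")

def synthesize_plant(image_path: str, mask_path: str,
                     prompt: str = "a young arable weed plant on bare soil") -> Image.Image:
    """Inpaint a new plant into the white region of the mask and return the result,
    which can then be added (with its label) to the detector's training set."""
    image = Image.open(image_path).convert("RGB").resize((512, 512))
    mask = Image.open(mask_path).convert("L").resize((512, 512))  # white = region to fill
    return pipe(prompt=prompt, image=image, mask_image=mask).images[0]

# Progressive augmentation (schematic): add synthetic images in 10% steps of the
# original dataset size, retraining the detector after each step.
# for step in range(1, 21):            # 10% ... 200%
#     n_new = int(0.10 * n_original)
#     ...generate n_new images with synthesize_plant() and retrain...
```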
arXiv Detail & Related papers (2025-03-04T09:05:01Z)
- WeedVision: Multi-Stage Growth and Classification of Weeds using DETR and RetinaNet for Precision Agriculture [0.0]
This research uses object detection models to identify and classify 16 weed species of economic concern across 174 classes. A robust dataset comprising 203,567 images was developed, meticulously labeled by species and growth stage. RetinaNet demonstrated superior performance, achieving a mean Average Precision (mAP) of 0.907 on the training set and 0.904 on the test set.
arXiv Detail & Related papers (2025-02-16T20:49:22Z)
- SCott: Accelerating Diffusion Models with Stochastic Consistency Distillation [74.32186107058382]
We propose Stochastic Consistency Distillation (SCott) to enable accelerated text-to-image generation. SCott distills the ordinary differential equation solver-based sampling process of a pre-trained teacher model into a student. On the MSCOCO-2017 5K dataset with a Stable Diffusion-V1.5 teacher, SCott achieves an FID of 21.9 with 2 sampling steps, surpassing that of the 1-step InstaFlow (23.4) and the 4-step UFOGen (22.1).
arXiv Detail & Related papers (2024-03-03T13:08:32Z)
- Ultra-low Power Deep Learning-based Monocular Relative Localization Onboard Nano-quadrotors [64.68349896377629]
This work presents a novel autonomous end-to-end system that addresses monocular relative localization of two peer nano-drones through deep neural networks (DNNs).
To cope with the ultra-constrained nano-drone platform, we propose a vertically-integrated framework, including dataset augmentation, quantization, and system optimizations.
Experimental results show that our DNN can precisely localize a 10 cm target nano-drone using only low-resolution monochrome images, at distances of up to 2 m.
arXiv Detail & Related papers (2023-03-03T14:14:08Z)
- Computer Vision for Volunteer Cotton Detection in a Corn Field with UAS Remote Sensing Imagery and Spot Spray Applications [5.293431074053198]
To control boll weevil (Anthonomus grandis L.) pest re-infestation in cotton fields, the current practices of volunteer cotton (VC) plant detection involve manual field scouting at the edges of fields.
We present the application of YOLOv5m on radiometrically and gamma-corrected, low-resolution (1.2 megapixel) multispectral imagery for detecting and locating VC plants growing in the middle of a cornfield at the tasseling (VT) growth stage.
arXiv Detail & Related papers (2022-07-15T08:13:20Z)
- Deep-CNN based Robotic Multi-Class Under-Canopy Weed Control in Precision Farming [2.6085535710135654]
Real-time multi-class weed identification enables species-specific treatment of weeds and significantly reduces the amount of herbicide use.
Here, we present a baseline for classification performance using five benchmark CNN models.
We deploy MobileNetV2 onto our own compact autonomous robot SAMBot for real-time weed detection.
arXiv Detail & Related papers (2021-12-28T03:51:55Z)
- A Multi-Stage model based on YOLOv3 for defect detection in PV panels based on IR and Visible Imaging by Unmanned Aerial Vehicle [65.99880594435643]
We propose a novel model to detect panel defects in aerial images captured by an unmanned aerial vehicle.
The model combines detections of panels and defects to refine its accuracy.
The proposed model has been validated on two large PV plants in the south of Italy.
arXiv Detail & Related papers (2021-11-23T08:04:32Z)
- Performance Evaluation of Deep Transfer Learning on Multiclass Identification of Common Weed Species in Cotton Production Systems [3.427330019009861]
This paper presents the first comprehensive evaluation of deep transfer learning (DTL) for identifying weeds specific to cotton production systems in the southern United States.
A new dataset for weed identification was created, consisting of 5187 color images of 15 weed classes collected under natural lighting conditions and at varied weed growth stages.
DTL achieved high classification accuracy, with F1 scores exceeding 95%, while requiring reasonably short training times (less than 2.5 hours) across models.
arXiv Detail & Related papers (2021-10-11T01:51:48Z)
- A CNN Approach to Simultaneously Count Plants and Detect Plantation-Rows from UAV Imagery [56.10033255997329]
We propose a novel deep learning method based on a Convolutional Neural Network (CNN). It simultaneously detects and geolocates plantation-rows while counting their plants, even in highly dense plantation configurations.
The proposed method achieved state-of-the-art performance for counting and geolocating plants and plant-rows in UAV images from different types of crops.
arXiv Detail & Related papers (2020-12-31T18:51:17Z)
- Estimating Crop Primary Productivity with Sentinel-2 and Landsat 8 using Machine Learning Methods Trained with Radiative Transfer Simulations [58.17039841385472]
We take advantage of all parallel developments in mechanistic modeling and satellite data availability for advanced monitoring of crop productivity.
Our model successfully estimates gross primary productivity across a variety of C3 crop types and environmental conditions even though it does not use any local information from the corresponding sites.
This highlights its potential to map crop productivity from new satellite sensors at a global scale with the help of current Earth observation cloud computing platforms.
arXiv Detail & Related papers (2020-12-07T16:23:13Z)
- CovidDeep: SARS-CoV-2/COVID-19 Test Based on Wearable Medical Sensors and Efficient Neural Networks [51.589769497681175]
The novel coronavirus (SARS-CoV-2) has led to a pandemic.
The current testing regime based on Reverse Transcription-Polymerase Chain Reaction for SARS-CoV-2 has been unable to keep up with testing demands.
We propose a framework called CovidDeep that combines efficient DNNs with commercially available wearable medical sensors (WMSs) for pervasive testing of the virus.
arXiv Detail & Related papers (2020-07-20T21:47:28Z)