Transferring learned patterns from ground-based field imagery to predict
UAV-based imagery for crop and weed semantic segmentation in precision crop
farming
- URL: http://arxiv.org/abs/2210.11545v1
- Date: Thu, 20 Oct 2022 19:25:06 GMT
- Title: Transferring learned patterns from ground-based field imagery to predict
UAV-based imagery for crop and weed semantic segmentation in precision crop
farming
- Authors: Junfeng Gao, Wenzhi Liao, David Nuyttens, Peter Lootens, Erik
Alexandersson, Jan Pieters
- Abstract summary: We have developed a deep convolutional network that predicts weed segmentation in both ground-based field images and aerial images from UAVs.
The network learning process is visualized by feature maps at shallow and deep layers.
The study shows that the developed deep convolutional neural network could be used to classify weeds from both field and aerial images.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Weed and crop segmentation is becoming an increasingly integral part of
precision farming that leverages the current computer vision and deep learning
technologies. Research has been extensively carried out based on images
captured with a camera from various platforms. Unmanned aerial vehicles (UAVs)
and ground-based vehicles including agricultural robots are the two popular
platforms for data collection in fields. They all contribute to site-specific
weed management (SSWM) to maintain crop yield. Currently, the data from these
two platforms is processed separately, even though it shares the same semantic
objects (weeds and crops). In this paper, we develop a deep convolutional
network that predicts weed segmentation and mapping in both field images and
aerial images from UAVs, with only field images provided in the training phase.
The network learning process is visualized by feature maps at shallow and deep
layers. The results show that the mean intersection over union (IoU) values of
the segmentation for the crop (maize), weeds, and soil background with the
developed model on the field dataset are 0.744, 0.577, and 0.979, respectively;
when the same model is applied to aerial images from a UAV, the IoU values for
the crop (maize), weeds, and soil background are 0.596, 0.407, and 0.875,
respectively. To estimate the effect on the use of plant protection agents, we
quantify the relationship between the herbicide spraying saving rate and the
grid size (spraying resolution) based on the predicted weed map. The spraying
saving rate reaches up to 90% at a spraying resolution of 1.78 × 1.78 cm². The
study shows that the developed deep convolutional
neural network could be used to classify weeds from both field and aerial
images and delivers satisfactory results.
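The two quantitative pieces of the abstract, per-class IoU and the herbicide spraying saving rate over a grid, can be sketched as follows. This is a minimal illustration, not the authors' code; the class labels (0 = soil, 1 = maize, 2 = weed) and the grid-cell size in pixels are assumptions, and converting a cell size such as 1.78 × 1.78 cm² to pixels would depend on the image's ground sampling distance.

```python
import numpy as np

def class_iou(pred, truth, cls):
    """Intersection over union (IoU) for a single class label."""
    p, t = pred == cls, truth == cls
    inter = np.logical_and(p, t).sum()
    union = np.logical_or(p, t).sum()
    return inter / union if union else float("nan")

def spray_saving_rate(weed_mask, grid_px):
    """Fraction of grid cells that need no herbicide.

    weed_mask: boolean array, True where a weed is predicted.
    grid_px:   side length of one spraying-grid cell in pixels
               (assumed mapping to a ground resolution such as
               1.78 x 1.78 cm, depending on image resolution).
    Only cells containing at least one weed pixel are sprayed.
    """
    h, w = weed_mask.shape
    sprayed = total = 0
    for y in range(0, h, grid_px):
        for x in range(0, w, grid_px):
            total += 1
            if weed_mask[y:y + grid_px, x:x + grid_px].any():
                sprayed += 1
    return 1.0 - sprayed / total

# Toy example: one small weed patch in a 100 x 100 map with a
# 10-pixel grid -> only 1 of 100 cells is sprayed.
mask = np.zeros((100, 100), dtype=bool)
mask[10:12, 10:12] = True
print(spray_saving_rate(mask, 10))  # prints 0.99
```

As the grid is made finer (smaller `grid_px`), fewer weed-free pixels are swept into sprayed cells, which is why the reported saving rate rises with spraying resolution.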
Related papers
- SatSynth: Augmenting Image-Mask Pairs through Diffusion Models for Aerial Semantic Segmentation (2024-03-25)
  We explore the potential of generative image diffusion to address the scarcity of annotated data in earth observation tasks. To the best of our knowledge, we are the first to generate both images and corresponding masks for satellite segmentation.
- HarvestNet: A Dataset for Detecting Smallholder Farming Activity Using Harvest Piles and Remote Sensing (2023-08-23)
  HarvestNet is a dataset for mapping the presence of farms in the Ethiopian regions of Tigray and Amhara during 2020-2023. We introduce a new approach based on the detection of harvest piles characteristic of many smallholder systems. We conclude that remote sensing of harvest piles can contribute to more timely and accurate cropland assessments in food-insecure regions.
- End-to-end deep learning for directly estimating grape yield from ground-based imagery (2022-08-04)
  This study demonstrates the application of proximal imaging combined with deep learning for yield estimation in vineyards. Three model architectures were tested: object detection, CNN regression, and transformer models. The study showed the applicability of proximal imaging and deep learning for large-scale prediction of grapevine yield.
- Agricultural Plant Cataloging and Establishment of a Data Framework from UAV-based Crop Images by Computer Vision (2022-01-08)
  We present a hands-on workflow for the automated temporal and spatial identification and individualization of crop plants from UAV images. The presented approach significantly improves the analysis and interpretation of UAV data in agriculture.
- Development of Automatic Tree Counting Software from UAV Based Aerial Images With Machine Learning (2022-01-07)
  This study aims to automatically count trees in designated areas of the Siirt University campus from high-resolution images obtained by UAV. Images obtained at 30 m height with 20% overlap were stitched offline at the ground station using Adobe Photoshop's photo-merge tool.
- Weed Recognition using Deep Learning Techniques on Class-imbalanced Imagery (2021-12-15)
  We investigated five state-of-the-art deep neural networks and evaluated their performance for weed recognition. VGG16 performed best on small-scale datasets, while ResNet-50 outperformed the other deep networks on the large combined dataset.
- Potato Crop Stress Identification in Aerial Images using Deep Learning-based Object Detection (2021-06-14)
  The paper presents an approach for analyzing aerial images of a potato crop using deep neural networks. The main objective is to demonstrate automated spatial recognition of healthy versus stressed crops at the plant level. Experimental validation demonstrated the ability to distinguish healthy and stressed plants in field images, achieving an average Dice coefficient of 0.74.
- A CNN Approach to Simultaneously Count Plants and Detect Plantation-Rows from UAV Imagery (2020-12-31)
  We propose a novel deep learning method based on a convolutional neural network (CNN) that simultaneously detects and geolocates plantation rows while counting their plants in highly dense plantation configurations. The proposed method achieved state-of-the-art performance for counting and geolocating plants and plant rows in UAV images of different crop types.
- Weed Density and Distribution Estimation for Precision Agriculture using Semi-Supervised Learning (2020-11-04)
  We propose a deep learning-based semi-supervised approach for robust estimation of weed density and distribution. Foreground vegetation pixels containing crops and weeds are first identified using CNN-based unsupervised segmentation. The weed-infested regions are then identified with a fine-tuned CNN, eliminating the need for hand-crafted features.
- Agriculture-Vision: A Large Aerial Image Database for Agricultural Pattern Analysis (2020-01-05)
  We present Agriculture-Vision: a large-scale aerial farmland image dataset for semantic segmentation of agricultural patterns. Each image consists of RGB and near-infrared (NIR) channels with resolution as high as 10 cm per pixel. We annotate nine types of field anomaly patterns that are most important to farmers.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.