High-Throughput Image-Based Plant Stand Count Estimation Using
Convolutional Neural Networks
- URL: http://arxiv.org/abs/2010.12552v1
- Date: Fri, 23 Oct 2020 17:28:29 GMT
- Title: High-Throughput Image-Based Plant Stand Count Estimation Using
Convolutional Neural Networks
- Authors: Saeed Khaki, Hieu Pham, Ye Han, Wade Kent and Lizhi Wang
- Abstract summary: We propose a deep learning based approach, named DeepStand, for image-based corn stand counting at early phenological stages.
Our proposed method can successfully count corn stands and outperform other state-of-the-art methods.
- Score: 23.67862313758282
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The future landscape of modern farming and plant breeding is rapidly changing
due to the complex needs of our society. The explosion of collectable data has
started a revolution in agriculture to the point where innovation must occur.
To a commercial organization, the accurate and efficient collection of
information is necessary to ensure that optimal decisions are made at key
points of the breeding cycle. However, due to the sheer size of a breeding
program and current resource limitations, collecting precise data on
individual plants is not possible. In particular, efficient phenotyping of
crops to record their color, shape, chemical properties, disease susceptibility,
etc. is severely limited by labor requirements and, oftentimes, expert
domain knowledge. In this paper, we propose a deep learning based approach,
named DeepStand, for image-based corn stand counting at early phenological
stages. The proposed method adopts a truncated VGG-16 network as a backbone
feature extractor and merges multiple feature maps with different scales to
make the network robust against scale variation. Our extensive computational
experiments suggest that our proposed method can successfully count corn stands
and outperform other state-of-the-art methods. We intend this work to be used
by the larger agricultural community as a way to enable high-throughput
phenotyping without extensive time and labor requirements.
Related papers
- Generating Diverse Agricultural Data for Vision-Based Farming Applications [74.79409721178489]
This model is capable of simulating distinct growth stages of plants, diverse soil conditions, and randomized field arrangements under varying lighting conditions.
Our dataset includes 12,000 images with semantic labels, offering a comprehensive resource for computer vision tasks in precision agriculture.
arXiv Detail & Related papers (2024-03-27T08:42:47Z)
- BonnBeetClouds3D: A Dataset Towards Point Cloud-based Organ-level Phenotyping of Sugar Beet Plants under Field Conditions [30.27773980916216]
Agricultural production is facing severe challenges in the next decades induced by climate change and the need for sustainability.
Advancements in field management through non-chemical weeding by robots in combination with monitoring of crops by autonomous unmanned aerial vehicles (UAVs) are helpful to address these challenges.
The analysis of plant traits, called phenotyping, is an essential activity in plant breeding; however, it involves a great amount of manual labor.
arXiv Detail & Related papers (2023-12-22T14:06:44Z)
- Semantic Image Segmentation with Deep Learning for Vine Leaf Phenotyping [59.0626764544669]
In this study, we use Deep Learning methods to semantically segment grapevine leaves images in order to develop an automated object detection system for leaf phenotyping.
Our work contributes to plant lifecycle monitoring through which dynamic traits such as growth and development can be captured and quantified.
arXiv Detail & Related papers (2022-10-24T14:37:09Z)
- Semantic Segmentation of Vegetation in Remote Sensing Imagery Using Deep Learning [77.34726150561087]
We propose an approach for creating a multi-modal and large-temporal dataset comprised of publicly available Remote Sensing data.
We use Convolutional Neural Networks (CNN) models that are capable of separating different classes of vegetation.
arXiv Detail & Related papers (2022-09-28T18:51:59Z)
- Estimation of Crop Areas Using Deep Learning and Conventional Programming [0.0]
We have considered as a case study one of the most recognized companies in the planting and harvesting of sugar cane in Ecuador.
The strategy combines a Generative Adversarial Neural Network (GAN) that is trained on a dataset of aerial photographs of sugar cane plots to distinguish populated or unpopulated crop areas.
The experiments performed demonstrate a significant improvement in the quality of the aerial photographs.
arXiv Detail & Related papers (2022-07-25T16:22:55Z)
- An Applied Deep Learning Approach for Estimating Soybean Relative Maturity from UAV Imagery to Aid Plant Breeding Decisions [7.4022258821325115]
We develop a robust and automatic approach for estimating the relative maturity of soybeans using a time series of UAV images.
An end-to-end hybrid model combining Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) is proposed to extract features.
Results suggest the effectiveness of our proposed CNN-LSTM model compared to the local regression method.
arXiv Detail & Related papers (2021-08-02T14:53:58Z)
- Potato Crop Stress Identification in Aerial Images using Deep Learning-based Object Detection [60.83360138070649]
The paper presents an approach for analyzing aerial images of a potato crop using deep neural networks.
The main objective is to demonstrate automated spatial recognition of a healthy versus stressed crop at a plant level.
Experimental validation demonstrated the ability for distinguishing healthy and stressed plants in field images, achieving an average Dice coefficient of 0.74.
arXiv Detail & Related papers (2021-06-14T21:57:40Z)
- WheatNet: A Lightweight Convolutional Neural Network for High-throughput Image-based Wheat Head Detection and Counting [12.735055892742647]
We propose a novel deep learning framework to accurately and efficiently count wheat heads to aid in the gathering of real-time data for decision making.
We call our model WheatNet and show that our approach is robust and accurate for a wide range of environmental conditions of the wheat field.
Our proposed method achieves an MAE and RMSE of 3.85 and 5.19 in our wheat head counting task, respectively, while having significantly fewer parameters when compared to other state-of-the-art methods.
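The MAE and RMSE figures quoted for these counting tasks are the standard count-regression metrics. As a reference, a minimal sketch with hypothetical ground-truth and predicted counts:

```python
import numpy as np

# Hypothetical ground-truth and predicted head counts for five field plots.
y_true = np.array([52, 48, 61, 55, 47], dtype=float)
y_pred = np.array([50, 49, 65, 52, 47], dtype=float)

mae = np.mean(np.abs(y_true - y_pred))           # mean absolute error
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))  # root mean squared error
print(mae, rmse)
```

MAE weights all errors linearly, while RMSE penalizes large miscounts more heavily, which is why both are usually reported together.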
arXiv Detail & Related papers (2021-03-17T02:38:58Z)
- Towards an Automatic Analysis of CHO-K1 Suspension Growth in Microfluidic Single-cell Cultivation [63.94623495501023]
We propose a novel Machine Learning architecture, which allows us to infuse a deep neural network with human-powered abstraction on the level of data.
Specifically, we train a generative model simultaneously on natural and synthetic data, so that it learns a shared representation, from which a target variable, such as the cell count, can be reliably estimated.
arXiv Detail & Related papers (2020-10-20T08:36:51Z)
- DeepCorn: A Semi-Supervised Deep Learning Method for High-Throughput Image-Based Corn Kernel Counting and Yield Estimation [20.829106642703277]
We propose a novel deep learning method for counting on-ear corn kernels in-field to aid in the gathering of real-time data.
DeepCorn estimates the density of corn kernels in an image of corn ears and predicts the number of kernels based on the estimated density map.
Our proposed method achieves the MAE and RMSE of 41.36 and 60.27 in the corn kernel counting task, respectively.
arXiv Detail & Related papers (2020-07-20T23:00:39Z)
- Two-View Fine-grained Classification of Plant Species [66.75915278733197]
We propose a novel method based on a two-view leaf image representation and a hierarchical classification strategy for fine-grained recognition of plant species.
A deep metric based on Siamese convolutional neural networks is used to reduce the dependence on a large number of training samples and make the method scalable to new plant species.
arXiv Detail & Related papers (2020-05-18T21:57:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.