Position-Agnostic Autonomous Navigation in Vineyards with Deep
Reinforcement Learning
- URL: http://arxiv.org/abs/2206.14155v1
- Date: Tue, 28 Jun 2022 17:03:37 GMT
- Title: Position-Agnostic Autonomous Navigation in Vineyards with Deep
Reinforcement Learning
- Authors: Mauro Martini, Simone Cerrato, Francesco Salvetti, Simone Angarano,
Marcello Chiaberge
- Abstract summary: We propose a cutting-edge, lightweight solution to the problem of autonomous vineyard navigation that does not rely on precise localization data and replaces task-tailored algorithms with a flexible learning-based approach.
We train an end-to-end sensorimotor agent which directly maps noisy depth images and position-agnostic robot state information to velocity commands and guides the robot to the end of a row, continuously adjusting its heading for a collision-free central trajectory.
- Score: 1.2599533416395767
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Precision agriculture is rapidly attracting research to efficiently introduce
automation and robotics solutions to support agricultural activities. Robotic
navigation in vineyards and orchards offers competitive advantages in
autonomously monitoring and easily accessing crops for harvesting, spraying, and
other necessary but time-consuming tasks. Current autonomous navigation
algorithms exploit expensive sensors and incur heavy computational costs for
data processing. Moreover, vineyard rows represent a challenging
outdoor scenario where GPS and Visual Odometry techniques often struggle to
provide reliable positioning information. In this work, we combine Edge AI with
Deep Reinforcement Learning to propose a cutting-edge, lightweight solution to
the problem of autonomous vineyard navigation that does not rely on precise
localization data and replaces task-tailored algorithms with a flexible
learning-based approach. We train an end-to-end sensorimotor agent that
directly maps noisy depth images and position-agnostic robot state information
to velocity commands and guides the robot to the end of a row, continuously
adjusting its heading for a collision-free central trajectory. Our extensive
experimentation in realistic simulated vineyards demonstrates the effectiveness
of our solution and the generalization capabilities of our agent.
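For a concrete picture of the agent described above, the sketch below (PyTorch) shows the general shape of an end-to-end policy mapping a noisy depth image plus a position-agnostic robot state to velocity commands; the layer sizes, the 4-D state vector, and the normalized velocity output are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch of an end-to-end sensorimotor policy: depth image + robot state -> velocity.
# Layer sizes, state dimension, and the normalized output are illustrative assumptions.
import torch
import torch.nn as nn

class DepthStatePolicy(nn.Module):
    def __init__(self, state_dim: int = 4):
        super().__init__()
        # Small CNN encoder for a single-channel (depth) image.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP head fuses image features with position-agnostic state
        # (e.g., previous velocities and heading rate, not absolute position).
        self.head = nn.Sequential(
            nn.Linear(64 + state_dim, 128), nn.ReLU(),
            nn.Linear(128, 2), nn.Tanh(),  # normalized (linear, angular) velocity
        )

    def forward(self, depth: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        features = self.encoder(depth)
        return self.head(torch.cat([features, state], dim=-1))

# Example: one noisy 64x64 depth frame and a 4-D robot state produce a velocity command.
policy = DepthStatePolicy(state_dim=4)
v_cmd = policy(torch.rand(1, 1, 64, 64), torch.rand(1, 4))
print(v_cmd.shape)  # torch.Size([1, 2])
```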
Related papers
- Enhancing Navigation Benchmarking and Perception Data Generation for
Row-based Crops in Simulation [0.3518016233072556]
This paper presents a synthetic dataset to train semantic segmentation networks and a collection of virtual scenarios for a fast evaluation of navigation algorithms.
An automatic parametric approach is developed to explore different field geometries and features.
The simulation framework and the dataset have been evaluated by training a deep segmentation network on different crops and benchmarking the resulting navigation.
arXiv Detail & Related papers (2023-06-27T14:46:09Z)
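As a rough illustration of the automatic parametric field generation mentioned above, a minimal sketch follows (plain Python); the layout model and all parameter values are assumptions for illustration, not the paper's actual generator.

```python
# Hypothetical sketch of parametric row-based field generation: given row spacing,
# row length, plant spacing, and a small random jitter, emit 2D plant positions
# that a simulator could use to instantiate vine rows.
import random

def generate_field(n_rows=6, row_length=30.0, row_spacing=2.5,
                   plant_spacing=1.2, jitter=0.05, seed=0):
    rng = random.Random(seed)
    plants = []
    for row in range(n_rows):
        y = row * row_spacing
        x = 0.0
        while x <= row_length:
            plants.append((x + rng.uniform(-jitter, jitter),
                           y + rng.uniform(-jitter, jitter)))
            x += plant_spacing
    return plants

field = generate_field()
print(len(field), "plants, first:", field[0])
```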
- Vision-based Vineyard Navigation Solution with Automatic Annotation [2.6013566739979463]
We introduce a vision-based autonomous navigation framework for agricultural robots in trellised cropping systems such as vineyards.
We propose a novel learning-based method to estimate the path traversability heatmap directly from an RGB-D image.
A trained path detection model was used to develop a full navigation framework consisting of row tracking and row switching modules.
arXiv Detail & Related papers (2023-03-25T03:37:17Z)
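A minimal sketch of how a predicted traversability heatmap could be converted into a steering command, in the spirit of the paper above (NumPy); the lookahead band, the gain, and the random heatmap standing in for a model prediction are all assumptions.

```python
# Hypothetical post-processing of a traversability heatmap (H x W, values in [0, 1]):
# take a horizontal band some distance ahead, find the most traversable column,
# and convert its offset from the image center into a proportional steering command.
import numpy as np

def steering_from_heatmap(heatmap: np.ndarray, band=(0.5, 0.7), gain=1.5) -> float:
    h, w = heatmap.shape
    rows = slice(int(band[0] * h), int(band[1] * h))   # lookahead band
    column_scores = heatmap[rows].mean(axis=0)          # traversability per column
    best_col = int(np.argmax(column_scores))
    offset = (best_col - w / 2) / (w / 2)                # normalized to [-1, 1]
    return -gain * offset                                # steer toward the best column

heatmap = np.random.rand(240, 320)  # stand-in for a network prediction
print("angular velocity command:", steering_from_heatmap(heatmap))
```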
- Autonomous Aerial Robot for High-Speed Search and Intercept Applications [86.72321289033562]
A fully-autonomous aerial robot for high-speed object grasping has been proposed.
As an additional sub-task, our system is able to autonomously pierce balloons located on poles close to the surface.
Our approach has been validated in a challenging international competition and has shown outstanding results.
arXiv Detail & Related papers (2021-12-10T11:49:51Z)
- A Deep Learning Driven Algorithmic Pipeline for Autonomous Navigation in Row-Based Crops [38.4971490647654]
We present a complete algorithmic pipeline for autonomous navigation in row-based crops, specifically designed to cope with low-range sensors and seasonal variations.
We build on a robust data-driven methodology to generate a viable path for the autonomous machine, covering the full extent of the crop using only the occupancy grid map of the field.
arXiv Detail & Related papers (2021-12-07T16:46:17Z)
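To make the occupancy-grid-only path generation above concrete, here is a small sketch under assumed simplifications (NumPy); corridor detection by per-row occupancy and boustrophedon ordering are illustrative choices, not the authors' method.

```python
# Hypothetical path generation from an occupancy grid of a row-based field:
# detect free corridors between occupied rows, then visit them boustrophedon-style
# so the whole crop is covered using only the grid itself.
import numpy as np

def corridor_path(grid: np.ndarray, resolution: float = 0.1):
    occupancy_per_row = grid.mean(axis=1)       # fraction of occupied cells per grid row
    free = occupancy_per_row < 0.1              # rows that look like open corridors
    # group consecutive free rows into corridors and keep their center indices
    corridors, start = [], None
    for i, f in enumerate(free):
        if f and start is None:
            start = i
        elif not f and start is not None:
            corridors.append((start + i - 1) // 2)
            start = None
    if start is not None:
        corridors.append((start + len(free) - 1) // 2)
    # boustrophedon ordering: alternate traversal direction corridor by corridor
    path = []
    for k, r in enumerate(corridors):
        xs = range(grid.shape[1]) if k % 2 == 0 else range(grid.shape[1] - 1, -1, -1)
        path.extend((x * resolution, r * resolution) for x in xs)
    return path

grid = np.zeros((40, 100), dtype=np.uint8)
grid[10:13, :] = 1
grid[25:28, :] = 1                              # two occupied vine rows
print(len(corridor_path(grid)), "waypoints")
```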
- Learning High-Speed Flight in the Wild [101.33104268902208]
We propose an end-to-end approach that can autonomously fly quadrotors through complex natural and man-made environments at high speeds.
The key principle is to directly map noisy sensory observations to collision-free trajectories in a receding-horizon fashion.
By simulating realistic sensor noise, our approach achieves zero-shot transfer from simulation to challenging real-world environments.
arXiv Detail & Related papers (2021-10-11T09:43:11Z)
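The simulated sensor noise that enables the zero-shot transfer described above could, in its simplest form, look like the following sketch (NumPy); the noise model and its parameters are assumptions chosen for illustration, not the authors' exact model.

```python
# Hypothetical depth-noise model for sim-to-real transfer: add distance-dependent
# Gaussian noise, drop random pixels to mimic stereo-matching holes, and clip to
# the sensor range, so the policy never trains on perfectly clean depth.
import numpy as np

def corrupt_depth(depth_m: np.ndarray, rng: np.random.Generator,
                  noise_frac=0.02, hole_prob=0.01, max_range=10.0) -> np.ndarray:
    noisy = depth_m + rng.normal(0.0, noise_frac * depth_m)  # error grows with distance
    holes = rng.random(depth_m.shape) < hole_prob             # missing measurements
    noisy[holes] = 0.0                                         # 0 = invalid, as many sensors report
    return np.clip(noisy, 0.0, max_range)

rng = np.random.default_rng(0)
clean = np.full((120, 160), 3.0)   # stand-in for a rendered depth frame (meters)
print(corrupt_depth(clean, rng).mean())
```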
- ReLMM: Practical RL for Learning Mobile Manipulation Skills Using Only Onboard Sensors [64.2809875343854]
We study how robots can autonomously learn skills that require a combination of navigation and grasping.
Our system, ReLMM, can learn continuously on a real-world platform without any environment instrumentation.
After a grasp curriculum training phase, ReLMM can learn navigation and grasping together fully automatically, in around 40 hours of real-world training.
arXiv Detail & Related papers (2021-07-28T17:59:41Z)
- Deep Semantic Segmentation at the Edge for Autonomous Navigation in Vineyard Rows [0.0]
Precision agriculture aims at introducing affordable and effective automation into agricultural processes.
The proposed control leverages the latest advancements in machine perception and edge AI techniques to achieve highly affordable and reliable navigation inside vineyard rows.
The segmentation maps generated by the control algorithm itself can be directly exploited as filters for a vegetative assessment of the crop status.
arXiv Detail & Related papers (2021-07-01T18:51:58Z)
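A minimal sketch of using a segmentation map as a vegetation filter, as suggested in the paper above (NumPy); the canopy class id and the choice of the excess-green index are assumptions, not the paper's exact assessment procedure.

```python
# Hypothetical use of a segmentation map as a vegetation filter: compute the
# excess-green index (ExG) only on pixels labeled as canopy, giving a quick
# per-frame proxy of vegetative status.
import numpy as np

def canopy_exg(rgb: np.ndarray, seg: np.ndarray, canopy_class: int = 1) -> float:
    mask = seg == canopy_class
    if not mask.any():
        return float("nan")
    r, g, b = (rgb[..., i].astype(np.float32) / 255.0 for i in range(3))
    exg = 2.0 * g - r - b                   # excess-green vegetation index
    return float(exg[mask].mean())

rgb = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)  # stand-in camera frame
seg = np.random.randint(0, 2, (240, 320), dtype=np.uint8)       # stand-in segmentation output
print("mean ExG over canopy pixels:", canopy_exg(rgb, seg))
```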
- Simultaneous Navigation and Construction Benchmarking Environments [73.0706832393065]
We need intelligent robots for mobile construction, the process of navigating in an environment and modifying its structure according to a geometric design.
In this task, a major robot vision and learning challenge is how to accurately realize the design without GPS.
We benchmark the performance of a handcrafted policy with basic localization and planning, and state-of-the-art deep reinforcement learning methods.
arXiv Detail & Related papers (2021-03-31T00:05:54Z)
- Task-relevant Representation Learning for Networked Robotic Perception [74.0215744125845]
This paper presents an algorithm to learn task-relevant representations of sensory data that are co-designed with a pre-trained robotic perception model's ultimate objective.
Our algorithm aggressively compresses robotic sensory data by up to 11x more than competing methods.
arXiv Detail & Related papers (2020-11-06T07:39:08Z)
- Local Motion Planner for Autonomous Navigation in Vineyards with a RGB-D Camera-Based Algorithm and Deep Learning Synergy [1.0312968200748118]
This study presents a low-cost local motion planner for autonomous navigation in vineyards.
The first algorithm exploits the disparity map and its depth representation to generate a proportional control for the robotic platform.
A second back-up algorithm, based on representation learning and resilient to illumination variations, can take control of the machine in case of a momentary failure of the first block.
arXiv Detail & Related papers (2020-05-26T15:47:42Z)
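A compact sketch of the kind of proportional control that can be derived directly from a depth or disparity map, as in the first block of the paper above (NumPy); the free-space threshold and gains are illustrative assumptions, not the paper's values.

```python
# Hypothetical proportional controller from a depth map: mark far-away pixels as
# free space, find the horizontal centroid of that free space, and steer
# proportionally to its offset from the image center while moving forward.
import numpy as np

def proportional_cmd(depth_m: np.ndarray, free_thresh=2.0, k_ang=1.0, v_lin=0.4):
    free = depth_m > free_thresh               # pixels with enough clearance
    if not free.any():
        return 0.0, 0.0                        # stop if no free space ahead
    cols = np.nonzero(free)[1]                 # column index of each free pixel
    offset = (cols.mean() - depth_m.shape[1] / 2) / (depth_m.shape[1] / 2)
    return v_lin, -k_ang * float(offset)       # (linear, angular) velocity

depth = np.full((120, 160), 3.0)               # stand-in depth frame (meters)
depth[:, :40] = 0.8                            # obstacle on the left
print(proportional_cmd(depth))                 # steers toward the open side
```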
- Deep Learning based Pedestrian Inertial Navigation: Methods, Dataset and On-Device Inference [49.88536971774444]
Inertial measurement units (IMUs) are small, cheap, energy efficient, and widely employed in smart devices and mobile robots.
Exploiting inertial data to support accurate and reliable pedestrian navigation is a key component of emerging Internet-of-Things applications and services.
We present and release the Oxford Inertial Odometry dataset (OxIOD), a first-of-its-kind public dataset for deep learning based inertial navigation research.
arXiv Detail & Related papers (2020-01-13T04:41:54Z)