Gazebo Plants: Simulating Plant-Robot Interaction with Cosserat Rods
- URL: http://arxiv.org/abs/2402.02570v1
- Date: Sun, 4 Feb 2024 17:19:46 GMT
- Title: Gazebo Plants: Simulating Plant-Robot Interaction with Cosserat Rods
- Authors: Junchen Deng, Samhita Marri, Jonathan Klein, Wojtek Pałubicki, Sören Pirk, Girish Chowdhary, and Dominik L. Michels
- Abstract summary: We present a plugin for the Gazebo simulation platform based on Cosserat rods to model plant motion.
We demonstrate that, using our plugin, users can conduct harvesting simulations in Gazebo by simulating a robotic arm picking fruits.
- Score: 11.379848739344814
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Robotic harvesting has the potential to positively impact agricultural
productivity, reduce costs, improve food quality, enhance sustainability, and
address labor shortages. In the rapidly advancing field of agricultural
robotics, training robots in a virtual environment has become essential.
Generating training data to automate the underlying computer vision tasks, such
as image segmentation, object detection, and classification, also relies heavily
on such virtual environments, as synthetic data is often required to overcome
the shortage and lack of variety of real data sets.
However, physics engines commonly employed within the robotics community, such
as ODE, Simbody, Bullet, and DART, primarily support motion and collision
interaction of rigid bodies. This inherent limitation hinders experimentation
and progress in handling non-rigid objects such as plants and crops. In this
contribution, we present a plugin for the Gazebo simulation platform based on
Cosserat rods to model plant motion. It enables the simulation of plants and
their interaction with the environment. We demonstrate that, using our plugin,
users can conduct harvesting simulations in Gazebo by simulating a robotic arm
picking fruits and achieve results comparable to real-world experiments.
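The plugin's core idea is to represent a plant's branches as special Cosserat rods, i.e. deformable one-dimensional continua that carry both position and orientation along their centerline. As a schematic reference (not necessarily the exact formulation or discretization used in the paper), the Cosserat rod balance laws can be written as follows:

```latex
% Schematic special Cosserat rod balance laws; frame bookkeeping omitted.
% r(s,t): centerline, n, m: internal force and moment, f, c: external force
% and couple per unit length, rho A / rho I: line density and rotational inertia.
\[
  \rho A \,\partial_t^2 \mathbf{r} = \partial_s \mathbf{n} + \mathbf{f},
  \qquad
  \partial_t\!\big(\rho \mathbf{I}\,\boldsymbol{\omega}\big)
    = \partial_s \mathbf{m} + \partial_s \mathbf{r} \times \mathbf{n} + \mathbf{c},
\]
% with constitutive laws relating (n, m) to stretch/shear and bend/twist strains.
```

On the Gazebo side, functionality of this kind is typically packaged as a model plugin that hooks into the physics update loop. The sketch below shows how such a plugin could be structured with Gazebo Classic's C++ API; the class name CosseratPlantPlugin, the link name "stem", and the rod-solver call are hypothetical placeholders, not the authors' actual implementation.

```cpp
// Minimal sketch of a Gazebo Classic model plugin that couples an external
// Cosserat-rod solver to the simulation loop (names are illustrative only).
#include <functional>
#include <gazebo/gazebo.hh>
#include <gazebo/physics/physics.hh>
#include <gazebo/common/common.hh>
#include <ignition/math/Vector3.hh>

namespace gazebo
{
  class CosseratPlantPlugin : public ModelPlugin
  {
  public:
    void Load(physics::ModelPtr _model, sdf::ElementPtr /*_sdf*/) override
    {
      this->model = _model;
      // Link of the plant model driven by the rod solver (assumed name).
      this->stemLink = _model->GetLink("stem");

      // Run the rod update at every physics step.
      this->updateConnection = event::Events::ConnectWorldUpdateBegin(
          std::bind(&CosseratPlantPlugin::OnUpdate, this));
    }

  private:
    void OnUpdate()
    {
      // 1) Advance the (hypothetical) rod solver by one step, using the
      //    current link pose as a boundary condition, e.g.:
      //    auto restoringForce = rodSolver.Step(this->stemLink->WorldPose(), dt);
      // 2) Feed the resulting restoring force back into the rigid-body world.
      ignition::math::Vector3d restoringForce(0.0, 0.0, 0.0);
      if (this->stemLink)
        this->stemLink->AddForce(restoringForce);
    }

    physics::ModelPtr model;
    physics::LinkPtr stemLink;
    event::ConnectionPtr updateConnection;
  };

  // Register the plugin so it can be referenced from an SDF <plugin> element.
  GZ_REGISTER_MODEL_PLUGIN(CosseratPlantPlugin)
}
```

In an SDF model, the compiled library would then be attached to the plant via a <plugin> element, so that every world update advances the rod state and applies the resulting forces to the rigid-body simulation that the robot arm interacts with.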
Related papers
- SimPRIVE: a Simulation framework for Physical Robot Interaction with Virtual Environments [4.966661313606916]
This paper presents SimPRIVE, a simulation framework for physical robot interaction with virtual environments.
Using SimPRIVE, any physical mobile robot running on ROS 2 can easily be configured to move its digital twin in a virtual world built with the Unreal Engine 5 graphics engine.
The framework has been validated by testing a reinforcement learning agent trained for obstacle avoidance on an AgileX Scout Mini rover.
arXiv Detail & Related papers (2025-04-30T09:22:55Z)
- Unreal Robotics Lab: A High-Fidelity Robotics Simulator with Advanced Physics and Rendering [4.760567755149477]
This paper presents a novel simulation framework that integrates the Unreal Engine's advanced rendering capabilities with MuJoCo's high-precision physics simulation.
Our approach enables realistic robotic perception while maintaining accurate physical interactions.
We benchmark visual navigation and SLAM methods within our framework, demonstrating its utility for testing real-world robustness in controlled yet diverse scenarios.
arXiv Detail & Related papers (2025-04-19T01:54:45Z)
- Taccel: Scaling Up Vision-based Tactile Robotics via High-performance GPU Simulation [50.34179054785646]
We present Taccel, a high-performance simulation platform that integrates IPC and ABD to model robots, tactile sensors, and objects with both accuracy and unprecedented speed.
Taccel provides precise physics simulation and realistic tactile signals while supporting flexible robot-sensor configurations through user-friendly APIs.
These capabilities position Taccel as a powerful tool for scaling up tactile robotics research and development.
arXiv Detail & Related papers (2025-04-17T12:57:11Z)
- Self-Supervised Data Generation for Precision Agriculture: Blending Simulated Environments with Real Imagery [3.9845810840390734]
In precision agriculture, the scarcity of labeled data poses unique challenges for training machine learning models.
We propose a novel system for generating realistic synthetic data to address these challenges.
We demonstrate considerable performance improvements in training a state-of-the-art detector by applying our method to table grape cultivation.
arXiv Detail & Related papers (2025-02-25T16:13:49Z)
- Physical Simulation for Multi-agent Multi-machine Tending [11.017120167486448]
Reinforcement learning (RL) offers a promising solution where robots can learn through interaction with the environment.
We leveraged a simple robotic system to work with RL on "real" data without having to deploy large, expensive robots in a manufacturing setting.
arXiv Detail & Related papers (2024-10-11T17:57:44Z)
- Polaris: Open-ended Interactive Robotic Manipulation via Syn2Real Visual Grounding and Large Language Models [53.22792173053473]
We introduce an interactive robotic manipulation framework called Polaris.
Polaris integrates perception and interaction by utilizing GPT-4 alongside grounded vision models.
We propose a novel Synthetic-to-Real (Syn2Real) pose estimation pipeline.
arXiv Detail & Related papers (2024-08-15T06:40:38Z)
- RoboCasa: Large-Scale Simulation of Everyday Tasks for Generalist Robots [25.650235551519952]
We present RoboCasa, a large-scale simulation framework for training generalist robots in everyday environments.
We provide thousands of 3D assets across over 150 object categories and dozens of interactable furniture and appliances.
Our experiments show a clear scaling trend in using synthetically generated robot data for large-scale imitation learning.
arXiv Detail & Related papers (2024-06-04T17:41:31Z)
- RoboScript: Code Generation for Free-Form Manipulation Tasks across Real and Simulation [77.41969287400977]
This paper presents RoboScript, a platform for a deployable robot manipulation pipeline powered by code generation.
We also present a benchmark for code generation for robot manipulation tasks specified in free-form natural language.
We demonstrate the adaptability of our code generation framework across multiple robot embodiments, including the Franka and UR5 robot arms.
arXiv Detail & Related papers (2024-02-22T15:12:00Z)
- Learning to navigate efficiently and precisely in real environments [14.52507964172957]
The embodied AI literature focuses on end-to-end agents trained in simulators such as Habitat or AI2-THOR.
In this work we explore end-to-end training of agents in simulation in settings which minimize the sim2real gap.
arXiv Detail & Related papers (2024-01-25T17:50:05Z)
- Learning Human-to-Robot Handovers from Point Clouds [63.18127198174958]
We propose the first framework to learn control policies for vision-based human-to-robot handovers.
We show significant performance gains over baselines on a simulation benchmark, sim-to-sim transfer and sim-to-real transfer.
arXiv Detail & Related papers (2023-03-30T17:58:36Z)
- Point Cloud Based Reinforcement Learning for Sim-to-Real and Partial Observability in Visual Navigation [62.22058066456076]
Reinforcement Learning (RL) provides powerful tools for solving complex robotic tasks.
However, RL policies trained in simulation do not work directly in the real world, which is known as the sim-to-real transfer problem.
We propose a method that learns on an observation space constructed by point clouds and environment randomization.
arXiv Detail & Related papers (2020-07-27T17:46:59Z)
- RoboTHOR: An Open Simulation-to-Real Embodied AI Platform [56.50243383294621]
We introduce RoboTHOR to democratize research in interactive and embodied visual AI.
We show that there is a significant gap between the performance of models trained in simulation when they are tested in simulation and when they are tested in carefully constructed physical analogs.
arXiv Detail & Related papers (2020-04-14T20:52:49Z)
- SAPIEN: A SimulAted Part-based Interactive ENvironment [77.4739790629284]
SAPIEN is a realistic and physics-rich simulated environment that hosts a large-scale set of articulated objects.
We evaluate state-of-the-art vision algorithms for part detection and motion attribute recognition as well as demonstrate robotic interaction tasks.
arXiv Detail & Related papers (2020-03-19T00:11:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all of the above) and is not responsible for any consequences of its use.