Detection and Physical Interaction with Deformable Linear Objects
- URL: http://arxiv.org/abs/2205.08041v2
- Date: Sat, 8 Apr 2023 21:18:38 GMT
- Title: Detection and Physical Interaction with Deformable Linear Objects
- Authors: Azarakhsh Keipour, Mohammadreza Mousaei, Maryam Bandari, Stefan
Schaal, Sebastian Scherer
- Abstract summary: Deformable linear objects (e.g., cables, ropes, and threads) commonly appear in our everyday lives.
There have already been successful methods to model and track deformable linear objects.
We present our work on using the method for tasks such as routing and manipulation with ground and aerial robots.
- Score: 10.707804359932604
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deformable linear objects (e.g., cables, ropes, and threads) commonly appear
in our everyday lives. However, perceiving these objects and studying physical
interaction with them is still a growing area. There are already successful
methods to model and track deformable linear objects. However, few methods can
automatically extract, in non-trivial situations, the initial conditions these
trackers require, and such methods have been introduced to the community only
recently. Moreover, while physical interaction with these objects has been
demonstrated with ground manipulators, there have been no studies on physical
interaction with and manipulation of deformable linear objects using aerial
robots.
This workshop paper describes our recent work on detecting deformable linear
objects, which uses the segmentation output of existing methods to automatically
provide the initialization required by tracking methods. It handles crossings,
can fill gaps and occlusions in the segmentation, and outputs a model suitable
for physical interaction and simulation. We then present our work on using this
method for tasks such as routing and manipulation with ground and aerial robots,
and we discuss our feasibility analysis on extending physical interaction with
these objects to aerial manipulation applications.
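As a rough, illustrative sketch of the kind of model the detection step produces (a chain of fixed-length cylindrical segments connected by passive spherical joints, as described in the "Deformable One-Dimensional Object Detection" entry below), the following minimal Python example resamples an ordered 2D centerline, assumed to be already extracted from a segmentation mask, into equally spaced joint positions. This is not the authors' implementation: handling of crossings, gaps, and occlusions is omitted, and the function and variable names are hypothetical.

```python
# Illustrative sketch only (not the authors' implementation): turning an ordered
# centerline extracted from a DLO segmentation mask into a chain of fixed-length
# segments connected by passive spherical joints, usable to initialize a tracker
# or a simulation. All names are hypothetical.
import numpy as np

def centerline_to_chain(centerline_xy: np.ndarray, segment_length: float) -> np.ndarray:
    """Resample an ordered (N, 2) centerline into joint positions spaced
    `segment_length` apart along the curve's arc length."""
    deltas = np.diff(centerline_xy, axis=0)
    arc = np.concatenate([[0.0], np.cumsum(np.linalg.norm(deltas, axis=1))])
    n_segments = int(arc[-1] // segment_length)
    sample_s = np.arange(n_segments + 1) * segment_length
    # Linear interpolation of x and y as functions of arc length; for a smooth
    # curve, consecutive joints are then approximately segment_length apart.
    joints_x = np.interp(sample_s, arc, centerline_xy[:, 0])
    joints_y = np.interp(sample_s, arc, centerline_xy[:, 1])
    return np.stack([joints_x, joints_y], axis=1)  # (n_segments + 1, 2)

# Toy usage: a gently curved cable centerline (e.g., ordered skeleton pixels).
t = np.linspace(0.0, 1.0, 200)
centerline = np.stack([100.0 * t, 20.0 * np.sin(2.0 * np.pi * t)], axis=1)
joints = centerline_to_chain(centerline, segment_length=10.0)
print(joints.shape)  # each consecutive pair of joints bounds one cylindrical segment
```

In the full pipeline, such a chain would serve both as the automatic initialization for a tracking method and as an articulated model for physical interaction or simulation.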
Related papers
- Articulated Object Manipulation using Online Axis Estimation with SAM2-Based Tracking [59.87033229815062]
Articulated object manipulation requires precise object interaction, where the object's axis must be carefully considered.
Previous research employed interactive perception for manipulating articulated objects, but these open-loop approaches often overlook the interaction dynamics.
We present a closed-loop pipeline integrating interactive perception with online axis estimation from segmented 3D point clouds.
arXiv Detail & Related papers (2024-09-24T17:59:56Z) - Ins-HOI: Instance Aware Human-Object Interactions Recovery [44.02128629239429]
We propose an end-to-end Instance-aware Human-Object Interactions recovery (Ins-HOI) framework.
Ins-HOI supports instance-level reconstruction and provides reasonable and realistic invisible contact surfaces.
We collect a large-scale, high-fidelity 3D scan dataset, including 5.2k high-quality scans with real-world human-chair and hand-object interactions.
arXiv Detail & Related papers (2023-12-15T09:30:47Z) - Learning Extrinsic Dexterity with Parameterized Manipulation Primitives [8.7221770019454]
We learn a sequence of actions that utilize the environment to change the object's pose.
Our approach can control the object's state by exploiting interactions between the object, the gripper, and the environment.
We evaluate our approach on picking box-shaped objects with various weights, shapes, and friction properties from a constrained table-top workspace.
arXiv Detail & Related papers (2023-10-26T21:28:23Z) - Grasp Transfer based on Self-Aligning Implicit Representations of Local
Surfaces [10.602143478315861]
This work addresses the problem of transferring a grasp experience or a demonstration to a novel object that shares shape similarities with objects the robot has previously encountered.
We employ a single expert grasp demonstration to learn an implicit local surface representation model from a small dataset of object meshes.
At inference time, this model is used to transfer grasps to novel objects by identifying the most geometrically similar surfaces to the one on which the expert grasp is demonstrated.
arXiv Detail & Related papers (2023-08-15T14:33:17Z) - Efficient Representations of Object Geometry for Reinforcement Learning
of Interactive Grasping Policies [29.998917158604694]
We present a reinforcement learning framework that learns the interactive grasping of various geometrically distinct real-world objects.
Videos of learned interactive policies are available at https://maltemosbach.org/io/geometry_aware_grasping_policies.
arXiv Detail & Related papers (2022-11-20T11:47:33Z) - Discovering Objects that Can Move [55.743225595012966]
We study the problem of object discovery -- separating objects from the background without manual labels.
Existing approaches utilize appearance cues, such as color, texture, and location, to group pixels into object-like regions.
We choose to focus on dynamic objects -- entities that can move independently in the world.
arXiv Detail & Related papers (2022-03-18T21:13:56Z) - Deformable One-Dimensional Object Detection for Routing and Manipulation [8.860083597706502]
This paper proposes an approach for detecting deformable one-dimensional objects that handles crossings and occlusions.
Our algorithm takes an image containing a deformable object and outputs a chain of fixed-length cylindrical segments connected with passive spherical joints.
Our tests and experiments have shown that the method can correctly detect deformable one-dimensional objects in various complex conditions.
arXiv Detail & Related papers (2022-01-18T07:19:17Z) - A Bayesian Treatment of Real-to-Sim for Deformable Object Manipulation [59.29922697476789]
We propose a novel methodology for extracting state information from image sequences via a technique that represents the state of a deformable object as a distribution embedding.
Our experiments confirm that we can estimate posterior distributions of physical properties, such as the elasticity, friction, and scale of highly deformable objects like cloth and rope (a toy sketch of this general posterior-estimation idea appears after this list).
arXiv Detail & Related papers (2021-12-09T17:50:54Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Towards unconstrained joint hand-object reconstruction from RGB videos [81.97694449736414]
Reconstructing hand-object manipulations holds great potential for robotics and learning from human demonstrations.
We first propose a learning-free fitting approach for hand-object reconstruction which can seamlessly handle two-hand object interactions.
arXiv Detail & Related papers (2021-08-16T12:26:34Z) - Occlusion resistant learning of intuitive physics from videos [52.25308231683798]
A key ability for artificial systems is to understand physical interactions between objects and to predict future outcomes of a situation.
This ability, often referred to as intuitive physics, has recently received attention, and several methods have been proposed to learn these physical rules from video sequences.
arXiv Detail & Related papers (2020-04-30T19:35:54Z)
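As a toy, generic illustration of the real-to-sim idea in the "Bayesian Treatment of Real-to-Sim" entry above (estimating posterior distributions over physical properties from observations), the sketch below performs a simple grid-based Bayesian update over a hypothetical elasticity parameter under a uniform prior and Gaussian observation noise. It is not that paper's distribution-embedding technique, and all names and values are made up for illustration.

```python
# Toy, generic Bayesian posterior estimation of a physical property (e.g., an
# elasticity coefficient) from noisy observations. This is NOT the
# distribution-embedding method of the referenced paper; it only illustrates
# the general real-to-sim idea of maintaining a posterior over simulation
# parameters. All names and values are hypothetical.
import numpy as np

def grid_posterior(observations, candidate_elasticities, noise_std, simulate):
    """Return a normalized posterior over candidate elasticity values,
    assuming a uniform prior and Gaussian observation noise."""
    log_post = np.zeros(len(candidate_elasticities))
    for i, k in enumerate(candidate_elasticities):
        predicted = simulate(k)  # deterministic forward model for parameter k
        residuals = observations - predicted
        log_post[i] = -0.5 * np.sum((residuals / noise_std) ** 2)
    log_post -= log_post.max()  # numerical stability before exponentiation
    post = np.exp(log_post)
    return post / post.sum()

# Hypothetical forward model: observed sag of a rope decreases with elasticity.
simulate = lambda k: 1.0 / k
true_k, noise_std = 2.0, 0.05
rng = np.random.default_rng(0)
observations = simulate(true_k) + noise_std * rng.normal(size=20)
candidates = np.linspace(0.5, 5.0, 100)
posterior = grid_posterior(observations, candidates, noise_std, simulate)
print(candidates[np.argmax(posterior)])  # should be close to true_k
```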
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.