Synthesis and Execution of Communicative Robotic Movements with
Generative Adversarial Networks
- URL: http://arxiv.org/abs/2203.15640v2
- Date: Thu, 31 Mar 2022 11:22:50 GMT
- Title: Synthesis and Execution of Communicative Robotic Movements with
Generative Adversarial Networks
- Authors: Luca Garello, Linda Lastrico, Alessandra Sciutti, Nicoletta Noceti,
Fulvio Mastrogiovanni and Francesco Rea
- Abstract summary: We focus on how to transfer to two different robotic platforms the same kinematics modulation that humans adopt when manipulating delicate objects.
We choose to modulate the velocity profile adopted by the robots' end-effector, inspired by what humans do when transporting objects with different characteristics.
We exploit a novel Generative Adversarial Network architecture, trained with human kinematics examples, to generalize over them and generate new and meaningful velocity profiles.
- Score: 59.098560311521034
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Object manipulation is a natural activity we perform every day. How humans
handle objects can communicate not only the intentionality of the action, or key
aspects of the context in which we operate, but also the properties of the objects
involved, without any need for explicit verbal description. Since human
intelligence comprises the ability to read the context, allowing robots to
perform actions that intuitively convey this kind of information would greatly
facilitate collaboration. In this work, we focus on how to transfer to two
different robotic platforms the same kinematics modulation that humans adopt
when manipulating delicate objects, aiming to endow robots with the capability
to show carefulness in their movements. We choose to modulate the velocity
profile adopted by the robots' end-effector, inspired by what humans do when
transporting objects with different characteristics. We exploit a novel
Generative Adversarial Network architecture, trained with human kinematics
examples, to generalize over them and generate new and meaningful velocity
profiles, associated with either a careful or a not careful attitude. This
approach would allow next-generation robots to select the most appropriate
style of movement, depending on the perceived context, and to autonomously
generate the execution of their motor actions.
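The abstract does not detail the network itself, but a conditional GAN is one plausible reading of "generate new and meaningful velocity profiles" associated with careful or not careful attitudes. Below is a minimal PyTorch sketch of such a setup; the profile length T, the latent size, the layer widths, and the label encoding (0 = careful) are all illustrative assumptions, not the authors' architecture.

```python
# Minimal sketch (not the paper's architecture): a conditional GAN that
# generates 1D end-effector velocity profiles conditioned on a binary
# "careful" / "not careful" style label. Sizes and names are assumptions.
import torch
import torch.nn as nn

T = 100          # assumed number of velocity samples per profile
Z_DIM = 16       # assumed latent dimension
N_STYLES = 2     # careful vs. not careful

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_STYLES, 8)  # learned style embedding
        self.net = nn.Sequential(
            nn.Linear(Z_DIM + 8, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, T), nn.Softplus(),   # velocities are non-negative
        )

    def forward(self, z, style):
        # Concatenate noise with the style embedding, emit a velocity profile.
        return self.net(torch.cat([z, self.embed(style)], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_STYLES, 8)
        self.net = nn.Sequential(
            nn.Linear(T + 8, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1),                  # real/fake logit
        )

    def forward(self, v, style):
        # Judge a (profile, style) pair against recorded human examples.
        return self.net(torch.cat([v, self.embed(style)], dim=1))

# Usage: sample one "careful" velocity profile for the robot's end-effector.
G = Generator()
z = torch.randn(1, Z_DIM)
careful = torch.tensor([0])   # assumed encoding: 0 = careful, 1 = not careful
profile = G(z, careful)      # shape (1, T): one synthetic velocity profile
```

Training would follow the standard adversarial objective, with the discriminator scoring (profile, style) pairs against human kinematics examples; the Softplus output head simply encodes the prior that speed profiles are non-negative.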
Related papers
- EMOTION: Expressive Motion Sequence Generation for Humanoid Robots with In-Context Learning [10.266351600604612]
This paper introduces a framework, called EMOTION, for generating expressive motion sequences in humanoid robots.
We conduct online user studies comparing the naturalness and understandability of the motions generated by EMOTION and its human-feedback version, EMOTION++.
arXiv Detail & Related papers (2024-10-30T17:22:45Z)
- Built Different: Tactile Perception to Overcome Cross-Embodiment Capability Differences in Collaborative Manipulation [1.9048510647598207]
Tactile sensing is a powerful means of implicit communication between a human and a robot assistant.
In this paper, we investigate how tactile sensing can transcend cross-embodiment differences across robotic systems.
We show how our method can enable a cooperative task where a robot and human must work together to maneuver objects through space.
arXiv Detail & Related papers (2024-09-23T10:45:41Z)
- Zero-Shot Robot Manipulation from Passive Human Videos [59.193076151832145]
We develop a framework for extracting agent-agnostic action representations from human videos.
Our framework is based on predicting plausible human hand trajectories.
We deploy the trained model zero-shot for physical robot manipulation tasks.
arXiv Detail & Related papers (2023-02-03T21:39:52Z)
- IMoS: Intent-Driven Full-Body Motion Synthesis for Human-Object Interactions [69.95820880360345]
We present the first framework to synthesize the full-body motion of virtual human characters with 3D objects placed within their reach.
Our system takes as input textual instructions specifying the objects and the associated intentions of the virtual characters.
We show that our synthesized full-body motions appear more realistic to the participants in more than 80% of scenarios.
arXiv Detail & Related papers (2022-12-14T23:59:24Z)
- Robots with Different Embodiments Can Express and Influence Carefulness in Object Manipulation [104.5440430194206]
This work investigates the perception of object manipulations performed with a communicative intent by two robots.
We designed the robots' movements to communicate carefulness or not during the transportation of objects.
arXiv Detail & Related papers (2022-08-03T13:26:52Z)
- Property-Aware Robot Object Manipulation: a Generative Approach [57.70237375696411]
In this work, we focus on how to generate robot motion adapted to the hidden properties of the manipulated objects.
We explore the possibility of leveraging Generative Adversarial Networks to synthesize new actions coherent with the properties of the object.
Our results show that Generative Adversarial Nets can be a powerful tool for the generation of novel and meaningful transportation actions.
arXiv Detail & Related papers (2021-06-08T14:15:36Z)
- Learning Bipedal Robot Locomotion from Human Movement [0.791553652441325]
We present a reinforcement learning-based method for teaching a real-world bipedal robot to perform movements directly from motion capture data.
Our method seamlessly transitions from training in a simulation environment to executing on a physical robot.
We demonstrate our method on an internally developed humanoid robot with movements ranging from a dynamic walk cycle to complex balancing and waving.
arXiv Detail & Related papers (2021-05-26T00:49:37Z)
- Careful with That! Observation of Human Movements to Estimate Objects Properties [106.925705883949]
We focus on the features of human motor actions that communicate insights on the weight of an object.
Our final goal is to enable a robot to autonomously infer the degree of care required in object handling.
arXiv Detail & Related papers (2021-03-02T08:14:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.