Fine Robotic Manipulation without Force/Torque Sensor
- URL: http://arxiv.org/abs/2301.13413v2
- Date: Tue, 5 Mar 2024 08:17:05 GMT
- Title: Fine Robotic Manipulation without Force/Torque Sensor
- Authors: Shilin Shan, Quang-Cuong Pham
- Abstract summary: A typical 6-axis Force/Torque (F/T) sensor is mounted between the robot's wrist and the end-effector in order to measure the forces and torques exerted by the environment onto the robot (the external wrench).
Although a typical 6-axis F/T sensor can provide highly accurate measurements, it is expensive and vulnerable to drift and external impacts.
Existing methods aiming at estimating the external wrench using only the robot's internal signals are limited in scope.
We present a Neural Network based method and argue that by devoting particular attention to the training data structure, it is possible to accurately estimate the external wrench in a wide range of scenarios based solely on internal signals.
- Score: 9.908082209033612
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Force Sensing and Force Control are essential to many industrial
applications. Typically, a 6-axis Force/Torque (F/T) sensor is mounted between
the robot's wrist and the end-effector in order to measure the forces and
torques exerted by the environment onto the robot (the external wrench).
Although a typical 6-axis F/T sensor can provide highly accurate measurements,
it is expensive and vulnerable to drift and external impacts. Existing methods
aiming at estimating the external wrench using only the robot's internal
signals are limited in scope: for example, wrench estimation accuracy was
mostly validated in free-space motions and simple contacts as opposed to tasks
like assembly that require high-precision force control. Here we present a
Neural Network based method and argue that by devoting particular attention to
the training data structure, it is possible to accurately estimate the external
wrench in a wide range of scenarios based solely on internal signals. As an
illustration, we demonstrate a pin insertion experiment with 100-micron
clearance and a hand-guiding experiment, both performed without external F/T
sensors or joint torque sensors. Our result opens the possibility of equipping
the existing 2.7 million industrial robots with Force Sensing and Force Control
capabilities without any additional hardware.
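The abstract does not describe the network in detail, so the following is only a minimal sketch of the general idea in PyTorch: regressing a 6-D external wrench from the robot's internal signals. The input feature layout (joint positions, velocities, and motor currents), the network size, and the training loop are assumptions made for illustration, not the authors' architecture.

import torch
import torch.nn as nn

# Hypothetical feature layout for a 6-DoF arm: joint positions,
# velocities, and motor currents (18 inputs) -> 6-D wrench (Fx..Tz).
N_JOINTS = 6
IN_DIM = 3 * N_JOINTS
OUT_DIM = 6

model = nn.Sequential(
    nn.Linear(IN_DIM, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, OUT_DIM),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(signals: torch.Tensor, wrench_gt: torch.Tensor) -> float:
    """One supervised step: internal signals -> ground-truth wrench
    (e.g. recorded with an F/T sensor during data collection only)."""
    optimizer.zero_grad()
    pred = model(signals)
    loss = loss_fn(pred, wrench_gt)
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch, just to show the expected tensor shapes.
signals = torch.randn(32, IN_DIM)
wrench_gt = torch.randn(32, OUT_DIM)
print(train_step(signals, wrench_gt))

The paper's central claim concerns how the training data are structured rather than the regressor itself, so any comparably small model could stand in here.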
Related papers
- Taccel: Scaling Up Vision-based Tactile Robotics via High-performance GPU Simulation [50.34179054785646]
We present Taccel, a high-performance simulation platform that integrates IPC and ABD to model robots, tactile sensors, and objects with both accuracy and unprecedented speed.
Taccel provides precise physics simulation and realistic tactile signals while supporting flexible robot-sensor configurations through user-friendly APIs.
These capabilities position Taccel as a powerful tool for scaling up tactile robotics research and development.
arXiv Detail & Related papers (2025-04-17T12:57:11Z)
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Sensor Deprivation Attacks for Stealthy UAV Manipulation [51.9034385791934]
Unmanned Aerial Vehicles autonomously perform tasks using state-of-the-art control algorithms.
In this work, we propose multi-part Sensor Deprivation Attacks (SDAs), aiming to stealthily impact process control via sensor reconfiguration.
arXiv Detail & Related papers (2024-10-14T23:03:58Z)
- Proprioceptive External Torque Learning for Floating Base Robot and its Applications to Humanoid Locomotion [17.384713355349476]
This paper introduces a method for learning external joint torque solely using proprioceptive sensors (encoders and IMUs) for a floating base robot.
Real robot experiments demonstrate that the network can estimate the external torque and contact wrench with significantly smaller errors.
The study also validates that the estimated contact wrench can be utilized for zero moment point (ZMP) feedback control (a generic ZMP sketch is given after this list).
arXiv Detail & Related papers (2023-09-08T05:33:56Z)
- UltraGlove: Hand Pose Estimation with Mems-Ultrasonic Sensors [14.257535961674021]
We propose a novel and low-cost hand-tracking glove that utilizes several MEMS-ultrasonic sensors attached to the fingers.
Our experimental results demonstrate that this approach is accurate, size-agnostic, and robust to external interference.
arXiv Detail & Related papers (2023-06-22T03:41:47Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- 6N-DoF Pose Tracking for Tensegrity Robots [5.398092221687385]
Tensegrity robots are composed of rigid compressive elements (rods) and flexible tensile elements (e.g., cables).
This work aims to address the pose tracking of tensegrity robots through a markerless, vision-based method.
An iterative optimization process is proposed to estimate the 6-DoF poses of each rigid element of a tensegrity robot from an RGB-D video.
arXiv Detail & Related papers (2022-05-29T20:55:29Z)
- Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
arXiv Detail & Related papers (2022-02-15T17:38:30Z)
- AuraSense: Robot Collision Avoidance by Full Surface Proximity Detection [3.9770080498150224]
AuraSense is the first system to realize no-dead-spot proximity sensing for robot arms.
It requires only a single pair of piezoelectric transducers, and can easily be applied to off-the-shelf robots.
arXiv Detail & Related papers (2021-08-10T18:37:54Z)
- Domain and Modality Gaps for LiDAR-based Person Detection on Mobile Robots [91.01747068273666]
This paper studies existing LiDAR-based person detectors with a particular focus on mobile robot scenarios.
Experiments revolve around the domain gap between driving and mobile robot scenarios, as well as the modality gap between 3D and 2D LiDAR sensors.
Results provide practical insights into LiDAR-based person detection and facilitate informed decisions for relevant mobile robot designs and applications.
arXiv Detail & Related papers (2021-06-21T16:35:49Z)
- DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation [16.54834671357377]
General purpose in-hand manipulation remains one of the unsolved challenges of robotics.
We introduce DIGIT, an inexpensive, compact, and high-resolution tactile sensor geared towards in-hand manipulation.
arXiv Detail & Related papers (2020-05-29T17:07:54Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
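As noted in the Proprioceptive External Torque Learning entry above, an estimated contact wrench can drive ZMP feedback control. The snippet below is a generic sketch of the standard ZMP-from-wrench relation on a flat ground plane, not code from that paper; the function name and frame conventions are assumptions made for illustration.

import numpy as np

def zmp_from_wrench(force, torque, eps=1e-6):
    """Zero Moment Point on the ground plane (z = 0) for a contact wrench
    (force, torque) expressed at the ground-frame origin.

    Standard relation: p_x = -tau_y / f_z, p_y = tau_x / f_z.
    Returns None when the vertical force is too small to define a ZMP.
    """
    fz = force[2]
    if abs(fz) < eps:
        return None
    return np.array([-torque[1] / fz, torque[0] / fz])

# Example: a mostly vertical load with small moments about x and y.
print(zmp_from_wrench(np.array([0.0, 0.0, 400.0]),
                      np.array([8.0, -12.0, 0.0])))
# -> [0.03 0.02] (metres); a balance controller compares this against a
#    reference ZMP inside the support polygon.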
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.