Bayesian and Neural Inference on LSTM-based Object Recognition from
Tactile and Kinesthetic Information
- URL: http://arxiv.org/abs/2306.06423v1
- Date: Sat, 10 Jun 2023 12:29:23 GMT
- Title: Bayesian and Neural Inference on LSTM-based Object Recognition from
Tactile and Kinesthetic Information
- Authors: Francisco Pastor (1), Jorge García-González (2), Juan M. Gandarias
(1), Daniel Medina (3), Pau Closas (4), Alfonso J. García-Cerezo (1),
Jesús M. Gómez-de-Gabriel (1) ((1) Robotics and Mechatronics Group,
University of Malaga, Spain, (2) Department of Computer Languages and
Computer Science, University of Malaga, Spain, (3) Institute of
Communications and Navigation, German Aerospace Center (DLR), Germany, (4)
Department of Electrical and Computer Engineering, Northeastern University,
Boston, USA)
- Abstract summary: Haptic perception encompasses the sensing modalities encountered in the sense of touch (e.g., tactile and kinesthetic sensations).
This letter focuses on multimodal object recognition and proposes analytical and data-driven methodologies to fuse tactile- and kinesthetic-based classification results.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in the field of intelligent robotic manipulation pursue
providing robotic hands with touch sensitivity. Haptic perception encompasses
the sensing modalities encountered in the sense of touch (e.g., tactile and
kinesthetic sensations). This letter focuses on multimodal object recognition
and proposes analytical and data-driven methodologies to fuse tactile- and
kinesthetic-based classification results. The procedure is as follows: a
three-finger actuated gripper with an integrated high-resolution tactile sensor
performs squeeze-and-release Exploratory Procedures (EPs). The tactile images
and kinesthetic information acquired using angular sensors on the finger joints
constitute the time-series datasets of interest. Each temporal dataset is fed
to a Long Short-term Memory (LSTM) Neural Network, which is trained to classify
in-hand objects. The LSTMs provide an estimate of the posterior probability
of each object given the corresponding measurements; fusing these posteriors
allows the object to be estimated through Bayesian and Neural inference
approaches. An experiment with 36 classes is carried out to evaluate and
compare the performance of the fused, tactile, and kinesthetic perception
systems. The results show that the Bayesian-based classifier improves object
recognition capabilities and outperforms the Neural-based approach.
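The Bayesian fusion step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it assumes the two modalities are conditionally independent given the object class and that a uniform prior is used, so the fused posterior is the normalized product of the two per-class posteriors divided by the prior. The function name and example probabilities are hypothetical.

```python
import numpy as np

def bayesian_fusion(p_tactile, p_kinesthetic, prior=None):
    """Fuse two per-class posterior vectors into one.

    Assumes the modalities are conditionally independent given the
    class, so p(c | t, k) is proportional to p(c | t) p(c | k) / p(c).
    Hypothetical sketch; parameter names are illustrative.
    """
    p_t = np.asarray(p_tactile, dtype=float)
    p_k = np.asarray(p_kinesthetic, dtype=float)
    if prior is None:
        # Uniform prior over the object classes.
        prior = np.full(p_t.size, 1.0 / p_t.size)
    fused = p_t * p_k / prior
    return fused / fused.sum()  # renormalize to a valid distribution

# Toy example with 3 classes: each LSTM outputs a posterior vector,
# and the fused estimate is the argmax of the combined posterior.
p_t = [0.7, 0.2, 0.1]   # tactile LSTM posterior (hypothetical)
p_k = [0.5, 0.4, 0.1]   # kinesthetic LSTM posterior (hypothetical)
fused = bayesian_fusion(p_t, p_k)
predicted_class = int(fused.argmax())
print(predicted_class)
```

A neural alternative, as the letter contrasts, would instead learn the fusion mapping from data rather than applying this closed-form rule.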
Related papers
- An Empirical Evaluation of Neural and Neuro-symbolic Approaches to
Real-time Multimodal Complex Event Detection [5.803352384948482]
Traditional end-to-end neural architectures struggle with long-duration events due to limited context sizes and reasoning capabilities.
Recent advances in neuro-symbolic methods, which integrate neural and symbolic models leveraging human knowledge, promise improved performance with less data.
This study addresses the gap in understanding these approaches' effectiveness in complex event detection (CED).
We investigate neural and neuro-symbolic architectures' performance in a multimodal CED task, analyzing IMU and acoustic data streams to recognize CE patterns.
arXiv Detail & Related papers (2024-02-17T23:34:50Z) - Data-Driven Goal Recognition in Transhumeral Prostheses Using Process
Mining Techniques [7.95507524742396]
Active prostheses utilize real-valued, continuous sensor data to recognize patient target poses, or goals, and proactively move the artificial limb.
Previous studies have examined how well the data collected in stationary poses, without considering the time steps, can help discriminate the goals.
Our approach involves transforming the data into discrete events and training an existing process mining-based goal recognition system.
arXiv Detail & Related papers (2023-09-15T02:03:59Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for a lot of dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Learning to Detect Slip through Tactile Estimation of the Contact Force Field and its Entropy [6.739132519488627]
We introduce a physics-informed, data-driven approach to detect slip continuously in real time.
We employ the GelSight Mini, an optical tactile sensor, attached to custom-designed grippers to gather tactile data.
Our results show that the best classification algorithm achieves a high average accuracy of 95.61%.
arXiv Detail & Related papers (2023-03-02T03:16:21Z) - AGO-Net: Association-Guided 3D Point Cloud Object Detection Network [86.10213302724085]
We propose a novel 3D detection framework that associates intact features for objects via domain adaptation.
We achieve new state-of-the-art performance on the KITTI 3D detection benchmark in both accuracy and speed.
arXiv Detail & Related papers (2022-08-24T16:54:38Z) - Object recognition for robotics from tactile time series data utilising
different neural network architectures [0.0]
This paper investigates the use of Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) neural network architectures for object classification on tactile data.
We compare these methods using data from two different fingertip sensors (namely the BioTac SP and WTS-FT) in the same physical setup.
The results show that the proposed method improves the maximum accuracy from 82.4% (BioTac SP fingertips) and 90.7% (WTS-FT fingertips) with complete time-series data to about 94% for both sensor types.
arXiv Detail & Related papers (2021-09-09T22:05:45Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step on dynamics modeling in hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Relational Graph Learning on Visual and Kinematics Embeddings for
Accurate Gesture Recognition in Robotic Surgery [84.73764603474413]
We propose a novel online approach of multi-modal graph network (i.e., MRG-Net) to dynamically integrate visual and kinematics information.
The effectiveness of our method is demonstrated with state-of-the-art results on the public JIGSAWS dataset.
arXiv Detail & Related papers (2020-11-03T11:00:10Z) - Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision
Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in vision-sensor modality (videos)
The SAKDN uses multiple wearable-sensors as teacher modalities and uses RGB videos as student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z) - Gesture Recognition from Skeleton Data for Intuitive Human-Machine
Interaction [0.6875312133832077]
We propose an approach for segmentation and classification of dynamic gestures based on a set of handcrafted features.
The method for gesture recognition applies a sliding window, which extracts information from both the spatial and temporal dimensions.
At the end, the recognized gestures are used to interact with a collaborative robot.
arXiv Detail & Related papers (2020-08-26T11:28:50Z) - Continuous Emotion Recognition via Deep Convolutional Autoencoder and
Support Vector Regressor [70.2226417364135]
It is crucial that the machine should be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.