Optical skin: Sensor-integration-free multimodal flexible sensing
- URL: http://arxiv.org/abs/2202.03189v1
- Date: Thu, 3 Feb 2022 14:58:27 GMT
- Title: Optical skin: Sensor-integration-free multimodal flexible sensing
- Authors: Sho Shimadera, Kei Kitagawa, Koyo Sagehashi, Tomoaki Niiyama, and
Satoshi Sunada
- Abstract summary: We propose a simple, highly sensitive, and multimodal sensing approach, which does not require integrating multiple sensors.
The proposed approach is based on an optical interference technique, which can encode the information of various stimuli as a spatial pattern.
We present a haptic soft device for a human-machine interface.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The biological skin enables animals to sense various stimuli.
Extensive efforts have been made recently to develop smart skin-like sensors
that extend the capabilities of biological skin; however, simultaneously
sensing several types of stimuli over a large area remains challenging
because it requires large-scale sensor integration with numerous wire
connections. We propose a simple, highly sensitive, and multimodal sensing
approach that does not require the integration of multiple sensors. The
proposed approach is based on an optical interference technique that can
encode the information of various stimuli as a spatial pattern. In contrast
to existing approaches, the proposed approach, combined with a deep neural
network, allows the sensing mode to be selected freely according to the
purpose. As a key example, we demonstrate the simultaneous sensing of three
different physical quantities (contact force, contact location, and
temperature) using a single soft material, without requiring complex
integration. Another unique property of the proposed approach is spatially
continuous sensing with an ultrahigh resolution of a few tens of
micrometers, which enables the shape of a contacting object to be
identified. Furthermore, we present a haptic soft device for a human-machine
interface. The proposed approach encourages the development of
high-performance optical skins.
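As an illustration of how a deep neural network can decode such an
interference pattern into several quantities at once, the sketch below maps a
single speckle image to contact force, contact location, and temperature
through a shared trunk with one regression head per quantity. This is a
minimal PyTorch sketch; the `SpeckleDecoder` name, architecture, and layer
sizes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: decode one interference (speckle) image into three
# physical quantities with a shared CNN trunk and separate heads.
# Architecture and sizes are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn

class SpeckleDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared trunk: the spatial interference pattern carries all modalities.
        self.trunk = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, 128), nn.ReLU(),
        )
        # One head per quantity; "selecting the sensing mode" then amounts
        # to choosing which head(s) to train and read out.
        self.force = nn.Linear(128, 1)        # contact force
        self.location = nn.Linear(128, 2)     # contact position (x, y)
        self.temperature = nn.Linear(128, 1)  # temperature

    def forward(self, speckle):               # speckle: (B, 1, H, W)
        z = self.trunk(speckle)
        return self.force(z), self.location(z), self.temperature(z)

model = SpeckleDecoder()
img = torch.rand(8, 1, 64, 64)                # batch of speckle images
f, xy, t = model(img)
print(f.shape, xy.shape, t.shape)             # (8, 1) (8, 2) (8, 1)
```

Under this framing, each head would be trained against calibrated ground
truth for its quantity, so the same optical readout serves whichever sensing
mode is needed.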
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Condition-Aware Multimodal Fusion for Robust Semantic Perception of Driving Scenes [56.52618054240197]
We propose a novel, condition-aware multimodal fusion approach for robust semantic perception of driving scenes.
Our method, CAFuser, uses an RGB camera input to classify environmental conditions and generate a Condition Token that guides the fusion of multiple sensor modalities (a toy sketch of this gating pattern follows this entry).
We set the new state of the art with CAFuser on the MUSES dataset with 59.7 PQ for multimodal panoptic segmentation and 78.2 mIoU for semantic segmentation, ranking first on the public benchmarks.
arXiv Detail & Related papers (2024-10-14T17:56:20Z)
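The condition-token idea above can be illustrated as an RGB-driven gate over
per-modality features. The toy PyTorch snippet below is a hedged sketch of
that pattern; the `ConditionGatedFusion` name, sizes, and softmax gating are
assumptions for illustration, not the published CAFuser architecture.

```python
# Toy sketch of condition-aware fusion: an RGB encoder produces a
# condition token that gates per-modality features before fusion.
# Names and sizes are illustrative, not CAFuser's actual design.
import torch
import torch.nn as nn

class ConditionGatedFusion(nn.Module):
    def __init__(self, dim=64, num_modalities=3):
        super().__init__()
        # Crude condition encoder: global mean color -> token (e.g. dark = night).
        self.condition_net = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(3, dim), nn.ReLU(),
        )
        # Token -> one gate weight per modality (softmax over modalities).
        self.gate = nn.Linear(dim, num_modalities)

    def forward(self, rgb, modality_feats):
        # rgb: (B, 3, H, W); modality_feats: (B, M, dim)
        token = self.condition_net(rgb)                   # (B, dim)
        w = torch.softmax(self.gate(token), dim=-1)       # (B, M)
        return (w.unsqueeze(-1) * modality_feats).sum(1)  # (B, dim)

fusion = ConditionGatedFusion()
fused = fusion(torch.rand(2, 3, 32, 32), torch.rand(2, 3, 64))
print(fused.shape)  # torch.Size([2, 64])
```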
- Bridging Remote Sensors with Multisensor Geospatial Foundation Models [15.289711240431107]
msGFM is a multisensor geospatial foundation model that unifies data from four key sensor modalities.
For data originating from identical geolocations, our model employs an innovative cross-sensor pretraining approach.
msGFM has demonstrated enhanced proficiency in a range of both single-sensor and multisensor downstream tasks.
arXiv Detail & Related papers (2024-04-01T17:30:56Z)
- Proprioceptive Learning with Soft Polyhedral Networks [16.188789266592032]
Proprioception is the "sixth sense" that detects limb postures with motor neurons.
Here, we present the Soft Polyhedral Network with embedded vision for physical interactions.
This design enables passive adaptations to omni-directional interactions, visually captured by a miniature high-speed motion tracking system.
arXiv Detail & Related papers (2023-08-16T17:53:40Z)
- Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward Real-World Interactive Systems [0.13048920509133805]
Our digitally-knitted capacitive active sensors can be manufactured at scale with little human intervention.
This work advances the capabilities of such sensors by creating the foundation for an interactive gesture recognition system.
arXiv Detail & Related papers (2023-03-18T04:57:46Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Learning Online Multi-Sensor Depth Fusion [100.84519175539378]
SenFuNet is a depth fusion approach that learns sensor-specific noise and outlier statistics.
We conduct experiments with various sensor combinations on the real-world CoRBS and Scene3D datasets (a simplified weighting sketch follows this entry).
arXiv Detail & Related papers (2022-04-07T10:45:32Z)
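As a loose illustration of sensor-specific weighting, the sketch below blends
two depth maps using per-pixel confidences predicted by a tiny network per
sensor. The `WeightedDepthFusion` class is a hypothetical simplification for
illustration, not the actual SenFuNet design.

```python
# Hypothetical simplification of learned depth fusion: a per-sensor
# network predicts per-pixel confidence, and depths are blended by a
# softmax over those confidences. Not the actual SenFuNet architecture.
import torch
import torch.nn as nn

class WeightedDepthFusion(nn.Module):
    def __init__(self, num_sensors=2):
        super().__init__()
        # One tiny confidence net per sensor, so noise and outlier
        # statistics stay sensor-specific.
        self.conf_nets = nn.ModuleList(
            nn.Conv2d(1, 1, 3, padding=1) for _ in range(num_sensors)
        )

    def forward(self, depths):               # depths: list of (B, 1, H, W)
        conf = torch.stack(
            [net(d) for net, d in zip(self.conf_nets, depths)], dim=0
        )                                    # (S, B, 1, H, W)
        w = torch.softmax(conf, dim=0)       # normalize across sensors
        return (w * torch.stack(depths, dim=0)).sum(0)

fuse = WeightedDepthFusion()
d_tof = torch.rand(1, 1, 48, 48)             # e.g. time-of-flight depth
d_stereo = torch.rand(1, 1, 48, 48)          # e.g. stereo depth
print(fuse([d_tof, d_stereo]).shape)         # torch.Size([1, 1, 48, 48])
```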
- ReSkin: versatile, replaceable, lasting tactile skins [28.348982687106883]
ReSkin is a tactile soft sensor that leverages machine learning and magnetic sensing to offer a low-cost, diverse, and compact solution for long-term use.
Our self-supervised learning algorithm enables finer performance enhancement with small, inexpensive data collection procedures.
arXiv Detail & Related papers (2021-10-29T20:21:37Z)
- Neural Dependency Coding inspired Multimodal Fusion [11.182263394122142]
Recent work in deep fusion models via neural networks has led to substantial improvements in areas like speech recognition, emotion recognition and analysis, captioning and image description.
Inspired by recent neuroscience ideas about multisensory integration and processing, we investigate the effect of synergy maximizing loss functions.
arXiv Detail & Related papers (2021-09-28T17:52:09Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and an elastic property regulates the deformation of the particles during contact (a toy mass-spring sketch follows this entry).
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
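To give a feel for particle-based elastic contact, the NumPy toy below pushes
an indenter into a one-dimensional chain of particles whose neighbor springs
resist deformation. This is a didactic toy under strong simplifying
assumptions, far simpler than EIP's actual elastic model.

```python
# Toy mass-spring sketch of elastic particle deformation under contact.
# A 1-D chain of particles is pushed by an indenter; springs between
# neighbors pull them back toward rest spacing. Much simpler than EIP.
import numpy as np

n, k, dt, steps = 32, 50.0, 0.01, 200
y = np.zeros(n)                      # vertical displacement per particle
v = np.zeros(n)                      # particle velocities

for _ in range(steps):
    # Spring force from neighbors (discrete Laplacian of displacement);
    # the end particles stay pinned at zero displacement.
    lap = np.zeros(n)
    lap[1:-1] = y[:-2] - 2 * y[1:-1] + y[2:]
    f = k * lap
    # Indenter presses the middle particles toward depth -0.5.
    f[14:18] -= 40.0 * (y[14:18] + 0.5)
    v = 0.9 * (v + dt * f)           # damped explicit Euler step
    y = y + dt * v

print(y.min(), y[0])                 # center sinks; edges barely move
```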
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.