Real-time Interface Control with Motion Gesture Recognition based on
Non-contact Capacitive Sensing
- URL: http://arxiv.org/abs/2201.01755v1
- Date: Wed, 5 Jan 2022 18:39:51 GMT
- Title: Real-time Interface Control with Motion Gesture Recognition based on
Non-contact Capacitive Sensing
- Authors: Hunmin Lee, Jaya Krishna Mandivarapu, Nahom Ogbazghi, Yingshu Li
- Abstract summary: We propose a real-time interface control framework based on non-contact hand motion gesture recognition.
We classify 10 hand motion gesture types with 98.79% accuracy.
This study suggests the feasibility of an intuitive interface technology that accommodates flexible human-to-machine interaction, similar to a Natural User Interface.
- Score: 2.60215363752507
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Capacitive sensing is a prominent technology that is cost-effective,
low-power, and fast to respond compared with existing sensing systems. Owing to
these advantages, capacitive sensing has been widely studied and commercialized
for touch sensing, localization, presence detection, and contact-based interface
applications such as human-computer interaction. However, because a non-contact
proximity sensing scheme is easily affected by disturbances from peripheral
objects and the surroundings, it requires considerably more delicate signal
processing than contact sensing, which has limited its wider adoption. In this
paper, we propose a real-time interface control framework based on non-contact
hand motion gesture recognition. The framework processes the raw signals,
detects the electric field disturbance triggered by hand gesture movements near
the capacitive sensor using an adaptive threshold, and extracts the significant
signal frame, covering the authentic signal intervals with a 98.8% detection
rate and a 98.4% frame correction rate. A GRU model trained on the extracted
signal frames then classifies 10 hand motion gesture types with 98.79% accuracy.
The framework transmits the classification result and maneuvers the interface of
the foreground process according to the recognized gesture. This study suggests
the feasibility of an intuitive interface technology that accommodates flexible
human-to-machine interaction, similar to a Natural User Interface, and
strengthens the case for commercialization based on measuring the electric field
disturbance through non-contact proximity sensing, a state-of-the-art sensing
scheme.
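
To make the pipeline above concrete, the following is a minimal, illustrative sketch of the two core steps the abstract describes: adaptive-threshold extraction of signal frames around an electric-field disturbance, and a GRU classifier over the extracted frames. It is not the authors' implementation: the thresholding rule, all parameter values (calib, k, alpha, min_len), the 4-electrode input, and the 64-unit hidden size are assumptions made for illustration; only the 10-class output follows the abstract.

```python
# Illustrative sketch only; parameters and the 4-channel input are assumptions.
import numpy as np
import torch
import torch.nn as nn


def extract_signal_frames(signal, calib=100, k=5.0, alpha=0.01, min_len=32):
    """Return (start, end) index pairs of candidate gesture frames.

    A leading calibration segment seeds the baseline and noise estimates;
    both are then adapted (exponential moving average) only on samples judged
    idle, so the threshold k * noise tracks slow drift without absorbing the
    gesture-induced electric-field disturbance itself.
    """
    baseline = float(np.mean(signal[:calib]))
    noise = float(np.std(signal[:calib])) + 1e-9
    frames, start = [], None
    for i, x in enumerate(signal):
        deviation = abs(x - baseline)
        if deviation > k * noise:                 # disturbance: inside a gesture
            if start is None:
                start = i
        else:                                     # idle: adapt baseline and noise
            baseline = (1 - alpha) * baseline + alpha * x
            noise = (1 - alpha) * noise + alpha * deviation
            if start is not None:
                if i - start >= min_len:
                    frames.append((start, i))
                start = None
    if start is not None and len(signal) - start >= min_len:
        frames.append((start, len(signal)))
    return frames


class GestureGRU(nn.Module):
    """Minimal GRU classifier over an extracted signal frame (10 gesture classes)."""

    def __init__(self, n_channels=4, hidden=64, n_classes=10):
        super().__init__()
        self.gru = nn.GRU(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, time, channels)
        _, h = self.gru(x)           # h: (num_layers, batch, hidden)
        return self.head(h[-1])      # logits: (batch, n_classes)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw = rng.normal(0.0, 0.05, 2000)        # synthetic idle capacitive signal
    raw[600:700] += 1.0                      # synthetic gesture disturbance
    print(extract_signal_frames(raw))        # e.g. [(600, 700)]
    model = GestureGRU()
    frame = torch.randn(1, 100, 4)           # one frame: 100 time steps, 4 electrodes
    print(model(frame).argmax(dim=-1))       # predicted gesture class index
```

In the actual framework, the extracted frames would be resampled to a fixed length, classified by the trained GRU, and the resulting label used to maneuver the foreground interface, as the abstract describes.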
Related papers
- A Real-time Human Pose Estimation Approach for Optimal Sensor Placement
in Sensor-based Human Activity Recognition [63.26015736148707]
This paper introduces a novel methodology to resolve the issue of optimal sensor placement for Human Activity Recognition.
The derived skeleton data provides a unique strategy for identifying the optimal sensor location.
Our findings indicate that the vision-based method for sensor placement offers comparable results to the conventional deep learning approach.
arXiv Detail & Related papers (2023-07-06T10:38:14Z)
- Quantum Force Sensing by Digital Twinning of Atomic Bose-Einstein Condensates [2.916921958708415]
We propose a data-driven approach that harnesses the capabilities of machine learning to augment weak-signal detection sensitivity.
In an atomic force sensor, our method combines a digital replica of force-free data with an anomaly detection technique.
Our findings demonstrate a significant advancement in sensitivity, achieving an order of magnitude improvement over conventional protocols.
arXiv Detail & Related papers (2023-07-02T06:10:00Z)
- Agile gesture recognition for capacitive sensing devices: adapting on-the-job [55.40855017016652]
We demonstrate a hand gesture recognition system that uses signals from capacitive sensors embedded into the etee hand controller.
The controller generates real-time signals from each of the wearer's five fingers.
We use a machine learning technique to analyse the time series signals and identify three features that can represent 5 fingers within 500 ms.
arXiv Detail & Related papers (2023-05-12T17:24:02Z)
- Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward Real-World Interactive Systems [0.13048920509133805]
Our digitally-knitted capacitive active sensors can be manufactured at scale with little human intervention.
This work advances the capabilities of such sensors by creating the foundation for an interactive gesture recognition system.
arXiv Detail & Related papers (2023-03-18T04:57:46Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for a lot of dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Learning to Detect Slip with Barometric Tactile Sensors and a Temporal Convolutional Neural Network [7.346580429118843]
We present a learning-based method to detect slip using barometric tactile sensors.
We train a temporal convolutional neural network to detect slip, achieving high detection accuracies.
We argue that barometric tactile sensing technology, combined with data-driven learning, is suitable for many manipulation tasks such as slip compensation.
arXiv Detail & Related papers (2022-02-19T08:21:56Z)
- Towards Domain-Independent and Real-Time Gesture Recognition Using mmWave Signal [11.76969975145963]
DI-Gesture is a domain-independent and real-time mmWave gesture recognition system.
In real-time scenarios, the accuracy of DI-Gesture reaches over 97% with an average inference time of 2.87 ms.
arXiv Detail & Related papers (2021-11-11T13:28:28Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- AuraSense: Robot Collision Avoidance by Full Surface Proximity Detection [3.9770080498150224]
AuraSense is the first system to realize no-dead-spot proximity sensing for robot arms.
It requires only a single pair of piezoelectric transducers, and can easily be applied to off-the-shelf robots.
arXiv Detail & Related papers (2021-08-10T18:37:54Z)
- Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in the vision-sensor modality (videos).
SAKDN uses multiple wearable sensors as teacher modalities and RGB videos as the student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.