Exploring the Impact of Synthetic Data on Human Gesture Recognition Tasks Using GANs
- URL: http://arxiv.org/abs/2412.06389v1
- Date: Mon, 09 Dec 2024 11:15:47 GMT
- Title: Exploring the Impact of Synthetic Data on Human Gesture Recognition Tasks Using GANs
- Authors: George Kontogiannis, Pantelis Tzamalis, Sotiris Nikoletseas
- Abstract summary: This study is the first to explore the feasibility of synthesizing motion gestures for allergic rhinitis from wearable IoT device data using Generative Adversarial Networks (GANs). We also focus on these AI models' performance in terms of fidelity, diversity, and privacy.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the evolving domain of Human Activity Recognition (HAR) using Internet of Things (IoT) devices, there is growing interest in employing Deep Generative Models (DGMs) to address data scarcity, enhance data quality, and improve classification metrics. Among these models, Generative Adversarial Networks (GANs) have emerged as a powerful tool for generating synthetic data that mimic real-world scenarios with high fidelity. However, Human Gesture Recognition (HGR), a subset of HAR, remains largely unexplored for time series data such as allergic gestures, particularly in healthcare applications. In this paper, we examine and evaluate the performance of two GANs in generating synthetic gesture motion data that form part of an open-source benchmark dataset. The data relate to the disease identification and healthcare domain, specifically to allergic rhinitis. We also assess these models' performance in terms of fidelity, diversity, and privacy. Furthermore, we examine whether synthetic data can substitute for real data in training scenarios and how well models trained on synthetic data generalize to the allergic rhinitis gestures. In our work, these gestures are captured as 6-axis accelerometer and gyroscope data, serving as multivariate time series instances retrieved from smart wearable devices. To the best of our knowledge, this study is the first to explore the feasibility of synthesizing motion gestures for allergic rhinitis from wearable IoT device data using GANs and to test their impact on the generalization of gesture recognition systems. It is worth noting that, even though our method has been applied to a specific category of gestures, it is designed to generalize and can also be applied to other motion data in the HGR domain.
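As a rough illustration of the setup the abstract describes (not the authors' actual models or code), the sketch below shows a minimal GAN for fixed-length windows of 6-axis IMU data, assuming PyTorch is available; the window length, layer sizes, and signal scaling are illustrative assumptions. A train-on-synthetic, test-on-real evaluation in the spirit of the abstract would then train a gesture classifier on windows sampled from the generator and measure its accuracy on held-out real windows.

```python
# Minimal, hypothetical GAN sketch for 6-axis gesture windows (PyTorch assumed).
# All shapes and hyperparameters below are illustrative, not taken from the paper.
import torch
import torch.nn as nn

WINDOW_LEN = 128   # samples per gesture window (assumption)
CHANNELS = 6       # ax, ay, az, gx, gy, gz
LATENT_DIM = 64

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, WINDOW_LEN * CHANNELS), nn.Tanh(),  # signals assumed scaled to [-1, 1]
        )

    def forward(self, z):
        # (batch, latent) -> (batch, channels, time)
        return self.net(z).view(-1, CHANNELS, WINDOW_LEN)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, stride=2, padding=2), nn.LeakyReLU(0.2),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2), nn.LeakyReLU(0.2),
            nn.Flatten(),
            nn.Linear(64 * (WINDOW_LEN // 4), 1),  # single real/fake logit
        )

    def forward(self, x):
        return self.net(x)

def train_step(G, D, real, opt_g, opt_d):
    """One adversarial update on a batch of real windows shaped (B, 6, 128)."""
    bce = nn.BCEWithLogitsLoss()
    b = real.size(0)
    fake = G(torch.randn(b, LATENT_DIM))

    # Discriminator: push real windows toward 1, generated windows toward 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(b, 1)) + bce(D(fake.detach()), torch.zeros(b, 1))
    loss_d.backward()
    opt_d.step()

    # Generator: try to make the discriminator label generated windows as real.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(b, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    real_batch = torch.randn(16, CHANNELS, WINDOW_LEN)  # stand-in for real IMU windows
    print(train_step(G, D, real_batch, opt_g, opt_d))
```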
Related papers
- Synthetic Data Generation of Body Motion Data by Neural Gas Network for Emotion Recognition [0.9790236766474201]
This research introduces a novel application of the Neural Gas Network (NGN) algorithm for synthesizing body motion data.
By learning the topology of the skeletal structure, the NGN fits its neurons, or gas particles, onto the body joints.
By chaining the fitted body postures over frames, the final synthetic body motion is produced.
arXiv Detail & Related papers (2025-03-11T13:16:30Z)
- OS-Genesis: Automating GUI Agent Trajectory Construction via Reverse Task Synthesis [55.390060529534644]
We propose OS-Genesis, a novel data synthesis pipeline for Graphical User Interface (GUI) agents.
Instead of relying on pre-defined tasks, OS-Genesis enables agents first to perceive environments and perform step-wise interactions.
We demonstrate that training GUI agents with OS-Genesis significantly improves their performance on highly challenging online benchmarks.
arXiv Detail & Related papers (2024-12-27T16:21:58Z)
- Second FRCSyn-onGoing: Winning Solutions and Post-Challenge Analysis to Improve Face Recognition with Synthetic Data [104.30479583607918]
The 2nd FRCSyn-onGoing challenge is based on the 2nd Face Recognition Challenge in the Era of Synthetic Data (FRCSyn), originally launched at CVPR 2024. We focus on exploring the use of synthetic data both individually and in combination with real data to solve current challenges in face recognition.
arXiv Detail & Related papers (2024-12-02T11:12:01Z)
- emg2qwerty: A Large Dataset with Baselines for Touch Typing using Surface Electromyography [47.160223334501126]
emg2qwerty is a large-scale dataset of non-invasive electromyographic signals recorded at the wrists while touch typing on a QWERTY keyboard.
With 1,135 sessions spanning 108 users and 346 hours of recording, this is the largest such public dataset to date.
We show strong baseline performance on predicting key-presses using sEMG signals alone.
arXiv Detail & Related papers (2024-10-26T05:18:48Z)
- Scaling Wearable Foundation Models [54.93979158708164]
We investigate the scaling properties of sensor foundation models across compute, data, and model size.
Using a dataset of up to 40 million hours of in-situ heart rate, heart rate variability, electrodermal activity, accelerometer, skin temperature, and altimeter per-minute data from over 165,000 people, we create LSM.
Our results establish the scaling laws of LSM for tasks such as imputation and extrapolation, both across time and sensor modalities.
arXiv Detail & Related papers (2024-10-17T15:08:21Z)
- Synthetic-to-Real Domain Adaptation for Action Recognition: A Dataset and Baseline Performances [76.34037366117234]
We introduce a new dataset called Robot Control Gestures (RoCoG-v2).
The dataset is composed of both real and synthetic videos from seven gesture classes.
We present results using state-of-the-art action recognition and domain adaptation algorithms.
arXiv Detail & Related papers (2023-03-17T23:23:55Z)
- cGAN-Based High Dimensional IMU Sensor Data Generation for Enhanced Human Activity Recognition in Therapeutic Activities [0.0]
A novel GAN network called TheraGAN was developed to generate IMU signals associated with rehabilitation activities.
The generated signals closely mimicked the real signals, and adding generated data resulted in a significant improvement in the performance of all tested networks.
arXiv Detail & Related papers (2023-02-16T00:08:28Z)
- PhysioGAN: Training High Fidelity Generative Model for Physiological Sensor Readings [6.029263679246354]
We present PHYSIOGAN, a generative model to produce high fidelity synthetic physiological sensor data readings.
We evaluate it against state-of-the-art techniques using two different real-world datasets: an ECG classification dataset and an activity recognition dataset from motion sensors.
arXiv Detail & Related papers (2022-04-25T07:38:43Z)
- Transformer Networks for Data Augmentation of Human Physical Activity Recognition [61.303828551910634]
State-of-the-art models like Recurrent Generative Adversarial Networks (RGAN) are used to generate realistic synthetic data.
In this paper, transformer-based generative adversarial networks, which have global attention over the data, are compared with RGAN on the PAMAP2 and Real World Human Activity Recognition data sets.
arXiv Detail & Related papers (2021-09-02T16:47:29Z)
- CorGAN: Correlation-Capturing Convolutional Generative Adversarial Networks for Generating Synthetic Healthcare Records [0.0]
We propose a framework called correlation-capturing Generative Adversarial Network (CorGAN) to generate synthetic healthcare records.
To demonstrate the model fidelity, we show that CorGAN generates synthetic data with performance similar to that of real data in various Machine Learning settings.
arXiv Detail & Related papers (2020-01-25T18:43:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.