Physical Activity Recognition by Utilising Smartphone Sensor Signals
- URL: http://arxiv.org/abs/2201.08688v1
- Date: Thu, 20 Jan 2022 09:58:52 GMT
- Title: Physical Activity Recognition by Utilising Smartphone Sensor Signals
- Authors: Abdulrahman Alruban, Hind Alobaidi, Nathan Clarke, Fudong Li
- Abstract summary: This study collected human activity data from 60 participants across two different days for a total of six activities recorded by gyroscope and accelerometer sensors in a modern smartphone.
The proposed approach achieved a classification accuracy of 98 percent in identifying four different activities.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human physical motion activity identification has many potential applications
in various fields, such as medical diagnosis, military sensing, sports
analysis, and human-computer security interaction. With the recent advances in
smartphones and wearable technologies, it has become common for such devices to
have embedded motion sensors that are able to sense even small body movements.
This study collected human activity data from 60 participants across two
different days for a total of six activities recorded by gyroscope and
accelerometer sensors in a modern smartphone. The paper investigates to what
extent different activities can be identified by utilising machine learning
algorithms using approaches such as majority algorithmic voting. More analyses
are also provided that reveal which time- and frequency-domain features
were best able to identify individuals' motion activity types. Overall, the
proposed approach achieved a classification accuracy of 98 percent in
identifying four different activities: walking, walking upstairs, walking
downstairs, and sitting while the subject is calm and doing a typical
desk-based activity.
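As a rough illustration of the kind of time- and frequency-domain features the abstract refers to, the Python sketch below windows a tri-axial accelerometer/gyroscope stream and computes a handful of common per-window statistics. The 50 Hz sampling rate, 128-sample window, 50% overlap, and the exact feature set are assumptions for illustration only, not the authors' published pipeline.

    # Minimal sketch: windowing + time/frequency-domain feature extraction
    # for smartphone motion signals. All parameters are illustrative assumptions.
    import numpy as np

    def window_signal(samples: np.ndarray, window: int = 128, overlap: float = 0.5):
        """Split an (N, channels) signal into fixed-length, overlapping windows."""
        step = int(window * (1.0 - overlap))
        return np.stack([samples[i:i + window]
                         for i in range(0, len(samples) - window + 1, step)])

    def extract_features(win: np.ndarray, fs: float = 50.0) -> np.ndarray:
        """Compute per-channel time- and frequency-domain features for one window."""
        time_feats = np.concatenate([
            win.mean(axis=0), win.std(axis=0),
            win.min(axis=0), win.max(axis=0),
            np.abs(np.diff(win, axis=0)).mean(axis=0),   # mean absolute difference
        ])
        spectrum = np.abs(np.fft.rfft(win, axis=0))       # magnitude spectrum
        freqs = np.fft.rfftfreq(win.shape[0], d=1.0 / fs)
        freq_feats = np.concatenate([
            (spectrum ** 2).sum(axis=0),                  # spectral energy
            freqs[spectrum.argmax(axis=0)],               # dominant frequency (Hz)
        ])
        return np.concatenate([time_feats, freq_feats])

    # Example: 10 s of placeholder 6-channel data (3-axis accelerometer + 3-axis gyroscope)
    signal = np.random.randn(500, 6)
    X = np.array([extract_features(w) for w in window_signal(signal)])
    print(X.shape)   # (n_windows, n_features)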
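The "majority algorithmic voting" mentioned above can be read as a hard-voting ensemble: several base classifiers each predict an activity label for a window, and the label receiving the most votes wins. The sketch below uses scikit-learn's VotingClassifier with random forest, SVM, and k-NN base learners; the choice of learners and hyperparameters is an assumption, as the abstract does not specify them.

    # Minimal sketch of hard majority voting over several classifiers.
    # Base learners and hyperparameters are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Placeholder feature matrix and activity labels (e.g. walking, upstairs,
    # downstairs, sitting); replace with real windowed features.
    X = np.random.randn(200, 42)
    y = np.random.choice(["walking", "upstairs", "downstairs", "sitting"], size=200)

    voter = VotingClassifier(
        estimators=[
            ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
            ("svm", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
            ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))),
        ],
        voting="hard",   # each model casts one vote; the majority label wins
    )
    scores = cross_val_score(voter, X, y, cv=5)
    print(f"mean accuracy: {scores.mean():.3f}")

In practice, the per-window feature matrix produced by the previous sketch would replace the random placeholder X here.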
Related papers
- Human Activity Recognition using Smartphones [0.0]
We have created an Android application that recognizes daily human activities and calculates the calories burnt in real time.
This is used for real-time activity recognition and calculation of calories burnt using a formula based on Metabolic Equivalent.
arXiv Detail & Related papers (2024-04-03T17:05:41Z) - A Real-time Human Pose Estimation Approach for Optimal Sensor Placement
in Sensor-based Human Activity Recognition [63.26015736148707]
This paper introduces a novel methodology to resolve the issue of optimal sensor placement for Human Activity Recognition.
The derived skeleton data provides a unique strategy for identifying the optimal sensor location.
Our findings indicate that the vision-based method for sensor placement offers comparable results to the conventional deep learning approach.
arXiv Detail & Related papers (2023-07-06T10:38:14Z) - Multi-Channel Time-Series Person and Soft-Biometric Identification [65.83256210066787]
This work investigates person and soft-biometrics identification from recordings of humans performing different activities using deep architectures.
We evaluate the method on four datasets of multi-channel time-series human activity recognition (HAR).
Soft-biometric-based attribute representation shows promising results and emphasises the necessity of larger datasets.
arXiv Detail & Related papers (2023-04-04T07:24:51Z) - Your Day in Your Pocket: Complex Activity Recognition from Smartphone
Accelerometers [7.335712499936904]
This paper investigates the recognition of complex activities exclusively using smartphone accelerometer data.
We used a large smartphone sensing dataset collected from over 600 users in five countries during the pandemic.
Deep learning-based, binary classification of eight complex activities can be achieved with AUROC scores up to 0.76 with partially personalized models.
arXiv Detail & Related papers (2023-01-17T16:22:30Z) - Co-Located Human-Human Interaction Analysis using Nonverbal Cues: A
Survey [71.43956423427397]
We aim to identify the nonverbal cues and computational methodologies resulting in effective performance.
This survey differs from its counterparts by involving the widest spectrum of social phenomena and interaction settings.
Some major observations are: the most often used nonverbal cue, computational method, interaction environment, and sensing approach are, respectively, speaking activity, support vector machines, meetings composed of 3-4 persons, and microphones and cameras.
arXiv Detail & Related papers (2022-07-20T13:37:57Z) - Human Activity Recognition on Time Series Accelerometer Sensor Data
using LSTM Recurrent Neural Networks [0.2294014185517203]
In this study, we focus on the use of smartwatch accelerometer sensors to recognize eating activity.
We collected sensor data from 10 participants while consuming pizza.
We developed an LSTM-ANN architecture that demonstrated 90% success in identifying individual bites as opposed to puffing, medication-taking, or jogging activities.
arXiv Detail & Related papers (2022-06-03T19:24:20Z) - Mobile Behavioral Biometrics for Passive Authentication [65.94403066225384]
This work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits.
Experiments are performed over HuMIdb, one of the largest and most comprehensive freely available mobile user interaction databases.
In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke.
arXiv Detail & Related papers (2022-03-14T17:05:59Z) - Classifying Human Activities with Inertial Sensors: A Machine Learning
Approach [0.0]
Human Activity Recognition (HAR) is an ongoing research topic.
It has applications in medical support, sports, fitness, social networking, human-computer interfaces, senior care, entertainment, and surveillance, among others.
We examined and analyzed different Machine Learning and Deep Learning approaches for Human Activity Recognition using inertial sensor data of smartphones.
arXiv Detail & Related papers (2021-11-09T08:17:33Z) - Domain and Modality Gaps for LiDAR-based Person Detection on Mobile
Robots [91.01747068273666]
This paper studies existing LiDAR-based person detectors with a particular focus on mobile robot scenarios.
Experiments revolve around the domain gap between driving and mobile robot scenarios, as well as the modality gap between 3D and 2D LiDAR sensors.
Results provide practical insights into LiDAR-based person detection and facilitate informed decisions for relevant mobile robot designs and applications.
arXiv Detail & Related papers (2021-06-21T16:35:49Z) - SensiX: A Platform for Collaborative Machine Learning on the Edge [69.1412199244903]
We present SensiX, a personal edge platform that stays between sensor data and sensing models.
We demonstrate its efficacy in developing motion and audio-based multi-device sensing systems.
Our evaluation shows that SensiX offers a 7-13% increase in overall accuracy and up to 30% increase across different environment dynamics at the expense of 3mW power overhead.
arXiv Detail & Related papers (2020-12-04T23:06:56Z) - Human Activity Recognition using Inertial, Physiological and
Environmental Sensors: a Comprehensive Survey [3.1166345853612296]
This survey focuses on the critical role of machine learning in developing HAR applications based on inertial sensors in conjunction with physiological and environmental sensors.
HAR is considered one of the most promising assistive technology tools to support the daily lives of the elderly by monitoring their cognitive and physical function through daily activities.
arXiv Detail & Related papers (2020-04-19T11:32:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.