Human Activity Recognition using Smartphones
- URL: http://arxiv.org/abs/2404.02869v1
- Date: Wed, 3 Apr 2024 17:05:41 GMT
- Title: Human Activity Recognition using Smartphones
- Authors: Mayur Sonawane, Sahil Rajesh Dhayalkar, Siddesh Waje, Soyal Markhelkar, Akshay Wattamwar, Seema C. Shrawne
- Abstract summary: We have created an Android application that recognizes daily human activities and calculates the calories burnt in real time.
The application performs real-time activity recognition and calculates calories burnt using a formula based on the Metabolic Equivalent.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human Activity Recognition is a subject of great research interest today, with applications in remote healthcare, activity tracking for the elderly or the disabled, calorie-burn tracking, etc. In our project, we have created an Android application that recognizes daily human activities and calculates the calories burnt in real time. We first captured labeled triaxial acceleration readings for different daily human activities from the smartphone's embedded accelerometer. These readings were preprocessed using a median filter, and 42 features were extracted using various methods. We then tested various machine learning algorithms along with dimensionality reduction. Finally, in our Android application, we used the machine learning algorithm and the subset of features that provided maximum accuracy and minimum model building time. This is used for real-time activity recognition and for calculating calories burnt using a formula based on the Metabolic Equivalent.
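The pipeline described above can be made concrete with a short sketch: median-filter the triaxial accelerometer signal, compute window-level statistics as features, classify the activity, and convert the result to calories burnt with the standard MET formula (calories ≈ MET × body weight in kg × duration in hours). The MET table, window statistics, and classifier below are illustrative assumptions, not the paper's exact 42-feature set or its chosen algorithm.

```python
# Minimal sketch of the pipeline described in the abstract: median-filter the
# triaxial accelerometer signal, extract simple window-level features, classify
# the activity, and convert it to calories burnt via a MET-based formula.
# The MET table, window statistics, and classifier are illustrative, not the
# paper's exact 42-feature configuration or selected algorithm.
import numpy as np
from scipy.signal import medfilt
from sklearn.ensemble import RandomForestClassifier

# Illustrative MET values per activity (1 MET ~ 1 kcal per kg per hour);
# real values come from standard MET compendia.
MET = {"sitting": 1.3, "walking": 3.5, "upstairs": 4.0, "running": 8.0}

def preprocess(acc_xyz, kernel_size=5):
    """Median-filter each accelerometer axis to suppress spikes and noise."""
    return np.column_stack([medfilt(acc_xyz[:, i], kernel_size) for i in range(3)])

def extract_features(window):
    """Compute simple per-axis and magnitude statistics for one filtered window."""
    feats = []
    for i in range(3):
        axis = window[:, i]
        feats += [axis.mean(), axis.std(), axis.min(), axis.max()]
    magnitude = np.linalg.norm(window, axis=1)
    feats += [magnitude.mean(), magnitude.std()]
    return np.array(feats)

def calories_burnt(activity, weight_kg, duration_hours):
    """Calories burnt = MET value * body weight (kg) * duration (hours)."""
    return MET[activity] * weight_kg * duration_hours

# Hypothetical usage: train on labeled windows, then predict and convert.
# X_train, y_train would come from the labeled accelerometer recordings.
# clf = RandomForestClassifier().fit(X_train, y_train)
# window = preprocess(raw_window)                      # raw_window: (n_samples, 3)
# activity = clf.predict([extract_features(window)])[0]
# kcal = calories_burnt(activity, weight_kg=70, duration_hours=2.56 / 3600)
```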
Related papers
- Multi-Channel Time-Series Person and Soft-Biometric Identification [65.83256210066787]
This work investigates person and soft-biometrics identification from recordings of humans performing different activities using deep architectures.
We evaluate the method on four datasets of multi-channel time-series human activity recognition (HAR)
Soft-biometric-based attribute representation shows promising results and emphasizes the necessity of larger datasets.
arXiv Detail & Related papers (2023-04-04T07:24:51Z) - RMBench: Benchmarking Deep Reinforcement Learning for Robotic Manipulator Control [47.61691569074207]
Reinforcement learning is applied to solve complex real-world tasks from high-dimensional sensory inputs.
Recent progress benefits from deep learning for raw sensory signal representation.
We present RMBench, the first benchmark for robotic manipulation.
arXiv Detail & Related papers (2022-10-20T13:34:26Z) - Information We Can Extract About a User From 'One Minute Mobile Application Usage' [0.0]
In this paper, we extracted different human activities using the accelerometer, magnetometer, and gyroscope sensors of Android smartphones.
Using different social media applications, such as Facebook, Instagram, WhatsApp, and Twitter, we extracted the raw sensor values along with the attributes of 29 subjects.
We extract features from the raw signals and use them to perform classification using different machine learning (ML) algorithms.
arXiv Detail & Related papers (2022-07-27T00:23:11Z) - Classifying Human Activities using Machine Learning and Deep Learning Techniques [0.0]
Human Activity Recognition (HAR) describes a machine's ability to recognize human actions.
The challenge in HAR is to overcome the difficulty of separating human activities based on the given data.
Deep Learning techniques such as Long Short-Term Memory (LSTM), the Bi-Directional LSTM classifier, the Recurrent Neural Network (RNN), and the Gated Recurrent Unit (GRU) are trained.
Experimental results showed that the Linear Support Vector classifier among the machine learning methods and the Gated Recurrent Unit among the deep learning methods provided the best accuracy for human activity recognition.
arXiv Detail & Related papers (2022-05-19T05:20:04Z) - Mobile Behavioral Biometrics for Passive Authentication [65.94403066225384]
This work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits.
Experiments are performed over HuMIdb, one of the largest and most comprehensive freely available mobile user interaction databases.
In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke.
arXiv Detail & Related papers (2022-03-14T17:05:59Z) - HAR-GCNN: Deep Graph CNNs for Human Activity Recognition From Highly Unlabeled Mobile Sensor Data [61.79595926825511]
Acquiring balanced datasets containing accurate activity labels requires humans to correctly annotate and potentially interfere with the subjects' normal activities in real-time.
We propose HAR-GCNN, a deep graph CNN model that leverages the correlation between chronologically adjacent sensor measurements to predict the correct labels for unclassified activities.
HAR-GCNN shows superior performance relative to previously used baseline methods, improving classification accuracy by about 25% and up to 68% on different datasets.
arXiv Detail & Related papers (2022-03-07T01:23:46Z) - HAKE: A Knowledge Engine Foundation for Human Activity Understanding [65.24064718649046]
Human activity understanding is of widespread interest in artificial intelligence and spans diverse applications like health care and behavior analysis.
We propose a novel paradigm to reformulate this task in two stages: first mapping pixels to an intermediate space spanned by atomic activity primitives, then programming detected primitives with interpretable logic rules to infer semantics.
Our framework, the Human Activity Knowledge Engine (HAKE), exhibits superior generalization ability and performance upon challenging benchmarks.
arXiv Detail & Related papers (2022-02-14T16:38:31Z) - Human Activity Recognition models using Limited Consumer Device Sensors and Machine Learning [0.0]
Human activity recognition has grown in popularity with the increase of its applications in daily lifestyles and medical environments.
This paper presents the findings of different models trained using only sensor data from smartphones and smartwatches.
Results show promise for models trained strictly using limited sensor data collected from only smartphones and smartwatches coupled with traditional machine learning concepts and algorithms.
arXiv Detail & Related papers (2022-01-21T06:54:05Z) - Physical Activity Recognition by Utilising Smartphone Sensor Signals [0.0]
This study collected human activity data from 60 participants across two different days for a total of six activities recorded by gyroscope and accelerometer sensors in a modern smartphone.
The proposed approach achieved a classification accuracy of 98 percent in identifying four different activities.
arXiv Detail & Related papers (2022-01-20T09:58:52Z) - Incremental Learning Techniques for Online Human Activity Recognition [0.0]
We propose a human activity recognition (HAR) approach for the online prediction of physical movements.
We develop a HAR system containing monitoring software and a mobile application that collects accelerometer and gyroscope data.
Six incremental learning algorithms are employed and evaluated in this work and compared with several batch learning algorithms commonly used for developing offline HAR systems; a minimal sketch of the incremental-versus-batch contrast appears after this list.
arXiv Detail & Related papers (2021-09-20T11:33:09Z) - ZSTAD: Zero-Shot Temporal Activity Detection [107.63759089583382]
We propose a novel task setting called zero-shot temporal activity detection (ZSTAD), where activities that have never been seen in training can still be detected.
We design an end-to-end deep network based on R-C3D as the architecture for this solution.
Experiments on both the THUMOS14 and the Charades datasets show promising performance in terms of detecting unseen activities.
arXiv Detail & Related papers (2020-03-12T02:40:36Z)
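The incremental-learning entry above contrasts online and batch training; the sketch below illustrates that contrast with scikit-learn, where SGDClassifier's partial_fit updates the model one mini-batch at a time while a batch learner must see all the data at once. The feature dimensionality, class labels, and synthetic data are placeholders, not taken from any of the listed papers.

```python
# Minimal sketch of incremental (online) versus batch learning for HAR.
# Feature dimensionality, class labels, and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.svm import SVC

CLASSES = np.array([0, 1, 2])  # e.g. sitting, walking, running
rng = np.random.default_rng(0)

def sensor_stream(n_batches=10, batch_size=32, n_features=12):
    """Yield mini-batches of (features, labels), standing in for live sensor windows."""
    for _ in range(n_batches):
        X = rng.normal(size=(batch_size, n_features))
        y = rng.choice(CLASSES, size=batch_size)
        yield X, y

# Incremental learning: the model is updated as each mini-batch arrives, so it
# can run online on a phone without storing the full training set.
online_clf = SGDClassifier()
for X_batch, y_batch in sensor_stream():
    online_clf.partial_fit(X_batch, y_batch, classes=CLASSES)

# Batch learning: all data must be collected first, then the model is fit once.
X_all, y_all = zip(*sensor_stream())
batch_clf = SVC().fit(np.vstack(X_all), np.concatenate(y_all))
```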