Your Day in Your Pocket: Complex Activity Recognition from Smartphone
Accelerometers
- URL: http://arxiv.org/abs/2301.06993v1
- Date: Tue, 17 Jan 2023 16:22:30 GMT
- Title: Your Day in Your Pocket: Complex Activity Recognition from Smartphone
Accelerometers
- Authors: Emma Bouton-Bessac, Lakmal Meegahapola, Daniel Gatica-Perez
- Abstract summary: This paper investigates the recognition of complex activities exclusively using smartphone accelerometer data.
We used a large smartphone sensing dataset collected from over 600 users in five countries during the pandemic.
Deep-learning-based binary classification of eight complex activities can be achieved with AUROC scores of up to 0.76 using partially personalized models.
- Score: 7.335712499936904
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human Activity Recognition (HAR) enables context-aware user
experiences in which mobile apps can adapt content and interactions to user
activities. Smartphones have therefore become valuable for HAR, as they allow
large-scale and diversified data collection. Although previous work in HAR has
detected simple activities (e.g., sitting, walking, running) with good accuracy
using inertial sensors (e.g., the accelerometer), the recognition of complex
daily activities remains an open problem, especially in remote work/study
settings where people are more sedentary. Moreover, understanding a person's
everyday activities can inform the creation of applications that aim to support
their well-being. This paper investigates the recognition of complex activities
using smartphone accelerometer data exclusively. We used a large smartphone
sensing dataset collected from over 600 users in five countries during the
pandemic and showed that deep-learning-based binary classification of eight
complex activities (sleeping, eating, watching videos, online communication,
attending a lecture, sports, shopping, studying) can be achieved with AUROC
scores of up to 0.76 using partially personalized models. These results are an
encouraging sign toward assessing complex activities using only phone
accelerometer data in the post-pandemic world.
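To make the setup concrete, below is a minimal sketch of the kind of pipeline the abstract describes: a small deep network that classifies fixed-length windows of 3-axis accelerometer data for one target activity, scored with AUROC. The window length, sampling rate, architecture, and training loop are illustrative assumptions, not the authors' exact configuration, and the data here is a synthetic stand-in.

```python
# Minimal sketch (not the authors' exact pipeline): binary classification of
# one complex activity from windowed 3-axis accelerometer data, scored with
# AUROC. Window length, architecture, and hyperparameters are assumptions.
import torch
import torch.nn as nn
from sklearn.metrics import roc_auc_score

class AccelCNN(nn.Module):
    """1D CNN over a (batch, 3, T) accelerometer window -> one activity logit."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(3, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, 1),  # one logit: target activity vs. rest
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

# Synthetic stand-in data: 256 windows of 10 s at 30 Hz (assumed), binary labels.
X = torch.randn(256, 3, 300)
y = torch.randint(0, 2, (256,)).float()

model = AccelCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
for _ in range(5):                      # a few full-batch epochs for illustration
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

with torch.no_grad():
    scores = torch.sigmoid(model(X)).numpy()
print("AUROC:", roc_auc_score(y.numpy(), scores))
```

In the paper's setup this would presumably be repeated one-vs-rest for each of the eight complex activities, with the partially personalized variants additionally fine-tuned on a held-in portion of the target user's data; that personalization step is omitted from the sketch.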
Related papers
- Complex Daily Activities, Country-Level Diversity, and Smartphone Sensing: A Study in Denmark, Italy, Mongolia, Paraguay, and UK [6.52702503779308]
Smartphones enable understanding human behavior through activity recognition, which can support people's daily lives.
People are more sedentary in the post-pandemic world with the prevalence of remote/hybrid work/study settings.
We analyzed in-the-wild smartphone data and over 216K self-reports from 637 college students in five countries.
arXiv Detail & Related papers (2023-02-16T21:34:55Z)
- Towards Continual Egocentric Activity Recognition: A Multi-modal Egocentric Activity Dataset for Continual Learning [21.68009790164824]
We present a multi-modal egocentric activity dataset for continual learning named UESTC-MMEA-CL.
It contains synchronized data of videos, accelerometers, and gyroscopes, for 32 types of daily activities, performed by 10 participants.
Results of egocentric activity recognition are reported when using separately, and jointly, three modalities: RGB, acceleration, and gyroscope.
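As a hedged illustration of what joint use of the three modalities could look like, the sketch below fuses per-modality embeddings by simple late fusion (concatenation). The encoders, feature dimensions, and sequence lengths are assumptions made for illustration; this is not the dataset's published baseline.

```python
# Hypothetical late-fusion sketch for joint RGB + accelerometer + gyroscope
# activity recognition over 32 classes. Encoders and dimensions are assumed.
import torch
import torch.nn as nn

class LateFusion(nn.Module):
    def __init__(self, n_classes=32):
        super().__init__()
        # Per-modality encoders (stand-ins; a real RGB branch would be a video CNN).
        self.rgb = nn.Sequential(nn.Linear(512, 128), nn.ReLU())
        self.acc = nn.GRU(3, 64, batch_first=True)
        self.gyr = nn.GRU(3, 64, batch_first=True)
        self.head = nn.Linear(128 + 64 + 64, n_classes)

    def forward(self, rgb_feat, acc_seq, gyr_seq):
        r = self.rgb(rgb_feat)        # (B, 128) from precomputed RGB features
        _, ha = self.acc(acc_seq)     # final hidden state: (1, B, 64)
        _, hg = self.gyr(gyr_seq)
        return self.head(torch.cat([r, ha[-1], hg[-1]], dim=1))

model = LateFusion()
logits = model(torch.randn(4, 512),        # assumed 512-d RGB clip features
               torch.randn(4, 100, 3),     # accelerometer sequence
               torch.randn(4, 100, 3))     # gyroscope sequence
print(logits.shape)                        # torch.Size([4, 32])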
arXiv Detail & Related papers (2023-01-26T04:32:00Z)
- FLAG3D: A 3D Fitness Activity Dataset with Language Instruction [89.60371681477791]
We present FLAG3D, a large-scale 3D fitness activity dataset with language instruction containing 180K sequences of 60 categories.
We show that FLAG3D contributes great research value for various challenges, such as cross-domain human action recognition, dynamic human mesh recovery, and language-guided human action generation.
arXiv Detail & Related papers (2022-12-09T02:33:33Z)
- Information We Can Extract About a User From 'One Minute Mobile Application Usage' [0.0]
In this paper, we recognized different human activities using the accelerometer, magnetometer, and gyroscope sensors of Android smartphones.
While subjects used different social media applications, such as Facebook, Instagram, WhatsApp, and Twitter, we collected the raw sensor values along with the attributes of 29 subjects.
We extract features from the raw signals and use them to perform classification using different machine learning (ML) algorithms.
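A minimal sketch of that generic recipe (windowed statistical features followed by a standard classifier) is shown below; the specific features, window size, label set, and random-forest choice are assumptions, not necessarily the paper's exact setup.

```python
# Illustrative feature-extraction + classification sketch: simple statistics
# per 9-channel sensor window (accel + magnetometer + gyro, 3 axes each),
# then a standard scikit-learn classifier. Synthetic stand-in data throughout.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def window_features(w):
    """w: (T, 9) sensor window -> concatenated mean/std/min/max per channel."""
    return np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)])

rng = np.random.default_rng(0)
windows = rng.normal(size=(500, 50, 9))   # synthetic windows (assumed shape)
labels = rng.integers(0, 4, size=500)     # e.g., 4 activity classes (assumed)

X = np.stack([window_features(w) for w in windows])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

Swapping in other classifiers (SVM, k-NN, gradient boosting) is a one-line change, which is the usual way such papers compare "different ML algorithms".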
arXiv Detail & Related papers (2022-07-27T00:23:11Z)
- Human Activity Recognition on Time Series Accelerometer Sensor Data using LSTM Recurrent Neural Networks [0.2294014185517203]
In this study, we focus on the use of smartwatch accelerometer sensors to recognize eating activity.
We collected sensor data from 10 participants while they consumed pizza.
We developed an LSTM-ANN architecture that demonstrated 90% success in distinguishing individual bites from puffing, medication-taking, and jogging activities.
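A rough sketch of an LSTM classifier over smartwatch accelerometer windows in this spirit is shown below; the layer sizes, window length, and four-class setup (bite, puff, medication-taking, jogging) follow the summary above, but the exact LSTM-ANN architecture is an assumption.

```python
# Sketch of an LSTM sequence classifier for 3-axis accelerometer windows.
# Hidden size and head are assumptions, not the paper's exact LSTM-ANN.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, n_classes=4):  # bite / puff / medication / jogging
        super().__init__()
        self.lstm = nn.LSTM(input_size=3, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, n_classes)  # dense "ANN" head on the last state

    def forward(self, x):                   # x: (batch, time, 3)
        _, (h, _) = self.lstm(x)
        return self.fc(h[-1])

model = LSTMClassifier()
logits = model(torch.randn(8, 100, 3))      # 8 windows of 100 samples (assumed)
print(logits.shape)                         # torch.Size([8, 4])
```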
arXiv Detail & Related papers (2022-06-03T19:24:20Z)
- Mobile Behavioral Biometrics for Passive Authentication [65.94403066225384]
This work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits.
Experiments are performed over HuMIdb, one of the largest and most comprehensive freely available mobile user interaction databases.
In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke.
arXiv Detail & Related papers (2022-03-14T17:05:59Z)
- Physical Activity Recognition by Utilising Smartphone Sensor Signals [0.0]
This study collected human activity data from 60 participants across two different days for a total of six activities recorded by gyroscope and accelerometer sensors in a modern smartphone.
The proposed approach achieved a classification accuracy of 98 percent in identifying four different activities.
arXiv Detail & Related papers (2022-01-20T09:58:52Z)
- The Second-Level Smartphone Divide: A Typology of Smartphone Usage Based on Frequency of Use, Skills, and Types of Activities [77.34726150561087]
This paper examines inequalities in the usage of smartphone technology based on five samples of smartphone owners collected in Germany and Austria between 2016 and 2020.
We identify six distinct types of smartphone users by conducting latent class analyses that classify individuals based on their frequency of smartphone use, self-rated smartphone skills, and activities carried out on their smartphone.
arXiv Detail & Related papers (2021-11-09T13:38:59Z)
- Tutorial on Deep Learning for Human Activity Recognition [70.94062293989832]
This tutorial was first held at the 2021 ACM International Symposium on Wearable Computers (ISWC'21).
It provides a hands-on and interactive walk-through of the most important steps in the data pipeline for the deep learning of human activities.
arXiv Detail & Related papers (2021-10-13T12:01:02Z)
- TapNet: The Design, Training, Implementation, and Applications of a Multi-Task Learning CNN for Off-Screen Mobile Input [75.05709030478073]
We present the design, training, implementation, and applications of TapNet, a multi-task network that detects tapping on the smartphone.
TapNet can jointly learn from data across devices and simultaneously recognize multiple tap properties, including tap direction and tap location.
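The multi-task idea can be sketched as a shared encoder over short IMU snippets with one classification head for tap direction and one regression head for tap location. The architecture, the six-channel accel+gyro input, the four direction classes, and the loss weighting below are all assumptions for illustration, not TapNet's actual design.

```python
# Hedged multi-task sketch: shared 1D-CNN encoder over (B, 6, T) IMU snippets,
# with a tap-direction classification head and a tap-location regression head.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskTapNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(       # shared encoder (assumed)
            nn.Conv1d(6, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.direction = nn.Linear(32, 4)    # e.g., 4 tap directions (assumed)
        self.location = nn.Linear(32, 2)     # (x, y) tap coordinates

    def forward(self, x):
        z = self.backbone(x)
        return self.direction(z), self.location(z)

model = MultiTaskTapNet()
x = torch.randn(8, 6, 128)                   # IMU snippets around a tap (assumed)
dir_logits, loc = model(x)
# Joint loss = classification + weighted regression (weight is an assumption).
loss = F.cross_entropy(dir_logits, torch.randint(0, 4, (8,))) \
     + 0.5 * F.mse_loss(loc, torch.randn(8, 2))
print(float(loss))
```

Sharing the backbone is what lets such a model "jointly learn from data across devices" while the per-task heads keep the outputs independent.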
arXiv Detail & Related papers (2021-02-18T00:45:41Z)
- What Can You Learn from Your Muscles? Learning Visual Representation from Human Interactions [50.435861435121915]
We use human interaction and attention cues to investigate whether we can learn better representations than visual-only ones.
Our experiments show that our "muscly-supervised" representation outperforms MoCo, a visual-only state-of-the-art method.
arXiv Detail & Related papers (2020-10-16T17:46:53Z)