Evaluating the Possibility of Integrating Augmented Reality and Internet
of Things Technologies to Help Patients with Alzheimer's Disease
- URL: http://arxiv.org/abs/2301.08795v1
- Date: Fri, 20 Jan 2023 20:39:32 GMT
- Title: Evaluating the Possibility of Integrating Augmented Reality and Internet
of Things Technologies to Help Patients with Alzheimer's Disease
- Authors: Fatemeh Ghorbani, Mohammad Kia, Mehdi Delrobaei, Quazi Rahman
- Abstract summary: This study reports preliminary results on an Ambient Assisted Living (AAL) real-time system, achieved through the Internet of Things (IoT) and Augmented Reality (AR) concepts.
The system has two main sections: the smartphone or Windows application allows caregivers to monitor patients' status at home and be notified if patients are at risk.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: People suffering from Alzheimer's disease (AD) and their caregivers seek
different approaches to cope with memory loss. Although AD patients want to
live independently, they often need help from caregivers. In this situation,
caregivers may attach notes on every single object or take out the contents of
a drawer to make them visible before leaving the patient alone at home. This
study reports preliminary results on an Ambient Assisted Living (AAL) real-time
system, achieved through the Internet of Things (IoT) and Augmented Reality
(AR) concepts, aimed at helping people suffering from AD. The system has two
main sections: the smartphone or Windows application allows caregivers to
monitor patients' status at home and be notified if patients are at risk. The
second part allows patients to use smart glasses to recognize QR codes in the
environment and receive information related to tags in the form of audio, text,
or three-dimensional images. This work presents preliminary results and
investigates the possibility of implementing such a system.
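The patient-facing side described in the abstract (smart glasses scan a QR code and receive caregiver-authored content as audio, text, or a three-dimensional image) amounts to a tag-to-content lookup. A minimal sketch of that step is below; the tag identifiers, content records, and fallback behavior are all hypothetical illustrations, not details taken from the paper.

```python
# Hypothetical tag database the caregiver maintains; the keys stand in for
# decoded QR payloads, and each record names the modality the glasses
# would render (audio, text, or a 3D model).
TAG_DATABASE = {
    "kitchen-drawer-01": {"modality": "text",  "content": "Cutlery is in this drawer."},
    "front-door":        {"modality": "audio", "content": "reminder_lock_door.mp3"},
    "medicine-cabinet":  {"modality": "3d",    "content": "pillbox_model.glb"},
}

def resolve_tag(qr_payload: str) -> dict:
    """Return the content record for a scanned QR payload.

    Unknown tags fall back to a text prompt so the patient always
    gets feedback (an assumption, not specified in the paper).
    """
    return TAG_DATABASE.get(
        qr_payload,
        {"modality": "text", "content": "Unrecognized tag - ask your caregiver."},
    )

record = resolve_tag("front-door")
print(record["modality"], record["content"])
```

Keying content by tag identifier rather than embedding it in the QR code itself would let a caregiver update reminders remotely without reprinting the physical tags.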
Related papers
- The doctor will polygraph you now: ethical concerns with AI for fact-checking patients [0.0]
Clinical artificial intelligence (AI) methods have been proposed for predicting social behaviors which could be reasonably understood from patient-reported data.
This raises ethical concerns about respect, privacy, and patient awareness/control over how their health data is used.
arXiv Detail & Related papers (2024-08-15T02:55:30Z)
- Analyzing Participants' Engagement during Online Meetings Using Unsupervised Remote Photoplethysmography with Behavioral Features [50.82725748981231]
Engagement measurement finds applications in healthcare, education, and services.
Physiological and behavioral features can be used, but traditional physiological measurement is impractical because it requires contact sensors.
We demonstrate the feasibility of unsupervised remote photoplethysmography (rPPG) as an alternative to contact sensors.
arXiv Detail & Related papers (2024-04-05T20:39:16Z)
- MemoryCompanion: A Smart Healthcare Solution to Empower Efficient Alzheimer's Care Via Unleashing Generative AI [8.741075482543991]
This paper unveils MemoryCompanion, a pioneering digital health solution specifically tailored for Alzheimer's disease (AD) patients and their caregivers.
MemoryCompanion manifests a personalized caregiving paradigm, fostering interactions via voice-cloning and talking-face mechanisms.
Our methodology, grounded in its innovative design, addresses both the caregiving and technological challenges intrinsic to this domain.
arXiv Detail & Related papers (2023-11-20T19:41:50Z)
- AI enabled RPM for Mental Health Facility [8.26802516741755]
This paper discusses an AI-enabled RPM system framework built on RFID, a non-invasive digital technology, using its in-built NCS mechanism to retrieve patients' vital signs and physical actions.
Based on the retrieved time-series data, the system predicts patients' vital signs for the upcoming 3 hours and classifies their physical actions into 10 labelled physical activities.
This framework helps avoid unforeseen clinical emergencies by enabling precautionary measures and medical intervention at the right time.
arXiv Detail & Related papers (2023-01-20T23:47:16Z)
- What Do End-Users Really Want? Investigation of Human-Centered XAI for Mobile Health Apps [69.53730499849023]
We present a user-centered persona concept to evaluate explainable AI (XAI).
Results show that users' demographics and personality, as well as the type of explanation, impact explanation preferences.
Our insights bring an interactive, human-centered XAI closer to practical application.
arXiv Detail & Related papers (2022-10-07T12:51:27Z)
- Developing Medical AI: a cloud-native audio-visual data collection study [0.0]
This paper describes a protocol for an audio-visual data collection study, a cloud architecture for efficiently processing and consuming such data, and the design of a specific data collection device.
The goal of this paper is to improve early identification of deteriorating patients in the hospital.
arXiv Detail & Related papers (2021-08-17T18:01:12Z) - MIMO: Mutual Integration of Patient Journey and Medical Ontology for
Healthcare Representation Learning [49.57261599776167]
We propose an end-to-end robust Transformer-based solution, Mutual Integration of patient journey and Medical Ontology (MIMO) for healthcare representation learning and predictive analytics.
arXiv Detail & Related papers (2021-07-20T07:04:52Z) - MET: Multimodal Perception of Engagement for Telehealth [52.54282887530756]
We present MET, a learning-based algorithm for perceiving a human's level of engagement from videos.
We release a new dataset, MEDICA, for mental health patient engagement detection.
arXiv Detail & Related papers (2020-11-17T15:18:38Z) - AEGIS: A real-time multimodal augmented reality computer vision based
system to assist facial expression recognition for individuals with autism
spectrum disorder [93.0013343535411]
This paper presents the development of a multimodal augmented reality (AR) system which combines computer vision and deep convolutional neural networks (CNNs).
The proposed system, which we call AEGIS, is an assistive technology deployable on a variety of user devices including tablets, smartphones, video conference systems, or smartglasses.
We leverage both spatial and temporal information in order to provide an accurate expression prediction, which is then converted into its corresponding visualization and drawn on top of the original video frame.
arXiv Detail & Related papers (2020-10-22T17:20:38Z) - BiteNet: Bidirectional Temporal Encoder Network to Predict Medical
Outcomes [53.163089893876645]
We propose a novel self-attention mechanism that captures the contextual dependency and temporal relationships within a patient's healthcare journey.
An end-to-end bidirectional temporal encoder network (BiteNet) then learns representations of the patient's journeys.
We have evaluated the effectiveness of our methods on two supervised prediction and two unsupervised clustering tasks with a real-world EHR dataset.
arXiv Detail & Related papers (2020-09-24T00:42:36Z) - AutoCogniSys: IoT Assisted Context-Aware Automatic Cognitive Health
Assessment [2.7998963147546148]
AutoCogniSys is a context-aware automated cognitive health assessment system developed for older adults in their natural living environments.
AutoCogniSys achieves up to 93% accuracy in assessing the cognitive health of older adults.
arXiv Detail & Related papers (2020-03-17T01:44:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.