End-to-End Machine Learning Framework for Facial AU Detection in
Intensive Care Units
- URL: http://arxiv.org/abs/2211.06570v1
- Date: Sat, 12 Nov 2022 04:43:16 GMT
- Title: End-to-End Machine Learning Framework for Facial AU Detection in
Intensive Care Units
- Authors: Subhash Nerella, Kia Khezeli, Andrea Davidson, Patrick Tighe, Azra
Bihorac, Parisa Rashidi
- Abstract summary: Pain is a common occurrence among patients admitted to Intensive Care Units.
Current manual observation-based pain assessment tools are limited by the frequency at which pain observations can be administered and are subject to observer bias.
We present our Pain-ICU dataset, the largest dataset available targeting facial behavior analysis in the dynamic ICU environment.
- Score: 3.8168092489216385
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Pain is a common occurrence among patients admitted to Intensive Care Units.
Pain assessment in ICU patients remains a challenge for clinicians and
ICU staff, particularly for non-verbal, sedated, mechanically ventilated, and
intubated patients. Current manual observation-based pain assessment tools are
limited by the frequency at which pain observations can be administered and are
subject to observer bias. Facial behavior is a major component in
observation-based tools. Furthermore, previous literature shows the feasibility
of painful facial expression detection using facial action units (AUs).
However, these approaches are limited to controlled or semi-controlled
environments and have never been validated in clinical settings. In this study,
we present our Pain-ICU dataset, the largest dataset available targeting facial
behavior analysis in the dynamic ICU environment. Our dataset comprises 76,388
patient facial image frames annotated with AUs obtained from 49 adult patients
admitted to ICUs at the University of Florida Health Shands Hospital. In this
work, we evaluated two vision transformer models, namely ViT and SWIN, for AU
detection on our Pain-ICU dataset and also external datasets. We developed a
completely end-to-end AU detection pipeline with the objective of performing
real-time AU detection in the ICU. The SWIN transformer Base variant achieved an
F1-score of 0.88 and an accuracy of 0.85 on the held-out test partition of the
Pain-ICU dataset.
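At a high level, the pipeline described above amounts to a Swin/ViT backbone with a multi-label head trained against binary AU targets. The sketch below illustrates this under stated assumptions: the backbone name (timm's swin_base_patch4_window7_224), the number of AUs (12), the 0.5 decision threshold, and macro-averaged F1 are illustrative choices, not the authors' exact configuration.

```python
# Hedged sketch: multi-label facial AU detection with a Swin-Base backbone.
# Assumptions (not taken from the paper): timm's "swin_base_patch4_window7_224",
# 12 target AUs, a 0.5 decision threshold, macro-averaged F1.
import torch
import torch.nn as nn
import timm
from sklearn.metrics import f1_score, accuracy_score

NUM_AUS = 12  # assumed number of facial action units

# Swin-Base backbone with a multi-label classification head (one logit per AU).
model = timm.create_model("swin_base_patch4_window7_224",
                          pretrained=True, num_classes=NUM_AUS)
criterion = nn.BCEWithLogitsLoss()  # each AU is an independent binary target


def training_loss(images: torch.Tensor, au_labels: torch.Tensor) -> torch.Tensor:
    """images: (B, 3, 224, 224); au_labels: (B, NUM_AUS) with entries in {0, 1}."""
    logits = model(images)
    return criterion(logits, au_labels.float())


@torch.no_grad()
def evaluate(images: torch.Tensor, au_labels: torch.Tensor, threshold: float = 0.5):
    """Returns (macro F1, element-wise accuracy) over a batch of frames."""
    preds = (torch.sigmoid(model(images)) >= threshold).int().cpu().numpy()
    labels = au_labels.int().cpu().numpy()
    return (f1_score(labels, preds, average="macro", zero_division=0),
            accuracy_score(labels.ravel(), preds.ravel()))
```

An end-to-end deployment would also include upstream face detection and cropping of patient frames, which this sketch omits.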
Related papers
- Leveraging Computer Vision in the Intensive Care Unit (ICU) for Examining Visitation and Mobility [12.347067736902094]
We leverage a state-of-the-art noninvasive computer vision system based on depth imaging to characterize ICU visitations and patients' mobility.
We found that deteriorating patient acuity and the incidence of delirium were associated with increased visitations.
Our findings highlight the feasibility and potential of using noninvasive autonomous systems to monitor ICU patients.
arXiv Detail & Related papers (2024-03-10T21:43:47Z)
- Detecting Visual Cues in the Intensive Care Unit and Association with Patient Clinical Status [0.9867627975175174]
Existing patient assessments in the ICU are mostly sporadic and administered manually.
We developed a new "masked loss computation" technique that addresses the data imbalance problem (see the sketch after this list for one possible formulation).
We performed AU inference on 634,054 frames to evaluate the association between facial AUs and clinically important patient conditions.
arXiv Detail & Related papers (2023-11-01T15:07:03Z)
- AI-Enhanced Intensive Care Unit: Revolutionizing Patient Care with Pervasive Sensing [2.8688584757794064]
The intensive care unit (ICU) is a specialized hospital space where critically ill patients receive intensive care and monitoring.
Comprehensive monitoring is imperative for assessing patients' conditions, in particular acuity, and ultimately the quality of care.
Currently, visual assessments for acuity, including fine details such as facial expressions, posture, and mobility, are sporadically captured, or not captured at all.
arXiv Detail & Related papers (2023-03-11T00:25:55Z)
- Pain level and pain-related behaviour classification using GRU-based sparsely-connected RNNs [61.080598804629375]
People with chronic pain unconsciously adapt specific body movements to protect themselves from injury or additional pain.
Because there is no dedicated benchmark database to analyse this correlation, we considered one of the specific circumstances that potentially influence a person's biometrics during daily activities.
We proposed an ensemble of sparsely-connected recurrent neural networks (s-RNNs) with gated recurrent units (GRUs) that incorporates multiple autoencoders.
We conducted several experiments which indicate that the proposed method outperforms the state-of-the-art approaches in classifying both pain level and pain-related behaviour.
arXiv Detail & Related papers (2022-12-20T12:56:28Z)
- Pain Detection in Masked Faces during Procedural Sedation [0.0]
Pain monitoring is essential to the quality of care for patients undergoing a medical procedure with sedation.
Previous studies have shown the viability of computer vision methods in detecting pain in unoccluded faces.
This study has collected video data from masked faces of 14 patients undergoing procedures in an interventional radiology department.
arXiv Detail & Related papers (2022-11-12T15:55:33Z)
- Dissecting Self-Supervised Learning Methods for Surgical Computer Vision [51.370873913181605]
Self-Supervised Learning (SSL) methods have begun to gain traction in the general computer vision community.
The effectiveness of SSL methods in more complex and impactful domains, such as medicine and surgery, remains largely unexplored.
We present an extensive analysis of the performance of these methods on the Cholec80 dataset for two fundamental and popular tasks in surgical context understanding, phase recognition and tool presence detection.
arXiv Detail & Related papers (2022-07-01T14:17:11Z)
- Intelligent Sight and Sound: A Chronic Cancer Pain Dataset [74.77784420691937]
This paper introduces the first chronic cancer pain dataset, collected as part of the Intelligent Sight and Sound (ISS) clinical trial.
The data collected to date consists of 29 patients, 509 smartphone videos, 189,999 frames, and self-reported affective and activity pain scores.
Using static images and multi-modal data to predict self-reported pain levels, early models reveal significant gaps in the methods currently available for predicting pain.
arXiv Detail & Related papers (2022-04-07T22:14:37Z)
- Predicting Patient Readmission Risk from Medical Text via Knowledge Graph Enhanced Multiview Graph Convolution [67.72545656557858]
We propose a new method that uses medical text of Electronic Health Records for prediction.
We represent discharge summaries of patients with multiview graphs enhanced by an external knowledge graph.
Experimental results prove the effectiveness of our method, yielding state-of-the-art performance.
arXiv Detail & Related papers (2021-12-19T01:45:57Z)
- COVIDNet-CT: A Tailored Deep Convolutional Neural Network Design for Detection of COVID-19 Cases from Chest CT Images [75.74756992992147]
We introduce COVIDNet-CT, a deep convolutional neural network architecture that is tailored for detection of COVID-19 cases from chest CT images.
We also introduce COVIDx-CT, a benchmark CT image dataset derived from CT imaging data collected by the China National Center for Bioinformation.
arXiv Detail & Related papers (2020-09-08T15:49:55Z)
- Integrative Analysis for COVID-19 Patient Outcome Prediction [53.11258640541513]
We combine radiomics of lung opacities and non-imaging features from demographic data, vital signs, and laboratory findings to predict the need for intensive care unit admission.
Our methods may also be applied to other lung diseases including but not limited to community acquired pneumonia.
arXiv Detail & Related papers (2020-07-20T19:08:50Z)
- Facial Action Unit Detection on ICU Data for Pain Assessment [1.8352113484137622]
Current pain assessment methods rely on patient self-report or on observation by caregivers such as Intensive Care Unit (ICU) nurses.
In this study, we show the need for an automated pain assessment system trained on real-world ICU data to achieve clinically acceptable performance.
arXiv Detail & Related papers (2020-04-24T17:12:56Z)
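One of the related entries above (Detecting Visual Cues in the Intensive Care Unit) mentions a "masked loss computation" technique for handling label imbalance. A common way to realize a masked loss is to let a binary mask select which label entries contribute to the objective; the sketch below is an illustrative interpretation under that assumption, not the cited paper's exact formulation.

```python
# Hedged sketch of a "masked loss computation" for multi-label AU training:
# the per-element BCE loss is computed only where a mask marks the label as
# valid/selected, so missing or down-weighted entries do not dominate training.
# Illustrative interpretation only, not the cited paper's exact method.
import torch
import torch.nn.functional as F

def masked_bce_loss(logits: torch.Tensor,
                    targets: torch.Tensor,
                    mask: torch.Tensor) -> torch.Tensor:
    """logits, targets, mask all have shape (B, num_aus); mask is 1 where the
    label should contribute to the loss and 0 where it should be ignored."""
    per_element = F.binary_cross_entropy_with_logits(
        logits, targets.float(), reduction="none"
    )
    masked = per_element * mask
    # Average over contributing entries only (guard against an all-zero mask).
    return masked.sum() / mask.sum().clamp(min=1.0)

# Example usage with random data: entries flagged 0 in the mask are ignored.
logits = torch.randn(4, 12)
targets = torch.randint(0, 2, (4, 12))
mask = torch.randint(0, 2, (4, 12)).float()
loss = masked_bce_loss(logits, targets, mask)
```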