PainSeeker: An Automated Method for Assessing Pain in Rats Through
Facial Expressions
- URL: http://arxiv.org/abs/2311.03205v1
- Date: Mon, 6 Nov 2023 15:49:11 GMT
- Title: PainSeeker: An Automated Method for Assessing Pain in Rats Through
Facial Expressions
- Authors: Liu Liu, Guang Li, Dingfan Deng, Jinhua Yu, Yuan Zong
- Abstract summary: We present a dataset called RatsPain consisting of 1,138 facial images captured from six rats that underwent an orthodontic treatment operation.
We then proposed a novel deep learning method called PainSeeker for automatically assessing pain in rats via facial expressions.
PainSeeker aims to seek pain-related facial local regions that facilitate learning both pain discriminative and head pose robust features from facial expression images.
- Score: 14.003480681631226
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this letter, we aim to investigate whether laboratory rats' pain can be
automatically assessed through their facial expressions. To this end, we began
by presenting a publicly available dataset called RatsPain, consisting of 1,138
facial images captured from six rats that underwent an orthodontic treatment
operation. Each rat's facial images in RatsPain were carefully selected from
videos recorded either before or after the operation and well labeled by eight
annotators according to the Rat Grimace Scale (RGS). We then proposed a novel
deep learning method called PainSeeker for automatically assessing pain in rats
via facial expressions. PainSeeker aims to seek pain-related facial local
regions that facilitate learning both pain discriminative and head pose robust
features from facial expression images. To evaluate PainSeeker, we
conducted extensive experiments on the RatsPain dataset. The results
demonstrate the feasibility of assessing rats' pain from their facial
expressions and also verify the effectiveness of the proposed PainSeeker in
addressing this emerging but intriguing problem. The RatsPain dataset can be
freely obtained from https://github.com/xhzongyuan/RatsPain.
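The abstract describes PainSeeker as seeking pain-related facial local regions and pooling their features. The paper does not give implementation details here, so the following is only a minimal illustrative sketch of attention-weighted pooling over per-region features (all names, shapes, and parameters are hypothetical, not the authors' code):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def seek_and_pool(region_feats, w_attn):
    """Score each facial region, then pool features by attention weight.

    region_feats: (R, D) array, one D-dim feature per facial region
                  (e.g. ears, eyes, nose/cheeks, as in the Rat Grimace Scale).
    w_attn:       (D,) learned scoring vector (hypothetical parameter).
    Returns the attention weights over regions and the pooled descriptor.
    """
    scores = region_feats @ w_attn     # (R,) relevance score per region
    weights = softmax(scores)          # regions compete for attention
    pooled = weights @ region_feats    # (D,) attention-weighted feature sum
    return weights, pooled

rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 8))        # 5 facial regions, 8-dim features each
w = rng.normal(size=8)
weights, descriptor = seek_and_pool(feats, w)
```

In this sketch, regions whose features align with the scoring vector dominate the pooled descriptor, which is one simple way a model can emphasize pain-relevant regions while downweighting pose-sensitive ones.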
Related papers
- Automated facial recognition system using deep learning for pain
assessment in adults with cerebral palsy [0.5242869847419834]
Existing measures, relying on direct observation by caregivers, lack sensitivity and specificity.
Ten neural networks were trained on three pain image databases.
InceptionV3 exhibited promising performance on the CP-PAIN dataset.
arXiv Detail & Related papers (2024-01-22T17:55:16Z)
- A Generative Approach for Image Registration of Visible-Thermal (VT)
Cancer Faces [77.77475333490744]
We modernize the classic computer vision task of image registration by applying and modifying a generative alignment algorithm.
We demonstrate that the quality of thermal images produced in the downstream generative AI task of Visible-to-Thermal (V2T) image translation improves by up to 52.5%.
arXiv Detail & Related papers (2023-08-23T17:39:58Z)
- Pain Detection in Masked Faces during Procedural Sedation [0.0]
Pain monitoring is essential to the quality of care for patients undergoing a medical procedure with sedation.
Previous studies have shown the viability of computer vision methods in detecting pain in unoccluded faces.
This study has collected video data from masked faces of 14 patients undergoing procedures in an interventional radiology department.
arXiv Detail & Related papers (2022-11-12T15:55:33Z)
- Video-based estimation of pain indicators in dogs [2.7103996289794217]
We propose a novel video-based, two-stream deep neural network approach for this problem.
We extract and preprocess body keypoints, and compute features from both keypoints and the RGB representation over the video.
We present a unique video-based dog behavior dataset, collected by veterinary professionals, and annotated for presence of pain.
arXiv Detail & Related papers (2022-09-27T10:38:59Z)
- Intelligent Sight and Sound: A Chronic Cancer Pain Dataset [74.77784420691937]
This paper introduces the first chronic cancer pain dataset, collected as part of the Intelligent Sight and Sound (ISS) clinical trial.
The data collected to date consists of 29 patients, 509 smartphone videos, 189,999 frames, and self-reported affective and activity pain scores.
Early models using static images and multi-modal data to predict self-reported pain levels reveal significant gaps in the ability of current methods to predict pain.
arXiv Detail & Related papers (2022-04-07T22:14:37Z)
- Sharing Pain: Using Domain Transfer Between Pain Types for Recognition
of Sparse Pain Expressions in Horses [1.749935196721634]
Orthopedic disorders are a common cause for euthanasia among horses.
It is challenging to train a visual pain recognition method with video data depicting such pain.
We show that transferring features from a dataset of horses with acute nociceptive pain can aid the learning to recognize more complex orthopedic pain.
arXiv Detail & Related papers (2021-05-21T12:35:00Z)
- Non-contact Pain Recognition from Video Sequences with Remote
Physiological Measurements Prediction [53.03469655641418]
We present a novel multi-task learning framework which encodes both appearance changes and physiological cues in a non-contact manner for pain recognition.
We establish the state-of-the-art performance of non-contact pain recognition on publicly available pain databases.
arXiv Detail & Related papers (2021-05-18T20:47:45Z)
- Facial Expressions as a Vulnerability in Face Recognition [73.85525896663371]
This work explores facial expression bias as a security vulnerability of face recognition systems.
We present a comprehensive analysis of how facial expression bias impacts the performance of face recognition technologies.
arXiv Detail & Related papers (2020-11-17T18:12:41Z)
- From A Glance to "Gotcha": Interactive Facial Image Retrieval with
Progressive Relevance Feedback [72.29919762941029]
We propose an end-to-end framework to retrieve facial images with relevance feedback progressively provided by the witness.
With no need for any extra annotations, our model can be applied at the cost of only a small amount of response effort from the witness.
arXiv Detail & Related papers (2020-07-30T18:46:25Z)
- Pain Intensity Estimation from Mobile Video Using 2D and 3D Facial
Keypoints [1.6402428190800593]
Managing post-surgical pain is critical for successful surgical outcomes.
One of the challenges of pain management is accurately assessing the pain level of patients.
We introduce an approach that analyzes 2D and 3D facial keypoints of post-surgical patients to estimate their pain intensity level.
arXiv Detail & Related papers (2020-06-17T00:18:29Z)
- It's Written All Over Your Face: Full-Face Appearance-Based Gaze
Estimation [82.16380486281108]
We propose an appearance-based method that only takes the full face image as input.
Our method encodes the face image using a convolutional neural network with spatial weights applied on the feature maps.
We show that our full-face method significantly outperforms the state of the art for both 2D and 3D gaze estimation.
arXiv Detail & Related papers (2016-11-27T15:00:10Z)
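The last entry above describes a CNN with spatial weights applied on the feature maps. As a rough illustration only (not the paper's implementation; the weight map and shapes here are hypothetical), a learned spatial weight map can rescale each location of a convolutional feature map before pooling:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatially_weighted_pool(feat_map, w_spatial):
    """Weight each spatial location of CNN features, then average-pool.

    feat_map:  (C, H, W) convolutional feature maps of the full face.
    w_spatial: (H, W) learned logits (hypothetical parameter) scoring how
               informative each location is for the task (e.g. gaze).
    Returns a (C,) descriptor emphasizing informative face regions.
    """
    weights = sigmoid(w_spatial)       # (H, W) weights in (0, 1)
    weighted = feat_map * weights      # broadcast across all C channels
    return weighted.mean(axis=(1, 2))  # (C,) pooled feature vector

rng = np.random.default_rng(1)
fmap = rng.normal(size=(16, 7, 7))     # 16 channels on a 7x7 spatial grid
logits = rng.normal(size=(7, 7))
vec = spatially_weighted_pool(fmap, logits)
```

Locations with strongly negative logits are driven toward zero weight, so the pooled vector is dominated by the regions the weight map deems informative.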
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.