Classifying Subjective Time Perception in a Multi-robot Control Scenario Using Eye-tracking Information
- URL: http://arxiv.org/abs/2504.06442v1
- Date: Tue, 08 Apr 2025 21:30:18 GMT
- Title: Classifying Subjective Time Perception in a Multi-robot Control Scenario Using Eye-tracking Information
- Authors: Till Aust, Julian Kaduk, Heiko Hamann
- Abstract summary: Accurately assessing an operator's mental state is critical for maintaining performance and well-being. We use subjective time perception as a sensitive, low-latency indicator of well-being and cognitive strain. We study how human physiological signals can be used to estimate a person's subjective time perception in a human-swarm interaction scenario.
- Score: 3.8916312075738273
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As automation and mobile robotics reshape work environments, rising expectations for productivity increase cognitive demands on human operators, leading to potential stress and cognitive overload. Accurately assessing an operator's mental state is critical for maintaining performance and well-being. We use subjective time perception, which can be altered by stress and cognitive load, as a sensitive, low-latency indicator of well-being and cognitive strain. Distortions in time perception can affect decision-making, reaction times, and overall task effectiveness, making it a valuable metric for adaptive human-swarm interaction systems. We study how human physiological signals can be used to estimate a person's subjective time perception, using a human-swarm interaction scenario as an example: a human operator needs to guide and control a swarm of small mobile robots. We obtain eye-tracking data that is classified for subjective time perception based on questionnaire data. Our results show that we successfully estimate a person's time perception from eye-tracking data, and that the approach profits from individual-based pretraining using only 30 seconds of data. In future work, we aim for robots that respond to human operator needs by automatically classifying physiological data in a closed control loop.
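The abstract describes classifying windows of eye-tracking data into subjective time-perception categories, with questionnaire responses as labels and a short per-operator adaptation phase. The following is a minimal, hypothetical sketch of such a pipeline in Python with scikit-learn; the feature set, window length, random-forest model, and retraining-based adaptation are illustrative assumptions, not the authors' published method.

```python
# Sketch only: classify subjective time perception from eye-tracking windows.
# Feature names, window sizes, labels, and the adaptation scheme are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

RNG = np.random.default_rng(0)

def extract_features(gaze_window: np.ndarray) -> np.ndarray:
    """Reduce one window of raw gaze samples (x, y, pupil diameter) to a
    small vector: per-channel means, standard deviations, and mean absolute
    gaze displacement (a crude proxy for saccadic activity)."""
    means = gaze_window.mean(axis=0)
    stds = gaze_window.std(axis=0)
    velocity = np.abs(np.diff(gaze_window[:, :2], axis=0)).mean()
    return np.concatenate([means, stds, [velocity]])

# Synthetic stand-in data: 200 windows of 60 gaze samples each, labeled
# 0 = "time felt slower", 1 = "time felt faster" (in the study, labels
# would come from questionnaire responses, not random draws).
windows = RNG.normal(size=(200, 60, 3))
labels = RNG.integers(0, 2, size=200)
X = np.array([extract_features(w) for w in windows])

# Population model trained on data from other participants ...
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:150], labels[:150])

# ... then retrained with a short slice of the new operator's data,
# standing in for the ~30 s individual-based pretraining the paper reports.
clf.fit(np.vstack([X[:150], X[150:170]]),
        np.concatenate([labels[:150], labels[150:170]]))

print("held-out accuracy:", accuracy_score(labels[170:], clf.predict(X[170:])))
```

With a real eye tracker the feature extraction would run on streamed gaze samples, and the retraining step stands in for whatever individual-based pretraining the paper actually uses; the accuracy printed here is meaningless on random synthetic labels.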
Related papers
- Auto Detecting Cognitive Events Using Machine Learning on Pupillary Data [0.0]
Pupil size is a valuable indicator of cognitive workload, reflecting changes in attention and arousal governed by the autonomic nervous system.
This study explores the potential of using machine learning to automatically detect cognitive events experienced by individuals.
arXiv Detail & Related papers (2024-10-18T04:54:46Z)
- Automatic Classification of Subjective Time Perception Using Multi-modal Physiological Data of Air Traffic Controllers [3.7423614135604093]
We aim to develop a device that modulates human subjective time perception.
In this study, we present a method to automatically assess the subjective time perception of air traffic controllers.
arXiv Detail & Related papers (2024-03-28T10:15:10Z)
- Improving Visual Perception of a Social Robot for Controlled and In-the-wild Human-robot Interaction [10.260966795508569]
It is unclear how the objective interaction performance and subjective user experience will be influenced when a social robot adopts a deep-learning-based visual perception model.
We employ state-of-the-art human perception and tracking models to improve the visual perception function of the Pepper robot.
arXiv Detail & Related papers (2024-03-04T06:47:06Z)
- Real-time Addressee Estimation: Deployment of a Deep-Learning Model on the iCub Robot [52.277579221741746]
Addressee Estimation is a skill essential for social robots to interact smoothly with humans.
Inspired by human perceptual skills, a deep-learning model for Addressee Estimation is designed, trained, and deployed on an iCub robot.
The study presents the procedure of such implementation and the performance of the model deployed in real-time human-robot interaction.
arXiv Detail & Related papers (2023-11-09T13:01:21Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- AAAI SSS-22 Symposium on Closing the Assessment Loop: Communicating Proficiency and Intent in Human-Robot Teaming [4.787322716745613]
How should a robot convey predicted ability on a new task?
How should a robot adapt its proficiency criteria based on human intentions and values?
There are no agreed-upon standards for evaluating proficiency and intent-based interactions.
arXiv Detail & Related papers (2022-04-05T18:28:01Z)
- The world seems different in a social context: a neural network analysis of human experimental data [57.729312306803955]
We show that it is possible to replicate human behavioral data in both individual and social task settings by modifying the precision of prior and sensory signals.
An analysis of the neural activation traces of the trained networks provides evidence that information is coded in fundamentally different ways in the network in the individual and in the social conditions.
arXiv Detail & Related papers (2022-03-03T17:19:12Z)
- Cognitive architecture aided by working-memory for self-supervised multi-modal humans recognition [54.749127627191655]
The ability to recognize human partners is an important social skill to build personalized and long-term human-robot interactions.
Deep learning networks have achieved state-of-the-art results and have proven to be suitable tools for addressing this task.
One solution is to make robots learn from their first-hand sensory data with self-supervision.
arXiv Detail & Related papers (2021-03-16T13:50:24Z)
- Careful with That! Observation of Human Movements to Estimate Objects Properties [106.925705883949]
We focus on the features of human motor actions that communicate insights on the weight of an object.
Our final goal is to enable a robot to autonomously infer the degree of care required in object handling.
arXiv Detail & Related papers (2021-03-02T08:14:56Z)
- AGENT: A Benchmark for Core Psychological Reasoning [60.35621718321559]
Intuitive psychology is the ability to reason about hidden mental variables that drive observable actions.
Despite recent interest in machine agents that reason about other agents, it is not clear if such agents learn or hold the core psychology principles that drive human reasoning.
We present a benchmark consisting of procedurally generated 3D animations, AGENT, structured around four scenarios.
arXiv Detail & Related papers (2021-02-24T14:58:23Z)