Classification of Safety Driver Attention During Autonomous Vehicle
Operation
- URL: http://arxiv.org/abs/2310.11608v1
- Date: Tue, 17 Oct 2023 22:04:42 GMT
- Title: Classification of Safety Driver Attention During Autonomous Vehicle
Operation
- Authors: Santiago Gerling Konrad, Julie Stephany Berrio, Mao Shan, Favio Masson
and Stewart Worrall
- Abstract summary: This paper introduces a dual-source approach integrating data from an infrared camera facing the vehicle operator and vehicle perception systems.
The proposed system effectively determines a metric for the attention levels of the vehicle operator, enabling interventions such as warnings or reducing autonomous functionality as appropriate.
- Score: 11.33083039877258
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite the continual advances in Advanced Driver Assistance Systems (ADAS)
and the development of high-level autonomous vehicles (AV), there is a general
consensus that for the short to medium term, there is a requirement for a human
supervisor to handle the edge cases that inevitably arise. Given this
requirement, it is essential that the state of the vehicle operator is
monitored to ensure they are contributing to the vehicle's safe operation. This
paper introduces a dual-source approach integrating data from an infrared
camera facing the vehicle operator and vehicle perception systems to produce a
metric for driver alertness in order to promote and ensure safe operator
behaviour. The infrared camera detects the driver's head, enabling the
calculation of head orientation, which is relevant as the head typically moves
according to the individual's focus of attention. By incorporating
environmental data from the perception system, it becomes possible to determine
whether the vehicle operator observes objects in the surroundings. Experiments
were conducted using data collected in Sydney, Australia, simulating AV
operations in an urban environment. Our results demonstrate that the proposed
system effectively determines a metric for the attention levels of the vehicle
operator, enabling interventions such as warnings or reducing autonomous
functionality as appropriate. This comprehensive solution shows promise in
contributing to ADAS and AVs' overall safety and efficiency in a real-world
setting.
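The core idea of the abstract — comparing the operator's head orientation against the bearings of objects reported by the vehicle's perception system — can be sketched as a simple angular-overlap check. This is a minimal illustration, not the paper's actual method: the function names, the attentional field-of-view threshold, and the "fraction of objects observed" metric are all assumptions for the sake of the example.

```python
import math

def angular_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two angles (radians)."""
    return abs((a - b + math.pi) % (2 * math.pi) - math.pi)

def attention_metric(head_yaws, object_bearings, fov=math.radians(30)):
    """Hypothetical attention score: the fraction of perceived objects
    that fall within an assumed attentional cone around the operator's
    head yaw at the frame they were detected.

    head_yaws       -- per-frame head yaw estimates from the IR camera (radians)
    object_bearings -- per-frame lists of bearings to perceived objects (radians)
    fov             -- half-angle of the assumed attentional cone
    """
    observed = total = 0
    for yaw, bearings in zip(head_yaws, object_bearings):
        for bearing in bearings:
            total += 1
            if angular_diff(yaw, bearing) <= fov:
                observed += 1
    return observed / total if total else 1.0

# Example: operator looks straight ahead (yaw 0) across two frames;
# one object appears at 10 degrees (inside the cone), one at 90 (outside).
score = attention_metric([0.0, 0.0],
                         [[math.radians(10)], [math.radians(90)]])
```

A score well below 1.0 over a sliding window could then trigger the interventions the paper describes, such as warnings or reduced autonomous functionality.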
Related papers
- MSight: An Edge-Cloud Infrastructure-based Perception System for
Connected Automated Vehicles [58.461077944514564]
This paper presents MSight, a cutting-edge roadside perception system specifically designed for automated vehicles.
MSight offers real-time vehicle detection, localization, tracking, and short-term trajectory prediction.
Evaluations underscore the system's capability to uphold lane-level accuracy with minimal latency.
arXiv Detail & Related papers (2023-10-08T21:32:30Z)
- Assessing Drivers' Situation Awareness in Semi-Autonomous Vehicles: ASP
based Characterisations of Driving Dynamics for Modelling Scene
Interpretation and Projection [0.0]
We present a framework to assess the driver's awareness of the situation and to provide human-centred assistance.
The framework is developed as a modular system within the Robot Operating System (ROS) with modules for sensing the environment and the driver state.
A particular focus of this paper is on an Answer Set Programming (ASP) based approach for modelling and reasoning about the driver's interpretation and projection of the scene.
arXiv Detail & Related papers (2023-08-30T09:07:49Z)
- Visual Saliency Detection in Advanced Driver Assistance Systems [7.455416595124159]
We present an intelligent system that combines a drowsiness detection system for drivers with a scene comprehension pipeline based on saliency.
We employ an innovative biosensor embedded on the car steering wheel to monitor the driver.
A dedicated 1D temporal deep convolutional network has been devised to classify the collected PPG time-series.
arXiv Detail & Related papers (2023-07-26T15:41:54Z)
- Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts,
Datasets and Metrics [77.34726150561087]
This work aims to carry out a study on the current scenario of camera and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z)
- Safety-aware Motion Prediction with Unseen Vehicles for Autonomous
Driving [104.32241082170044]
We study a new task, safety-aware motion prediction with unseen vehicles for autonomous driving.
Unlike the existing trajectory prediction task for seen vehicles, we aim at predicting an occupancy map.
Our approach is the first one that can predict the existence of unseen vehicles in most cases.
arXiv Detail & Related papers (2021-09-03T13:33:33Z)
- The Multimodal Driver Monitoring Database: A Naturalistic Corpus to
Study Driver Attention [44.94118128276982]
A smart vehicle should be able to monitor the actions and behaviors of the human driver to provide critical warnings or intervene when necessary.
Recent advancements in deep learning and computer vision have shown great promise in monitoring human behaviors and activities.
A vast amount of in-domain data is required to train models that provide high performance in predicting driving related tasks.
arXiv Detail & Related papers (2020-12-23T16:37:17Z)
- Driver Drowsiness Classification Based on Eye Blink and Head Movement
Features Using the k-NN Algorithm [8.356765961526955]
This work extends driver drowsiness detection in vehicles using signals from a driver monitoring camera.
For this purpose, 35 features related to the driver's eye blinking behavior and head movements are extracted in driving simulator experiments.
A concluding analysis of the best performing feature sets yields valuable insights about the influence of drowsiness on the driver's blink behavior and head movements.
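The k-NN classification described above can be illustrated with a small from-scratch sketch. This is not the paper's implementation: the two toy features (the real system extracts 35), the labels, and the training points are all invented for the example; only the majority-vote-over-nearest-neighbours logic reflects the k-NN algorithm the title names.

```python
import math
from collections import Counter

def knn_classify(sample, training_data, k=3):
    """Classify a feature vector by majority vote among its k nearest
    neighbours under Euclidean distance.

    training_data -- list of (feature_vector, label) pairs
    """
    neighbours = sorted(
        training_data,
        key=lambda item: math.dist(sample, item[0]),
    )[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical features: (blink duration in s, head-nod frequency per minute).
train = [
    ((0.10, 0.5), "alert"),
    ((0.12, 0.8), "alert"),
    ((0.35, 3.0), "drowsy"),
    ((0.40, 2.5), "drowsy"),
    ((0.15, 1.0), "alert"),
]
label = knn_classify((0.38, 2.8), train, k=3)
```

With these toy points, a sample with long blinks and frequent head nods lands among the "drowsy" neighbours.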
arXiv Detail & Related papers (2020-09-28T12:37:38Z)
- A Survey and Tutorial of EEG-Based Brain Monitoring for Driver State
Analysis [164.93739293097605]
EEG is proven to be one of the most effective methods for driver state monitoring and human error detection.
This paper discusses EEG-based driver state detection systems and their corresponding analysis algorithms over the last three decades.
It is concluded that the current EEG-based driver state monitoring algorithms are promising for safety applications.
arXiv Detail & Related papers (2020-08-25T18:21:35Z)
- Driver Intention Anticipation Based on In-Cabin and Driving Scene
Monitoring [52.557003792696484]
We present a framework for the detection of the drivers' intention based on both in-cabin and traffic scene videos.
Our framework achieves a prediction accuracy of 83.98% and an F1-score of 84.3%.
arXiv Detail & Related papers (2020-06-20T11:56:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.