Behavioural gap assessment of human-vehicle interaction in real and virtual reality-based scenarios in autonomous driving
- URL: http://arxiv.org/abs/2407.04070v1
- Date: Thu, 4 Jul 2024 17:20:17 GMT
- Title: Behavioural gap assessment of human-vehicle interaction in real and virtual reality-based scenarios in autonomous driving
- Authors: Sergio Martín Serrano, Rubén Izquierdo, Iván García Daza, Miguel Ángel Sotelo, D. Fernández Llorca
- Abstract summary: We present a first, innovative approach to evaluating what we term the behavioural gap: the disparity in a participant's conduct when engaging in a VR experiment compared to an equivalent real-world situation.
In the experiment, the pedestrian attempts to cross the road in the presence of different driving styles and an external Human-Machine Interface (eHMI).
Results show that participants are more cautious and curious in VR, affecting their speed and decisions, and that VR interfaces significantly influence their actions.
- Score: 7.588679613436823
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the field of autonomous driving research, the use of immersive virtual reality (VR) techniques is widespread to enable a variety of studies under safe and controlled conditions. However, this methodology is only valid and consistent if the conduct of participants in the simulated setting mirrors their actions in an actual environment. In this paper, we present a first, innovative approach to evaluating what we term the behavioural gap, a concept that captures the disparity in a participant's conduct when engaging in a VR experiment compared to an equivalent real-world situation. To this end, we developed a digital twin of a pre-existing crosswalk and carried out a field experiment (N=18) to investigate pedestrian-autonomous vehicle interaction in both real and simulated driving conditions. In the experiment, the pedestrian attempts to cross the road in the presence of different driving styles and an external Human-Machine Interface (eHMI). By combining survey-based and behavioural analysis methodologies, we develop a quantitative approach to empirically assess the behavioural gap, as a mechanism to validate data obtained from real subjects interacting in a simulated VR-based environment.  Results show that participants are more cautious and curious in VR, affecting their speed and decisions, and that VR interfaces significantly influence their actions.
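Since each of the N=18 participants experiences both the real crosswalk and its digital twin, the gap on any single behavioural measure can be framed as a paired within-subject comparison. The sketch below illustrates that framing; the chosen metric (mean crossing speed), the sample data, and the use of a paired t-test with an effect size are assumptions for illustration, not the authors' published pipeline.

```python
# Hypothetical sketch, not the authors' published code: quantifying the
# behavioural gap on one measure as a paired within-subject comparison,
# since every participant is observed in both the real and the VR condition.
import numpy as np
from scipy import stats

def behavioural_gap(real: np.ndarray, vr: np.ndarray) -> dict:
    """Compare one behavioural measure (e.g. mean crossing speed in m/s)
    across conditions; real[i] and vr[i] belong to the same participant."""
    diff = vr - real
    t_stat, p_value = stats.ttest_rel(vr, real)   # paired t-test across participants
    cohens_d = diff.mean() / diff.std(ddof=1)     # standardized size of the gap
    return {"mean_gap": float(diff.mean()), "t": float(t_stat),
            "p": float(p_value), "cohens_d": float(cohens_d)}

# Made-up crossing speeds (m/s) for six participants, slower in VR, as the
# paper's "more cautious in VR" finding would suggest:
real_speed = np.array([1.42, 1.35, 1.50, 1.28, 1.44, 1.39])
vr_speed = np.array([1.21, 1.18, 1.33, 1.10, 1.25, 1.22])
print(behavioural_gap(real_speed, vr_speed))
```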
Related papers
- Analyze Drivers' Intervention Behavior During Autonomous Driving -- A VR-incorporated Approach [2.7532019227694344]
This work sheds light on human drivers' intervention behavior during the operation of autonomous vehicles.
Experiment environments were implemented in which virtual reality (VR) and traffic micro-simulation are integrated.
Performance indicators such as the probability of intervention and the accident rate are defined and used to quantify and compare risk levels.
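As a rough illustration of how such indicators could be computed from per-trial experiment logs (the log schema below is an assumption, not taken from the paper):

```python
# Illustrative sketch only: computing the indicators named above from
# per-trial logs. The Trial schema is a hypothetical, assumed format.
from dataclasses import dataclass

@dataclass
class Trial:
    intervened: bool  # did the human driver take over during this trial?
    accident: bool    # did the trial end in a collision?

def risk_indicators(trials: list[Trial]) -> dict[str, float]:
    n = len(trials)
    return {
        "p_intervention": sum(t.intervened for t in trials) / n,
        "accident_rate": sum(t.accident for t in trials) / n,
    }

trials = [Trial(True, False), Trial(False, False),
          Trial(True, True), Trial(False, False)]
print(risk_indicators(trials))  # {'p_intervention': 0.5, 'accident_rate': 0.25}
```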
arXiv Detail & Related papers (2023-12-04T06:36:57Z)
- How Simulation Helps Autonomous Driving: A Survey of Sim2real, Digital Twins, and Parallel Intelligence [16.24370001383615]
Adapting driving knowledge learned in simulation to reality is a critical issue.
The virtual simulation world differs from the real world in many aspects, such as lighting, textures, vehicle dynamics, and agents' behaviors.
The survey covers three categories of approaches to the reality gap: transferring knowledge from simulation to reality (sim2real), learning in digital twins (DTs), and learning by parallel intelligence (PI).
arXiv Detail & Related papers (2023-05-02T09:00:32Z)
- Generative AI-empowered Simulation for Autonomous Driving in Vehicular Mixed Reality Metaverses [130.15554653948897]
In the vehicular mixed reality (MR) Metaverse, the distance between physical and virtual entities can be overcome.
Large-scale traffic and driving simulation via realistic data collection and fusion from the physical world is difficult and costly.
We propose an autonomous driving architecture, where generative AI is leveraged to synthesize unlimited conditioned traffic and driving data in simulations.
arXiv Detail & Related papers (2023-02-16T16:54:10Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision that our findings will push research towards more realistic physicality in future VR/AR.
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Mind the Gap! A Study on the Transferability of Virtual vs Physical-world Testing of Autonomous Driving Systems [6.649715954440713]
We leverage the Donkey Car open-source framework to empirically compare testing of self-driving cars (SDCs) deployed on a physical small-scale vehicle vs. its virtual simulated counterpart.
While a large number of testing results do transfer between virtual and physical environments, we also identified critical shortcomings that contribute to the reality gap between the virtual and physical world.
arXiv Detail & Related papers (2021-12-21T14:28:35Z)
- Learning Interactive Driving Policies via Data-driven Simulation [125.97811179463542]
Data-driven simulators promise high data-efficiency for driving policy learning.
Small underlying datasets often lack interesting and challenging edge cases for learning interactive driving.
We propose a simulation method that uses in-painted ado vehicles for learning robust driving policies.
arXiv Detail & Related papers (2021-11-23T20:14:02Z)
- Towards Optimal Strategies for Training Self-Driving Perception Models in Simulation [98.51313127382937]
We focus on the use of labels in the synthetic domain alone.
Our approach introduces both a way to learn neural-invariant representations and a theoretically inspired view on how to sample the data from the simulator.
We showcase our approach on the bird's-eye-view vehicle segmentation task with multi-sensor data.
arXiv Detail & Related papers (2021-11-15T18:37:43Z)
- Game and Simulation Design for Studying Pedestrian-Automated Vehicle Interactions [1.3764085113103217]
We first present contemporary tools in the field and then propose the design and development of a new application that facilitates pedestrian point-of-view research.
We conduct a three-step user experience experiment where participants answer questions before and after using the application in various scenarios.
arXiv Detail & Related papers (2021-09-30T15:26:18Z)
- Studying Person-Specific Pointing and Gaze Behavior for Multimodal Referencing of Outside Objects from a Moving Vehicle [58.720142291102135]
Hand pointing and eye gaze have been extensively investigated in automotive applications for object selection and referencing.
Existing outside-the-vehicle referencing methods focus on a static situation, whereas the situation in a moving vehicle is highly dynamic and subject to safety-critical constraints.
We investigate the specific characteristics of each modality and the interaction between them when used in the task of referencing outside objects.
arXiv Detail & Related papers (2020-09-23T14:56:19Z)
- Building Trust in Autonomous Vehicles: Role of Virtual Reality Driving Simulators in HMI Design [8.39368916644651]
We propose a methodology to validate the user experience in AVs based on continuous, objective information gathered from physiological signals.
We applied this methodology to the design of a head-up display interface delivering visual cues about the vehicle's sensory and planning systems.
arXiv Detail & Related papers (2020-07-27T08:42:07Z)
- RoboTHOR: An Open Simulation-to-Real Embodied AI Platform [56.50243383294621]
We introduce RoboTHOR to democratize research in interactive and embodied visual AI.
We show that a significant performance gap exists when models trained in simulation are tested in simulation versus in their carefully constructed physical analogs.
arXiv Detail & Related papers (2020-04-14T20:52:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.