PDB: Not All Drivers Are the Same -- A Personalized Dataset for Understanding Driving Behavior
- URL: http://arxiv.org/abs/2503.06477v1
- Date: Sun, 09 Mar 2025 06:28:39 GMT
- Title: PDB: Not All Drivers Are the Same -- A Personalized Dataset for Understanding Driving Behavior
- Authors: Chuheng Wei, Ziye Qin, Siyan Li, Ziyan Zhang, Xuanpeng Zhao, Amr Abdelraouf, Rohit Gupta, Kyungtae Han, Matthew J. Barth, Guoyuan Wu
- Abstract summary: The Personalized Driving Behavior dataset is a multi-modal dataset designed to capture personalization in driving behavior under naturalistic driving conditions. The dataset features 12 participants, approximately 270,000 LiDAR frames, 1.6 million images, and 6.6 TB of raw sensor data. By explicitly capturing drivers' behavior, PDB serves as a unique resource for human factor analysis, driver identification, and personalized mobility applications.
- Score: 15.804795085314423
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Driving behavior is inherently personal, influenced by individual habits, decision-making styles, and physiological states. However, most existing datasets treat all drivers as homogeneous, overlooking driver-specific variability. To address this gap, we introduce the Personalized Driving Behavior (PDB) dataset, a multi-modal dataset designed to capture personalization in driving behavior under naturalistic driving conditions. Unlike conventional datasets, PDB minimizes external influences by maintaining consistent routes, vehicles, and lighting conditions across sessions. It includes sources from 128-line LiDAR, front-facing camera video, GNSS, 9-axis IMU, CAN bus data (throttle, brake, steering angle), and driver-specific signals such as facial video and heart rate. The dataset features 12 participants, approximately 270,000 LiDAR frames, 1.6 million images, and 6.6 TB of raw sensor data. The processed trajectory dataset consists of 1,669 segments, each spanning 10 seconds with a 0.2-second interval. By explicitly capturing drivers' behavior, PDB serves as a unique resource for human factor analysis, driver identification, and personalized mobility applications, contributing to the development of human-centric intelligent transportation systems.
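The abstract's description of the processed trajectory data implies a fixed segment shape: 10 seconds sampled every 0.2 seconds gives 50 timesteps per segment. A minimal sketch of that implied structure, assuming a hypothetical schema (the field names and types below are illustrative, not the dataset's actual format):

```python
from dataclasses import dataclass, field

# Segment geometry stated in the abstract: 10 s span, 0.2 s interval.
SEGMENT_DURATION_S = 10.0
SAMPLE_INTERVAL_S = 0.2
TIMESTEPS = round(SEGMENT_DURATION_S / SAMPLE_INTERVAL_S)  # 50 samples

@dataclass
class TrajectorySegment:
    """Hypothetical container for one processed PDB segment."""
    driver_id: int  # one of the 12 participants
    timestamps: list[float] = field(default_factory=list)
    positions: list[tuple[float, float]] = field(default_factory=list)

    def is_complete(self) -> bool:
        # A valid segment covers the full 50-point sampling grid.
        return len(self.timestamps) == TIMESTEPS == len(self.positions)

# Build a dummy segment on the stated sampling grid.
seg = TrajectorySegment(
    driver_id=3,
    timestamps=[i * SAMPLE_INTERVAL_S for i in range(TIMESTEPS)],
    positions=[(0.0, 0.0)] * TIMESTEPS,
)
print(TIMESTEPS, seg.is_complete())  # 50 True
```

With 1,669 such segments, the full processed set would hold roughly 83,450 trajectory points, which matches the scale of a per-driver behavior-modeling task rather than a perception benchmark.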
Related papers
- DISC: Dataset for Analyzing Driving Styles In Simulated Crashes for Mixed Autonomy [13.365522429680547]
DISC (Driving Styles In Simulated Crashes) is one of the first datasets to capture driving styles in pre-crash scenarios for mixed autonomy analysis. DISC includes over 8 classes of driving styles/behaviors from hundreds of drivers navigating a simulated vehicle. Data was collected through a driver-centric study involving human drivers encountering twelve simulated accident scenarios.
arXiv Detail & Related papers (2025-01-28T15:45:25Z)
- VTD: Visual and Tactile Database for Driver State and Behavior Perception [1.6277623188953556]
We introduce a novel visual-tactile perception method to address subjective uncertainties in driver state and interaction behaviors. A comprehensive dataset has been developed that encompasses multi-modal data under fatigue and distraction conditions.
arXiv Detail & Related papers (2024-12-06T09:31:40Z)
- Traffic and Safety Rule Compliance of Humans in Diverse Driving Situations [48.924085579865334]
Analyzing human data is crucial for developing autonomous systems that replicate safe driving practices.
This paper presents a comparative evaluation of human compliance with traffic and safety rules across multiple trajectory prediction datasets.
arXiv Detail & Related papers (2024-11-04T09:21:00Z)
- D2E: An Autonomous Decision-making Dataset involving Driver States and Human Evaluation [6.890077875318333]
Driver to Evaluation dataset (D2E) is an autonomous decision-making dataset.
It contains data on driver states, vehicle states, environmental situations, and evaluation scores from human reviewers.
D2E contains over 1,100 segments of interactive driving case data, covering everything from human driver factors to evaluation results.
arXiv Detail & Related papers (2024-04-12T21:29:18Z)
- Situation Awareness for Driver-Centric Driving Style Adaptation [3.568617847600189]
We propose a situation-aware driving style model based on different visual feature encoders pretrained on fleet data.
Our experiments show that the proposed method outperforms static driving styles significantly and forms plausible situation clusters.
arXiv Detail & Related papers (2024-03-28T17:19:16Z)
- Leveraging Driver Field-of-View for Multimodal Ego-Trajectory Prediction [69.29802752614677]
RouteFormer is a novel ego-trajectory prediction network combining GPS data, environmental context, and the driver's field-of-view.
To tackle data scarcity and enhance diversity, we introduce GEM, a dataset of urban driving scenarios enriched with synchronized driver field-of-view and gaze data.
arXiv Detail & Related papers (2023-12-13T23:06:30Z)
- DeepAccident: A Motion and Accident Prediction Benchmark for V2X Autonomous Driving [76.29141888408265]
We propose a large-scale dataset containing diverse accident scenarios that frequently occur in real-world driving.
The proposed DeepAccident dataset includes 57K annotated frames and 285K annotated samples, approximately 7 times more than the large-scale nuScenes dataset.
arXiv Detail & Related papers (2023-04-03T17:37:00Z)
- Driver Profiling and Bayesian Workload Estimation Using Naturalistic Peripheral Detection Study Data [40.43737902900321]
We tackle the problem of workload estimation from driving performance data.
Key environmental factors that induce a high mental workload are identified via video analysis.
A supervised learning framework is introduced to profile drivers based on the average workload they experience.
A Bayesian filtering approach is then proposed for sequentially estimating, in (near) real-time, the driver's instantaneous workload.
arXiv Detail & Related papers (2023-03-26T13:15:44Z)
- Generative AI-empowered Simulation for Autonomous Driving in Vehicular Mixed Reality Metaverses [130.15554653948897]
In the vehicular mixed reality (MR) Metaverse, the distance between physical and virtual entities can be overcome.
Large-scale traffic and driving simulation via realistic data collection and fusion from the physical world is difficult and costly.
We propose an autonomous driving architecture, where generative AI is leveraged to synthesize unlimited conditioned traffic and driving data in simulations.
arXiv Detail & Related papers (2023-02-16T16:54:10Z)
- Driving-Signal Aware Full-Body Avatars [49.89791440532946]
We present a learning-based method for building driving-signal aware full-body avatars.
Our model is a conditional variational autoencoder that can be animated with incomplete driving signals.
We demonstrate the efficacy of our approach on the challenging problem of full-body animation for virtual telepresence.
arXiv Detail & Related papers (2021-05-21T16:22:38Z)
- Driver2vec: Driver Identification from Automotive Data [44.84876493736275]
Driver2vec is able to accurately identify the driver from a short 10-second interval of sensor data.
Driver2vec is trained on a dataset of 51 drivers provided by Nervtech.
arXiv Detail & Related papers (2021-02-10T03:09:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.