PSI: A Pedestrian Behavior Dataset for Socially Intelligent Autonomous
Car
- URL: http://arxiv.org/abs/2112.02604v1
- Date: Sun, 5 Dec 2021 15:54:57 GMT
- Title: PSI: A Pedestrian Behavior Dataset for Socially Intelligent Autonomous
Car
- Authors: Tina Chen, Renran Tian, Yaobin Chen, Joshua Domeyer, Heishiro Toyoda,
Rini Sherony, Taotao Jing, Zhengming Ding
- Abstract summary: This paper proposes and shares another benchmark dataset called the IUPUI-CSRC Pedestrian Situated Intent (PSI) data.
The first novel label is the dynamic intent changes for the pedestrians to cross in front of the ego-vehicle, obtained from 24 drivers.
The second one is the text-based explanations of the driver reasoning process when estimating pedestrian intents and predicting their behaviors.
- Score: 47.01116716025731
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Prediction of pedestrian behavior is critical for fully autonomous vehicles
to drive in busy city streets safely and efficiently. The future autonomous
cars need to fit into mixed conditions with not only technical but also social
capabilities. Although more algorithms and datasets have been developed to predict pedestrian behaviors, these efforts lack benchmark labels and the capability to estimate the temporal-dynamic intent changes of pedestrians,
provide explanations of the interaction scenes, and support algorithms with
social intelligence. This paper proposes and shares another benchmark dataset
called the IUPUI-CSRC Pedestrian Situated Intent (PSI) data with two innovative
labels in addition to comprehensive computer vision labels. The first novel label is
the dynamic intent changes for the pedestrians to cross in front of the
ego-vehicle, obtained from 24 drivers with diverse backgrounds. The second one
is the text-based explanations of the driver reasoning process when estimating
pedestrian intents and predicting their behaviors during the interaction
period. These innovative labels can enable several computer vision tasks,
including pedestrian intent/behavior prediction, vehicle-pedestrian interaction
segmentation, and video-to-language mapping for explainable algorithms. The
released dataset can fundamentally improve the development of pedestrian behavior prediction models and help develop socially intelligent autonomous cars that interact with pedestrians efficiently. The dataset has been evaluated on different tasks and is publicly available.
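As a rough illustration of how the two novel labels could be organized alongside the computer vision labels, the following Python sketch models per-frame intent annotations from multiple drivers together with their text explanations; the field names are assumptions for illustration, not the released file schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class FrameIntentAnnotation:
    frame_id: int
    annotator_id: int     # one of the 24 drivers
    intent: str           # e.g. "cross" / "not_cross" / "not_sure"
    explanation: str      # free-text driver reasoning behind the estimate

@dataclass
class PedestrianTrack:
    track_id: str
    bounding_boxes: List[List[float]] = field(default_factory=list)   # standard CV labels
    intent_annotations: List[FrameIntentAnnotation] = field(default_factory=list)

def majority_intent(track: PedestrianTrack, frame_id: int) -> str:
    """Aggregate per-driver intent votes for a single frame."""
    votes = [a.intent for a in track.intent_annotations if a.frame_id == frame_id]
    return max(set(votes), key=votes.count) if votes else "unknown"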
Related papers
- Pedestrian motion prediction evaluation for urban autonomous driving [0.0]
We analyze selected publications with open-source implementations to assess the value of traditional motion prediction metrics.
This perspective should be valuable to any autonomous driving or robotics engineer looking for the real-world performance of existing state-of-the-art pedestrian motion prediction methods.
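For reference, two of the traditional motion prediction metrics such evaluations typically revisit are the average and final displacement errors; the sketch below is a generic illustration, not code from the cited paper.

import numpy as np

def ade(pred: np.ndarray, gt: np.ndarray) -> float:
    """Average Displacement Error over all predicted steps; inputs are (T, 2) arrays."""
    return float(np.linalg.norm(pred - gt, axis=-1).mean())

def fde(pred: np.ndarray, gt: np.ndarray) -> float:
    """Final Displacement Error at the last predicted step."""
    return float(np.linalg.norm(pred[-1] - gt[-1]))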
arXiv Detail & Related papers (2024-10-22T10:06:50Z)
- A low complexity contextual stacked ensemble-learning approach for pedestrian intent prediction [2.443659506850567]
Current research leverages computer vision and machine learning advances to predict near-misses.
This work proposes a low-complexity ensemble-learning approach that employs contextual data for predicting the pedestrian's intent for crossing.
Our experiments on different datasets achieve pedestrian intent prediction performance comparable to state-of-the-art approaches.
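A minimal sketch of a contextual stacked ensemble for crossing-intent classification is shown below, assuming tabular contextual features (e.g., ego speed, distance to the curb); the base learners and meta-learner are illustrative choices, not the paper's exact pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X = np.random.rand(200, 6)           # placeholder contextual feature vectors
y = np.random.randint(0, 2, 200)     # 1 = will cross, 0 = will not cross

ensemble = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=50)),
                ("dt", DecisionTreeClassifier(max_depth=5))],
    final_estimator=LogisticRegression(),
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))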
arXiv Detail & Related papers (2024-10-16T21:02:24Z)
- Social-Transmotion: Promptable Human Trajectory Prediction [65.80068316170613]
Social-Transmotion is a generic Transformer-based model that exploits diverse and numerous visual cues to predict human behavior.
Our approach is validated on multiple datasets, including JTA, JRDB, Pedestrians and Cyclists in Road Traffic, and ETH-UCY.
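The sketch below illustrates the general idea of feeding heterogeneous visual cues (past trajectory plus 2D pose keypoints) to a Transformer encoder as one token sequence; all dimensions and the tokenization are assumptions for illustration, not Social-Transmotion's actual design.

import torch
import torch.nn as nn

class CueTransformer(nn.Module):
    def __init__(self, d_model: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        self.traj_proj = nn.Linear(2, d_model)        # (x, y) per observed step
        self.pose_proj = nn.Linear(17 * 2, d_model)   # 17 keypoints per observed frame
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, horizon * 2)   # future (x, y) offsets

    def forward(self, traj, pose):
        tokens = torch.cat([self.traj_proj(traj), self.pose_proj(pose)], dim=1)
        pooled = self.encoder(tokens).mean(dim=1)     # pool over cue tokens
        return self.head(pooled).view(-1, self.horizon, 2)

model = CueTransformer()
future = model(torch.randn(4, 9, 2), torch.randn(4, 9, 34))  # 9 observed frames
print(future.shape)  # torch.Size([4, 12, 2])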
arXiv Detail & Related papers (2023-12-26T18:56:49Z)
- GPT-4V Takes the Wheel: Promises and Challenges for Pedestrian Behavior Prediction [12.613528624623514]
This research is the first to conduct both quantitative and qualitative evaluations of Vision Language Models (VLMs) in the context of pedestrian behavior prediction for autonomous driving.
We evaluate GPT-4V on publicly available pedestrian datasets: JAAD and WiDEVIEW.
The model achieves a 57% accuracy in a zero-shot manner, which, while impressive, is still behind the state-of-the-art domain-specific models (70%) in predicting pedestrian crossing actions.
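The kind of zero-shot evaluation described here boils down to comparing the model's per-clip crossing answers with ground-truth labels; in the sketch below, query_vlm is a hypothetical placeholder, not an actual GPT-4V client.

from sklearn.metrics import accuracy_score

def query_vlm(frames) -> int:
    """Hypothetical placeholder: return 1 if the model answers that the pedestrian will cross."""
    return 1  # replace with a real vision-language model call

def evaluate(samples) -> float:
    preds = [query_vlm(s["frames"]) for s in samples]
    labels = [s["crossing"] for s in samples]
    return accuracy_score(labels, preds)

print(evaluate([{"frames": None, "crossing": 1}, {"frames": None, "crossing": 0}]))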
arXiv Detail & Related papers (2023-11-24T18:02:49Z)
- PedFormer: Pedestrian Behavior Prediction via Cross-Modal Attention Modulation and Gated Multitask Learning [10.812772606528172]
We propose a novel framework that relies on different data modalities to predict future trajectories and crossing actions of pedestrians from an ego-centric perspective.
We show that our model improves on the state-of-the-art in trajectory and action prediction by up to 22% and 13%, respectively, across various metrics.
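As a toy illustration of gated multitask prediction, the sketch below applies task-specific gates to a shared ego-centric feature before the trajectory and crossing-action heads; layer sizes are assumptions, not PedFormer's architecture.

import torch
import torch.nn as nn

class GatedMultiTaskHead(nn.Module):
    def __init__(self, feat_dim: int = 128, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        self.gate_traj = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.Sigmoid())
        self.gate_act = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.Sigmoid())
        self.traj_head = nn.Linear(feat_dim, horizon * 2)   # future (x, y) positions
        self.action_head = nn.Linear(feat_dim, 1)           # crossing logit

    def forward(self, shared_feat):
        traj = self.traj_head(self.gate_traj(shared_feat) * shared_feat)
        action = self.action_head(self.gate_act(shared_feat) * shared_feat)
        return traj.view(-1, self.horizon, 2), action

head = GatedMultiTaskHead()
traj, action = head(torch.randn(4, 128))
print(traj.shape, action.shape)  # torch.Size([4, 12, 2]) torch.Size([4, 1])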
arXiv Detail & Related papers (2022-10-14T15:12:00Z)
- Pedestrian Stop and Go Forecasting with Hybrid Feature Fusion [87.77727495366702]
We introduce the new task of pedestrian stop and go forecasting.
Considering the lack of suitable existing datasets for it, we release TRANS, a benchmark for explicitly studying the stop and go behaviors of pedestrians in urban traffic.
We build it from several existing datasets annotated with pedestrians' walking motions, in order to have various scenarios and behaviors.
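Deriving stop-and-go events from per-frame walking/standing annotations can be as simple as detecting label transitions, as in the sketch below; the label convention used here is an assumption, not the TRANS annotation format.

from typing import List, Tuple

def find_transitions(motion: List[str]) -> List[Tuple[int, str]]:
    """Return (frame_index, event) pairs, where event is 'stop' or 'go'."""
    events = []
    for i in range(1, len(motion)):
        if motion[i - 1] == "walking" and motion[i] == "standing":
            events.append((i, "stop"))
        elif motion[i - 1] == "standing" and motion[i] == "walking":
            events.append((i, "go"))
    return events

print(find_transitions(["walking", "walking", "standing", "standing", "walking"]))
# [(2, 'stop'), (4, 'go')]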
arXiv Detail & Related papers (2022-03-04T18:39:31Z)
- SSAGCN: Social Soft Attention Graph Convolution Network for Pedestrian Trajectory Prediction [59.064925464991056]
We propose a new prediction model named the Social Soft Attention Graph Convolution Network (SSAGCN).
SSAGCN aims to simultaneously handle social interactions among pedestrians and scene interactions between pedestrians and environments.
Experiments on publicly available datasets demonstrate the effectiveness of SSAGCN and achieve state-of-the-art results.
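A toy sketch of the soft-attention idea over a pedestrian graph: each node aggregates neighbour features with weights that decay with distance; the weighting scheme here is a stand-in, not SSAGCN's learned attention.

import torch

def soft_attention_aggregate(positions: torch.Tensor, features: torch.Tensor) -> torch.Tensor:
    """positions: (N, 2), features: (N, D) -> socially aggregated features (N, D)."""
    dist = torch.cdist(positions, positions)    # pairwise pedestrian distances
    attn = torch.softmax(-dist, dim=-1)         # closer pedestrians receive larger weights
    return attn @ features

pos = torch.randn(5, 2)
feat = torch.randn(5, 16)
print(soft_attention_aggregate(pos, feat).shape)   # torch.Size([5, 16])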
arXiv Detail & Related papers (2021-12-05T01:49:18Z)
- Pedestrian Behavior Prediction for Automated Driving: Requirements, Metrics, and Relevant Features [1.1888947789336193]
We analyze the requirements on pedestrian behavior prediction for automated driving via a system-level approach.
Based on human driving behavior we derive appropriate reaction patterns of an automated vehicle.
We present a pedestrian prediction model based on a Variational Conditional Auto-Encoder which incorporates multiple contextual cues.
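The sketch below shows the core Variational Conditional Auto-Encoder mechanics (encode the future conditioned on context, sample a latent, decode, add a KL term); sizes and the single-layer encoder/decoder are illustrative, not the paper's model.

import torch
import torch.nn as nn

class CVAE(nn.Module):
    def __init__(self, ctx_dim: int = 32, fut_dim: int = 24, z_dim: int = 8):
        super().__init__()
        self.enc = nn.Linear(ctx_dim + fut_dim, 2 * z_dim)   # outputs mu and logvar
        self.dec = nn.Linear(ctx_dim + z_dim, fut_dim)

    def forward(self, ctx, future):
        mu, logvar = self.enc(torch.cat([ctx, future], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        recon = self.dec(torch.cat([ctx, z], dim=-1))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon, kl

model = CVAE()
recon, kl = model(torch.randn(4, 32), torch.randn(4, 24))
print(recon.shape, kl.item())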
arXiv Detail & Related papers (2020-12-15T16:52:49Z)
- Pedestrian Intention Prediction: A Multi-task Perspective [83.7135926821794]
In order to be globally deployed, autonomous cars must guarantee the safety of pedestrians.
This work tries to solve this problem by jointly predicting the intention and visual states of pedestrians.
The method is a recurrent neural network trained in a multi-task learning framework.
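A minimal sketch of a recurrent multi-task predictor that jointly outputs a crossing-intention logit and future bounding boxes from observed boxes; the architecture is a generic illustration, not the paper's exact network.

import torch
import torch.nn as nn

class MultiTaskRNN(nn.Module):
    def __init__(self, hidden: int = 64, horizon: int = 8):
        super().__init__()
        self.horizon = horizon
        self.gru = nn.GRU(input_size=4, hidden_size=hidden, batch_first=True)
        self.intent_head = nn.Linear(hidden, 1)           # crossing logit
        self.box_head = nn.Linear(hidden, horizon * 4)    # future bounding boxes

    def forward(self, boxes):                             # boxes: (B, T, 4)
        _, h = self.gru(boxes)
        h = h.squeeze(0)
        return self.intent_head(h), self.box_head(h).view(-1, self.horizon, 4)

model = MultiTaskRNN()
intent, future = model(torch.randn(2, 15, 4))
print(intent.shape, future.shape)  # torch.Size([2, 1]) torch.Size([2, 8, 4])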
arXiv Detail & Related papers (2020-10-20T13:42:31Z)
- Spatiotemporal Relationship Reasoning for Pedestrian Intent Prediction [57.56466850377598]
Reasoning over visual data is a desirable capability for robotics and vision-based applications.
In this paper, we present a graph-based framework to uncover relationships among different objects in the scene for reasoning about pedestrian intent.
Pedestrian intent, defined as the future action of crossing or not-crossing the street, is a very crucial piece of information for autonomous vehicles.
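Such graph-based reasoning operates on a scene graph whose nodes are traffic-scene objects and whose edges carry spatial or attentional relations; the sketch below builds a tiny example with hypothetical node and edge attributes.

import networkx as nx

scene = nx.DiGraph()
scene.add_node("pedestrian_1", kind="pedestrian")
scene.add_node("ego_vehicle", kind="vehicle")
scene.add_node("crosswalk_1", kind="crosswalk")
scene.add_edge("pedestrian_1", "crosswalk_1", relation="near", distance_m=2.0)
scene.add_edge("pedestrian_1", "ego_vehicle", relation="looking_at")

# A reasoning module would propagate features along these edges to score
# crossing vs. not-crossing for each pedestrian node.
print(list(scene.edges(data=True)))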
arXiv Detail & Related papers (2020-02-20T18:50:44Z)