Comparison of Pedestrian Prediction Models from Trajectory and
Appearance Data for Autonomous Driving
- URL: http://arxiv.org/abs/2305.15942v1
- Date: Thu, 25 May 2023 11:24:38 GMT
- Title: Comparison of Pedestrian Prediction Models from Trajectory and
Appearance Data for Autonomous Driving
- Authors: Anthony Knittel, Morris Antonello, John Redford and Subramanian
Ramamoorthy
- Abstract summary: The ability to anticipate pedestrian motion changes is a critical capability for autonomous vehicles.
In urban environments, pedestrians may enter the road area and create a high risk for driving.
This work presents a comparative evaluation of trajectory-only and appearance-based methods for pedestrian prediction.
- Score: 13.126949982768505
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The ability to anticipate pedestrian motion changes is a critical capability
for autonomous vehicles. In urban environments, pedestrians may enter the road
area and create a high risk for driving, and it is important to identify these
cases. Typical predictors use the trajectory history to predict future motion;
however, in cases of motion initiation, motion in the trajectory may only
become clearly visible after a delay, which can mean the pedestrian has
entered the road area before an accurate prediction can be made. Appearance data
includes useful information such as changes of gait, which are early indicators
of motion changes, and can inform trajectory prediction. This work presents a
comparative evaluation of trajectory-only and appearance-based methods for
pedestrian prediction, and introduces a new dataset experiment for prediction
using appearance. We create two trajectory and image datasets based on the
combination of image and trajectory sequences from the popular NuScenes
dataset, and examine prediction of trajectories using observed appearance to
influence futures. This shows some advantages over trajectory prediction alone,
although problems with the dataset prevent advantages of appearance-based
models from being shown. We describe methods for improving the dataset and
experiment to allow benefits of appearance-based models to be captured.
Related papers
- Context-aware Multi-task Learning for Pedestrian Intent and Trajectory Prediction [3.522062800701924]
We introduce PTINet, which learns trajectory and intention prediction by combining past trajectory observations, local contextual features, and global features.
The efficacy of our approach is evaluated on widely used public datasets: JAAD and PIE.
PTINet paves the way for the development of automated systems capable of seamlessly interacting with pedestrians in urban settings.
arXiv Detail & Related papers (2024-07-24T11:06:47Z)
- Knowledge-aware Graph Transformer for Pedestrian Trajectory Prediction [15.454206825258169]
Predicting pedestrian motion trajectories is crucial for path planning and motion control of autonomous vehicles.
Recent deep learning-based prediction approaches mainly utilize information like trajectory history and interactions between pedestrians.
This paper proposes a graph transformer structure to improve prediction performance.
arXiv Detail & Related papers (2024-01-10T01:50:29Z)
- Pre-training on Synthetic Driving Data for Trajectory Prediction [64.16991399882477]
We aim to tackle the challenge of learning general trajectory forecasting representations under limited data availability.
We take advantage of graph representations of HD-map and apply vector transformations to reshape the maps.
We employ a rule-based model to generate trajectories based on augmented scenes.
arXiv Detail & Related papers (2023-09-18T19:49:22Z)
- Action-based Contrastive Learning for Trajectory Prediction [4.675212251005813]
Trajectory prediction is an essential task for successful human-robot interaction, such as in autonomous driving.
In this work, we address the problem of predicting future pedestrian trajectories in a first-person view setting with a moving camera.
We propose a novel action-based contrastive learning loss that utilizes pedestrian action information to improve the learned trajectory embeddings.
arXiv Detail & Related papers (2022-07-18T15:02:27Z)
- Trajectory Forecasting from Detection with Uncertainty-Aware Motion Encoding [121.66374635092097]
Trajectories obtained from object detection and tracking are inevitably noisy.
We propose a trajectory predictor directly based on detection results without relying on explicitly formed trajectories.
arXiv Detail & Related papers (2022-02-03T09:09:56Z)
- You Mostly Walk Alone: Analyzing Feature Attribution in Trajectory Prediction [52.442129609979794]
Recent deep learning approaches for trajectory prediction show promising performance.
It remains unclear which features such black-box models actually learn to use for making predictions.
This paper proposes a procedure that quantifies the contributions of different cues to model performance.
arXiv Detail & Related papers (2021-10-11T14:24:15Z)
- Self-Supervised Action-Space Prediction for Automated Driving [0.0]
We present a novel learned multi-modal trajectory prediction architecture for automated driving.
It achieves kinematically feasible predictions by casting the learning problem into the space of accelerations and steering angles.
The proposed methods are evaluated on real-world datasets containing urban intersections and roundabouts.
arXiv Detail & Related papers (2021-09-21T08:27:56Z)
- SGCN: Sparse Graph Convolution Network for Pedestrian Trajectory Prediction [64.16212996247943]
We present a Sparse Graph Convolution Network(SGCN) for pedestrian trajectory prediction.
Specifically, the SGCN explicitly models sparse directed interactions with a sparse directed spatial graph to capture adaptive interactions between pedestrians.
Visualizations indicate that our method can capture adaptive interactions between pedestrians and their effective motion tendencies.
arXiv Detail & Related papers (2021-04-04T03:17:42Z)
- PePScenes: A Novel Dataset and Baseline for Pedestrian Action Prediction in 3D [10.580548257913843]
We propose a new pedestrian action prediction dataset created by adding per-frame 2D/3D bounding box and behavioral annotations to nuScenes.
In addition, we propose a hybrid neural network architecture that incorporates various data modalities for predicting pedestrian crossing action.
arXiv Detail & Related papers (2020-12-14T18:13:44Z)
- AutoTrajectory: Label-free Trajectory Extraction and Prediction from Videos using Dynamic Points [92.91569287889203]
We present a novel, label-free algorithm, AutoTrajectory, for trajectory extraction and prediction.
To better capture the moving objects in videos, we introduce dynamic points.
We aggregate dynamic points to instance points, which stand for moving objects such as pedestrians in videos.
arXiv Detail & Related papers (2020-07-11T08:43:34Z)
- Spatiotemporal Relationship Reasoning for Pedestrian Intent Prediction [57.56466850377598]
Reasoning over visual data is a desirable capability for robotics and vision-based applications.
In this paper, we present a graph-based framework to uncover relationships between different objects in the scene for reasoning about pedestrian intent.
Pedestrian intent, defined as the future action of crossing or not-crossing the street, is a very crucial piece of information for autonomous vehicles.
arXiv Detail & Related papers (2020-02-20T18:50:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.