"If you could see me through my eyes": Predicting Pedestrian Perception
- URL: http://arxiv.org/abs/2202.13981v1
- Date: Mon, 28 Feb 2022 17:36:12 GMT
- Title: "If you could see me through my eyes": Predicting Pedestrian Perception
- Authors: Julian Petzold, Mostafa Wahby, Franek Stark, Ulrich Behrje, Heiko
Hamann
- Abstract summary: We use synthetic data from simulations of a specific pedestrian crossing scenario to train a variational autoencoder and a long short-term memory network.
We can accurately predict a pedestrian's future perceptions within relevant time horizons.
Such trained networks can later be used to predict pedestrian behaviors even from the perspective of the autonomous car.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Pedestrians are particularly vulnerable road users in urban traffic. With the
arrival of autonomous driving, novel technologies can be developed specifically
to protect pedestrians. We propose a machine learning toolchain to train
artificial neural networks as models of pedestrian behavior. In a preliminary
study, we use synthetic data from simulations of a specific pedestrian crossing
scenario to train a variational autoencoder and a long short-term memory
network to predict a pedestrian's future visual perception. We can accurately
predict a pedestrian's future perceptions within relevant time horizons. Our
results indicate that, by iteratively feeding the predicted frames back into
these networks, they can be used as simulations of pedestrians. Such trained
networks can later be used to predict pedestrian behaviors even from the
perspective of the autonomous car. Another future extension will be to re-train
these networks with real-world video data.
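The abstract describes a two-stage pipeline: a variational autoencoder compresses each first-person frame into a latent vector, a long short-term memory network predicts the next latent from the recent latent history, and the VAE decoder turns that prediction back into a frame that can be fed in again for longer rollouts. Below is a minimal PyTorch sketch of this idea; the frame resolution, latent size, and network widths are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of the VAE + LSTM rollout described in the abstract.
# Assumptions (not from the paper): 64x64 RGB frames, a 32-d latent space,
# and a single-layer LSTM; the authors' actual architecture may differ.
import torch
import torch.nn as nn

LATENT_DIM = 32  # assumed latent size


class FrameVAE(nn.Module):
    """Toy convolutional VAE that maps 64x64 RGB frames to a latent vector."""

    def __init__(self, latent_dim: int = LATENT_DIM):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # -> 8x8
            nn.Flatten(),
        )
        self.to_mu = nn.Linear(128 * 8 * 8, latent_dim)
        self.to_logvar = nn.Linear(128 * 8 * 8, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def encode(self, frames: torch.Tensor) -> torch.Tensor:
        h = self.encoder(frames)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick; at prediction time mu alone could be used.
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def decode(self, z: torch.Tensor) -> torch.Tensor:
        return self.decoder(z)


class LatentLSTM(nn.Module):
    """Predicts the next latent vector from a sequence of past latents."""

    def __init__(self, latent_dim: int = LATENT_DIM, hidden: int = 256):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent_dim)

    def forward(self, z_seq: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(z_seq)      # (B, T, hidden)
        return self.head(out[:, -1])   # next latent, (B, latent_dim)


def rollout(vae: FrameVAE, lstm: LatentLSTM,
            frames: torch.Tensor, horizon: int) -> list:
    """Iteratively feed predicted frames back in, as the abstract describes."""
    batch, steps = frames.shape[0], frames.shape[1]
    z_seq = vae.encode(frames.flatten(0, 1)).view(batch, steps, -1)
    predicted = []
    for _ in range(horizon):
        z_next = lstm(z_seq)                       # predict next latent
        predicted.append(vae.decode(z_next))       # decode to a future frame
        # Slide the window: drop the oldest latent, append the prediction.
        z_seq = torch.cat([z_seq[:, 1:], z_next.unsqueeze(1)], dim=1)
    return predicted


if __name__ == "__main__":
    vae, lstm = FrameVAE(), LatentLSTM()
    context = torch.rand(1, 10, 3, 64, 64)   # 10 observed frames (dummy data)
    future = rollout(vae, lstm, context, horizon=5)
    print(len(future), future[0].shape)      # 5 predicted 64x64 frames
```

In a full pipeline the VAE would be trained with the usual reconstruction-plus-KL objective and the LSTM with a latent prediction loss; the sketch only shows the inference-time rollout loop that the abstract refers to.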
Related papers
- Humanoid Locomotion as Next Token Prediction [84.21335675130021]
Our model is a causal transformer trained via autoregressive prediction of sensorimotor trajectories.
We show that our model enables a full-sized humanoid to walk in San Francisco zero-shot.
Our model can transfer to the real world even when trained on only 27 hours of walking data, and can generalize to commands not seen during training, such as walking backward.
arXiv Detail & Related papers (2024-02-29T18:57:37Z)
- Social-Transmotion: Promptable Human Trajectory Prediction [65.80068316170613]
Social-Transmotion is a generic Transformer-based model that exploits diverse and numerous visual cues to predict human behavior.
Our approach is validated on multiple datasets, including JTA, JRDB, Pedestrians and Cyclists in Road Traffic, and ETH-UCY.
arXiv Detail & Related papers (2023-12-26T18:56:49Z)
- Pedestrian 3D Bounding Box Prediction [83.7135926821794]
We focus on 3D bounding boxes, which give autonomous vehicles reasonable estimates of humans without modeling complex motion details.
We suggest this new problem and present a simple yet effective model for pedestrians' 3D bounding box prediction.
This method follows an encoder-decoder architecture based on recurrent neural networks.
arXiv Detail & Related papers (2022-06-28T17:59:45Z)
- Pedestrian Stop and Go Forecasting with Hybrid Feature Fusion [87.77727495366702]
We introduce the new task of pedestrian stop and go forecasting.
Given the lack of suitable existing datasets, we release TRANS, a benchmark for explicitly studying the stop and go behaviors of pedestrians in urban traffic.
We build it from several existing datasets annotated with pedestrians' walking motions, so as to cover a variety of scenarios and behaviors.
arXiv Detail & Related papers (2022-03-04T18:39:31Z)
- PSI: A Pedestrian Behavior Dataset for Socially Intelligent Autonomous Car [47.01116716025731]
This paper proposes and shares another benchmark dataset called the IUPUI-CSRC Pedestrian Situated Intent (PSI) data.
The first novel label is the dynamic intent changes of pedestrians to cross in front of the ego-vehicle, obtained from 24 drivers.
The second one is the text-based explanations of the driver reasoning process when estimating pedestrian intents and predicting their behaviors.
arXiv Detail & Related papers (2021-12-05T15:54:57Z)
- PredictionNet: Real-Time Joint Probabilistic Traffic Prediction for Planning, Control, and Simulation [9.750094897470447]
PredictionNet is a deep neural network (DNN) that predicts the motion of all surrounding traffic agents together with the ego-vehicle's motion.
The network can be used to simulate realistic traffic, and it produces competitive results on popular benchmarks.
It has been used to successfully control a real-world vehicle for hundreds of kilometers, by combining it with a motion planning/control subsystem.
arXiv Detail & Related papers (2021-09-23T01:23:47Z)
- Pedestrian Intention Prediction: A Multi-task Perspective [83.7135926821794]
In order to be globally deployed, autonomous cars must guarantee the safety of pedestrians.
This work tries to solve this problem by jointly predicting the intention and visual states of pedestrians.
The method is a recurrent neural network trained in a multi-task learning fashion.
arXiv Detail & Related papers (2020-10-20T13:42:31Z)
- VRUNet: Multi-Task Learning Model for Intent Prediction of Vulnerable Road Users [3.6265173818019947]
We propose a multi-task learning model to predict pedestrian actions, crossing intent and forecast their future path from video sequences.
We have trained the model on the open-source naturalistic driving JAAD dataset, which is rich in behavioral annotations and real-world scenarios.
Experimental results show state-of-the-art performance on the JAAD dataset.
arXiv Detail & Related papers (2020-07-10T14:02:25Z)