A Physics Enhanced Residual Learning (PERL) Framework for Vehicle Trajectory Prediction
- URL: http://arxiv.org/abs/2309.15284v2
- Date: Thu, 21 Mar 2024 04:36:22 GMT
- Title: A Physics Enhanced Residual Learning (PERL) Framework for Vehicle Trajectory Prediction
- Authors: Keke Long, Zihao Sheng, Haotian Shi, Xiaopeng Li, Sikai Chen, Sue Ahn
- Abstract summary: PERL integrates the strengths of physics-based and data-driven methods for traffic state prediction.
It preserves the interpretability inherent to physics-based models and has reduced data requirements.
With a small dataset, PERL achieves better predictions than the physics model, the data-driven model, and the PINN model.
- Score: 5.7215490229343535
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In vehicle trajectory prediction, physics models and data-driven models are the two predominant methodologies. However, each approach presents its own challenges: physics models fall short in prediction accuracy, while data-driven models lack interpretability. To address these shortcomings, this paper proposes a novel framework, the Physics-Enhanced Residual Learning (PERL) model. PERL integrates the strengths of physics-based and data-driven methods for traffic state prediction. It comprises a physics model and a residual learning model, and its prediction is the sum of the physics model's output and a learned residual that corrects it. PERL preserves the interpretability inherent in physics-based models and has reduced data requirements compared to data-driven methods. Experiments were conducted on a real-world vehicle trajectory dataset. We propose a PERL instance with the Intelligent Driver Model (IDM) as its physics car-following model and a Long Short-Term Memory (LSTM) network as its residual learning model, and compare it with the physics car-following model, a data-driven model, and other physics-informed neural network (PINN) models. First, the results show that PERL achieves better predictions with a small dataset than the physics model, the data-driven model, and the PINN models. Second, PERL converges faster during training, reaching comparable performance with fewer training samples than the data-driven and PINN models. A sensitivity analysis further shows that PERL performs comparably when a different residual learning model or physics car-following model is substituted.
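To make the residual formulation concrete, here is a minimal sketch of a PERL-style predictor in the spirit of the paper's IDM + LSTM instance. It is illustrative only: the feature layout (speed, approach rate, spacing), the hidden size, and the IDM parameter values are assumptions, not the paper's settings.

```python
# Minimal PERL-style sketch (illustrative, not the authors' released code).
# Physics term: Intelligent Driver Model (IDM); correction: LSTM residual.
import math
import torch
import torch.nn as nn

def idm_acceleration(v, dv, s, v0=33.3, T=1.6, a=0.73, b=1.67, s0=2.0, delta=4):
    """IDM car-following acceleration.
    v: ego speed [m/s]; dv: approach rate v - v_lead [m/s]; s: gap [m].
    Parameter values are common textbook defaults, not fitted values."""
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired gap
    return a * (1 - (v / v0) ** delta - (s_star / s) ** 2)

class ResidualLSTM(nn.Module):
    """Maps a history of (v, dv, s) to a residual acceleration term."""
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):             # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # one residual per sequence

# PERL prediction = physics output + learned residual correction.
history = torch.rand(8, 50, 3)        # hypothetical (v, dv, s) histories
v, dv, s = history[:, -1, 0], history[:, -1, 1], history[:, -1, 2] + 2.0
a_phys = torch.tensor([idm_acceleration(float(vi), float(dvi), float(si))
                       for vi, dvi, si in zip(v, dv, s)])
with torch.no_grad():
    a_pred = a_phys + ResidualLSTM()(history).squeeze(-1)
```

In training, the LSTM would be fit to the gap between observed accelerations and the IDM output, so the network only has to learn what the physics model misses; this is the mechanism behind the reduced data requirement the abstract reports.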
Related papers
- Large language models, physics-based modeling, experimental measurements: the trinity of data-scarce learning of polymer properties [10.955525128731654]
Large language models (LLMs) hold promise as a fast and accurate material modeling paradigm for evaluation, analysis, and design.
We present a physics-based training pipeline that tackles the pathology of data scarcity.
arXiv Detail & Related papers (2024-07-03T02:57:40Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- An active inference model of car following: Advantages and applications [6.905724739762358]
Driver process models play a central role in the testing, verification, and development of automated and autonomous vehicle technologies.
Data-driven machine learning models are more capable than rule-based models but are limited by the need for large training datasets and their lack of interpretability.
We propose a novel car following modeling approach using active inference, which has comparable behavioral flexibility to data-driven models while maintaining interpretability.
arXiv Detail & Related papers (2023-03-27T13:39:26Z)
- Dataless Knowledge Fusion by Merging Weights of Language Models [51.8162883997512]
Fine-tuning pre-trained language models has become the prevalent paradigm for building downstream NLP models.
This creates a barrier to fusing knowledge across individual models to yield a better single model.
We propose a dataless knowledge fusion method that merges models in their parameter space.
arXiv Detail & Related papers (2022-12-19T20:46:43Z)
- IDM-Follower: A Model-Informed Deep Learning Method for Long-Sequence Car-Following Trajectory Prediction [24.94160059351764]
Most car-following models are generative and take as input only the speed, position, and acceleration of the last time step.
We implement a novel structure with two independent encoders and a self-attention decoder that sequentially predicts the following trajectories.
Numerical experiments with multiple settings on simulation and NGSIM datasets show that the IDM-Follower can improve the prediction performance.
arXiv Detail & Related papers (2022-10-20T02:24:27Z)
- Measuring Causal Effects of Data Statistics on Language Model's `Factual' Predictions [59.284907093349425]
Large amounts of training data are one of the major reasons for the high performance of state-of-the-art NLP models.
We provide a language for describing how training data influences predictions, through a causal framework.
Our framework bypasses the need to retrain expensive models and allows us to estimate causal effects based on observational data alone.
arXiv Detail & Related papers (2022-07-28T17:36:24Z)
- Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- Hybrid Physics and Deep Learning Model for Interpretable Vehicle State Prediction [75.1213178617367]
We propose a hybrid approach combining deep learning and physical motion models.
We achieve interpretability by restricting the output range of the deep neural network as part of the hybrid model.
The results show that our hybrid model can improve model interpretability with no decrease in accuracy compared to existing deep learning approaches (a one-line sketch of such an output bound appears after this list).
arXiv Detail & Related papers (2021-03-11T15:21:08Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- A Physics-Informed Deep Learning Paradigm for Car-Following Models [3.093890460224435]
We develop a family of neural network-based car-following models informed by physics-based models.
Two types of PIDL-CFM problems are studied, one to predict acceleration only and the other to jointly predict acceleration and discover model parameters.
The results demonstrate the superior performance of neural networks informed by physics over those without (a minimal sketch of this physics-informed loss appears after this list).
arXiv Detail & Related papers (2020-12-24T18:04:08Z)
- The Importance of Balanced Data Sets: Analyzing a Vehicle Trajectory Prediction Model based on Neural Networks and Distributed Representations [0.0]
We investigate the composition of training data in vehicle trajectory prediction.
We show that the models employing our semantic vector representation outperform the numerical model when trained on an adequate data set.
arXiv Detail & Related papers (2020-09-30T20:00:11Z)
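Two of the hybrid ideas listed above are easy to make concrete. First, the output-range restriction used by the hybrid vehicle-state model can be a single bounding layer; a hedged sketch, where the 2.0 m/s^2 bound is an invented value for illustration:

```python
import torch

def bounded_correction(raw, max_abs=2.0):
    """Squash a network's raw output into [-max_abs, max_abs] m/s^2 so the
    learned term can never overwhelm the physics prediction. The bound of
    2.0 is a hypothetical value chosen for illustration."""
    return max_abs * torch.tanh(raw)
```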
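Second, the PIDL formulation differs from PERL in where the physics enters: instead of adding a residual to the physics output, it penalizes disagreement with the physics model inside the training loss. A minimal sketch, with the MSE terms and the weight `lam` as assumptions rather than the paper's choices:

```python
import torch
import torch.nn.functional as F

def pidl_loss(a_nn, a_obs, a_phys, lam=0.5):
    """Data-fit term plus a physics-consistency term.
    lam is a hypothetical trade-off weight, not a value from the paper."""
    return F.mse_loss(a_nn, a_obs) + lam * F.mse_loss(a_nn, a_phys)
```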