Probabilistic Symmetry for Multi-Agent Dynamics
- URL: http://arxiv.org/abs/2205.01927v3
- Date: Thu, 18 May 2023 19:29:18 GMT
- Title: Probabilistic Symmetry for Multi-Agent Dynamics
- Authors: Sophia Sun, Robin Walters, Jinxi Li, Rose Yu
- Abstract summary: We propose a novel deep dynamics model, Probabilistic Equivariant Continuous COnvolution (PECCO) for probabilistic prediction of multi-agent trajectories.
PECCO shows significant improvements in accuracy and calibration compared to non-equivariant baselines.
- Score: 18.94585103009698
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning multi-agent dynamics is a core AI problem with broad applications in
robotics and autonomous driving. While most existing works focus on
deterministic prediction, producing probabilistic forecasts to quantify
uncertainty and assess risks is critical for downstream decision-making tasks
such as motion planning and collision avoidance. Multi-agent dynamics often
contains internal symmetry. By leveraging symmetry, specifically rotation
equivariance, we can improve not only the prediction accuracy but also
uncertainty calibration. We introduce Energy Score, a proper scoring rule, to
evaluate probabilistic predictions. We propose a novel deep dynamics model,
Probabilistic Equivariant Continuous COnvolution (PECCO) for probabilistic
prediction of multi-agent trajectories. PECCO extends equivariant continuous
convolution to model the joint velocity distribution of multiple agents. It
uses dynamics integration to propagate the uncertainty from velocity to
position. On both synthetic and real-world datasets, PECCO shows significant
improvements in accuracy and calibration compared to non-equivariant baselines.
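The abstract names two concrete ingredients: the Energy Score, a standard proper scoring rule for evaluating probabilistic forecasts, and dynamics integration that carries velocity uncertainty into position uncertainty. As an illustrative sketch only (not PECCO's actual implementation; the function names, array shapes, and Euler step below are assumptions), the usual sample-based Energy Score estimator and a simple Monte Carlo velocity-to-position propagation look like this:

```python
import numpy as np

def energy_score(samples: np.ndarray, observation: np.ndarray) -> float:
    """Sample-based estimate of the Energy Score, a proper scoring rule.

    ES(P, y) = E||X - y|| - 0.5 * E||X - X'||, with X, X' drawn i.i.d. from
    the predictive distribution P. Lower is better; the expected score is
    minimized by the true distribution.

    samples:     (m, d) draws from the predictive distribution
                 (e.g. flattened agent positions at one time step).
    observation: (d,) realized outcome y.
    """
    m = samples.shape[0]
    # Mean distance from predictive samples to the observation.
    term1 = np.mean(np.linalg.norm(samples - observation, axis=1))
    # Mean pairwise distance among predictive samples (spread penalty).
    diffs = samples[:, None, :] - samples[None, :, :]        # (m, m, d)
    term2 = np.sum(np.linalg.norm(diffs, axis=-1)) / (2 * m * m)
    return float(term1 - term2)

def propagate_velocity_samples(pos0: np.ndarray,
                               vel_samples: np.ndarray,
                               dt: float) -> np.ndarray:
    """Euler-integrate sampled velocity rollouts into position rollouts,
    so uncertainty over velocities induces uncertainty over positions.

    pos0:        (n_agents, 2) current positions.
    vel_samples: (m, T, n_agents, 2) sampled velocity trajectories.
    Returns:     (m, T, n_agents, 2) sampled position trajectories.
    """
    return pos0[None, None] + dt * np.cumsum(vel_samples, axis=1)

# Toy usage: 200 velocity samples for 3 agents over 10 steps.
rng = np.random.default_rng(0)
vel = rng.normal(0.0, 0.3, size=(200, 10, 3, 2))
pos = propagate_velocity_samples(np.zeros((3, 2)), vel, dt=0.1)
# Score the final-step forecast for agent 0 against a realized point.
print(energy_score(pos[:, -1, 0, :], np.array([0.05, -0.02])))
```

The pairwise term makes the naive estimator O(m^2) in the number of samples; for large sample counts one would typically subsample pairs, but the small-m case above is enough to convey the metric.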
Related papers
- Uncertainty-Aware Pedestrian Trajectory Prediction via Distributional Diffusion [26.715578412088327]
We present a model-agnostic uncertainty-aware pedestrian trajectory prediction framework.
Unlike previous studies, we translate the predictive uncertainty into explicit distributions, allowing the model to generate plausible future trajectories.
Our framework is compatible with different neural net architectures.
arXiv Detail & Related papers (2023-03-15T04:58:43Z) - Scalable Dynamic Mixture Model with Full Covariance for Probabilistic
Traffic Forecasting [16.04029885574568]
We propose a dynamic mixture of zero-mean Gaussian distributions for the time-varying error process.
The proposed method can be seamlessly integrated into existing deep-learning frameworks with only a few additional parameters to be learned.
We evaluate the proposed method on a traffic speed forecasting task and find that our method not only improves model performance but also provides interpretable temporal correlation structures.
arXiv Detail & Related papers (2022-12-10T22:50:00Z) - Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z) - Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic
Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-03T08:32:13Z) - Heterogeneous-Agent Trajectory Forecasting Incorporating Class
Uncertainty [54.88405167739227]
We present HAICU, a method for heterogeneous-agent trajectory forecasting that explicitly incorporates agents' class probabilities.
We additionally present PUP, a new challenging real-world autonomous driving dataset.
We demonstrate that incorporating class probabilities in trajectory forecasting significantly improves performance in the face of uncertainty.
arXiv Detail & Related papers (2021-04-26T10:28:34Z) - Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Probabilistic electric load forecasting through Bayesian Mixture Density
Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated.
arXiv Detail & Related papers (2020-12-23T16:21:34Z) - Trajectory Prediction using Equivariant Continuous Convolution [21.29323790182332]
Trajectory prediction is a critical part of many AI applications, for example, the safe operation of autonomous vehicles.
We propose a novel model, Equivariant Continuous COnvolution (ECCO), for improved trajectory prediction.
On both vehicle and pedestrian trajectory datasets, ECCO attains competitive accuracy with significantly fewer parameters.
arXiv Detail & Related papers (2020-10-21T23:18:42Z) - Estimation of Accurate and Calibrated Uncertainties in Deterministic
models [0.8702432681310401]
We devise a method to transform a deterministic prediction into a probabilistic one.
We show that, in doing so, one has to compromise between the accuracy and the reliability (calibration) of such a model.
We show several examples both with synthetic data, where the underlying hidden noise can accurately be recovered, and with large real-world datasets.
arXiv Detail & Related papers (2020-03-11T04:02:56Z) - Ambiguity in Sequential Data: Predicting Uncertain Futures with
Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.