A Neuro-Symbolic Approach for Enhanced Human Motion Prediction
- URL: http://arxiv.org/abs/2304.11740v1
- Date: Sun, 23 Apr 2023 20:11:40 GMT
- Title: A Neuro-Symbolic Approach for Enhanced Human Motion Prediction
- Authors: Sariah Mghames, Luca Castri, Marc Hanheide, Nicola Bellotto
- Abstract summary: We propose a neuro-symbolic approach for human motion prediction (NeuroSyM).
NeuroSyM weights the interactions in the neighbourhood differently by leveraging an intuitive technique for spatial representation called Qualitative Trajectory Calculus (QTC).
Experimental results show that the NeuroSyM approach outperforms the baseline architectures in prediction accuracy in most cases.
- Score: 5.742409080817885
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Reasoning on the context of human beings is crucial for many real-world
applications, especially those deploying autonomous systems (e.g. robots).
In this paper, we present a new approach for context reasoning to further
advance the field of human motion prediction. We therefore propose a
neuro-symbolic approach for human motion prediction (NeuroSyM), which weights
the interactions in the neighbourhood differently by leveraging an intuitive
technique for spatial representation called Qualitative Trajectory Calculus
(QTC). The proposed approach is experimentally tested on medium- and long-term
time horizons using two state-of-the-art architectures, one of which is a
baseline for human motion prediction and the other a baseline for generic
multivariate time-series prediction. Six datasets of challenging crowded
scenarios, collected from both fixed and mobile cameras, were used for testing.
Experimental results show that the NeuroSyM approach outperforms the baseline
architectures in prediction accuracy in most cases.
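As a concrete illustration of the QTC idea, the sketch below (not the authors' implementation) computes the basic QTC_B relation between a target agent and each neighbour from their last two observed positions and turns it into an interaction weight; the weight table, threshold, and function names are illustrative assumptions, since the abstract does not specify how the weighting is derived.

```python
# Minimal sketch (not the authors' implementation): derive QTC_B relations
# between a target agent and its neighbours and map them to interaction
# weights. The weight table below is purely illustrative.
import numpy as np

def qtc_b(p_k_prev, p_k, p_l_prev, p_l, eps=1e-3):
    """Return the QTC_B pair (q1, q2) for agents k and l.

    q1: '-' if k moves towards l, '+' if it moves away, '0' otherwise.
    q2: the same symbol for l with respect to k.
    """
    def sym(prev_pos, curr_pos, ref):
        d_prev = np.linalg.norm(prev_pos - ref)
        d_curr = np.linalg.norm(curr_pos - ref)
        if d_curr < d_prev - eps:
            return '-'   # moving towards the reference point
        if d_curr > d_prev + eps:
            return '+'   # moving away from the reference point
        return '0'       # no significant radial motion
    return sym(p_k_prev, p_k, p_l_prev), sym(p_l_prev, p_l, p_k_prev)

# Hypothetical weighting: mutually approaching pairs count most,
# mutually receding pairs least.
QTC_WEIGHT = {('-', '-'): 1.0, ('-', '+'): 0.7, ('+', '-'): 0.7,
              ('-', '0'): 0.8, ('0', '-'): 0.8, ('0', '0'): 0.5,
              ('+', '0'): 0.3, ('0', '+'): 0.3, ('+', '+'): 0.1}

def neighbour_weights(target_traj, neighbour_trajs):
    """Weight each neighbour from the last two frames of 2-D positions."""
    weights = []
    for nb in neighbour_trajs:
        rel = qtc_b(target_traj[-2], target_traj[-1], nb[-2], nb[-1])
        weights.append(QTC_WEIGHT.get(rel, 0.5))
    return np.asarray(weights)

# Example: one target and two neighbours, each observed for two frames (x, y).
target = np.array([[0.0, 0.0], [0.5, 0.0]])
neighbours = [np.array([[2.0, 0.0], [1.5, 0.0]]),   # approaching head-on
              np.array([[0.0, 2.0], [0.0, 2.5]])]   # receding
print(neighbour_weights(target, neighbours))        # -> [1.  0.1]
```

In a NeuroSyM-style pipeline, such weights would modulate how strongly each neighbour's observed trajectory influences the prediction for the target agent.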
Related papers
- Social-Transmotion: Promptable Human Trajectory Prediction [65.80068316170613]
Social-Transmotion is a generic Transformer-based model that exploits diverse and numerous visual cues to predict human behavior.
Our approach is validated on multiple datasets, including JTA, JRDB, Pedestrians and Cyclists in Road Traffic, and ETH-UCY.
arXiv Detail & Related papers (2023-12-26T18:56:49Z)
- Qualitative Prediction of Multi-Agent Spatial Interactions [5.742409080817885]
We present and benchmark three new approaches to model and predict multi-agent interactions in dense scenes.
The proposed solutions take into account static and dynamic context to predict individual interactions.
They exploit input- and temporal-attention mechanisms and are tested on medium- and long-term time horizons.
arXiv Detail & Related papers (2023-06-30T18:08:25Z)
- Neural Foundations of Mental Simulation: Future Prediction of Latent Representations on Dynamic Scenes [3.2744507958793143]
We combine a goal-driven modeling approach with dense neurophysiological data and human behavioral readouts to impinge on this question.
Specifically, we construct and evaluate several classes of sensory-cognitive networks to predict the future state of rich, ethologically-relevant environments.
We find strong differentiation across these model classes in their ability to predict neural and behavioral data both within and across diverse environments.
arXiv Detail & Related papers (2023-05-19T15:56:06Z)
- Investigating Pose Representations and Motion Contexts Modeling for 3D Motion Prediction [63.62263239934777]
We conduct an in-depth study on various pose representations with a focus on their effects on the motion prediction task.
We propose a novel RNN architecture termed AHMR (Attentive Hierarchical Motion Recurrent network) for motion prediction.
Our approach outperforms state-of-the-art methods in short-term prediction and yields substantially better long-term predictions.
arXiv Detail & Related papers (2021-12-30T10:45:22Z)
- Learning Human Motion Prediction via Stochastic Differential Equations [19.30774202476477]
We propose a novel approach to modeling the motion prediction problem based on differential equations and path integrals.
It achieves a 12.48% accuracy improvement over current state-of-the-art methods on average.
arXiv Detail & Related papers (2021-12-21T11:55:13Z)
- Dyadic Human Motion Prediction [119.3376964777803]
We introduce a motion prediction framework that explicitly reasons about the interactions of two observed subjects.
Specifically, we achieve this by introducing a pairwise attention mechanism that models the mutual dependencies in the motion history of the two subjects.
This allows us to preserve the long-term motion dynamics in a more realistic way and more robustly predict unusual and fast-paced movements.
arXiv Detail & Related papers (2021-12-01T10:30:40Z)
- Probabilistic Human Motion Prediction via A Bayesian Neural Network [71.16277790708529]
We propose a probabilistic model for human motion prediction in this paper.
Given an observed motion sequence, our model can generate several future motions.
We extensively validate our approach on the large-scale benchmark dataset Human3.6M.
arXiv Detail & Related papers (2021-07-14T09:05:33Z)
- Long Term Motion Prediction Using Keyposes [122.22758311506588]
We argue that, to achieve long-term forecasting, predicting human pose at every time instant is unnecessary.
We call such poses "keyposes", and approximate complex motions by linearly interpolating between subsequent keyposes.
We show that learning the sequence of such keyposes allows us to predict very long term motion, up to 5 seconds in the future.
arXiv Detail & Related papers (2020-12-08T20:45:51Z)
- Motion Prediction Using Temporal Inception Module [96.76721173517895]
We propose a Temporal Inception Module (TIM) to encode human motion.
Our framework produces input embeddings with convolutional layers, using different kernel sizes for different input lengths (see the sketch after this list).
Experimental results on the standard motion prediction benchmarks Human3.6M and the CMU motion capture dataset show that our approach consistently outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-10-06T20:26:01Z)
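As a companion to the Temporal Inception Module entry above, here is a minimal sketch of the general idea it describes: parallel 1-D convolutions with different kernel sizes applied to observation windows of different lengths, concatenated into a single motion embedding. The window/kernel pairs, channel sizes, and pooling are illustrative assumptions, not the published TIM architecture.

```python
# Minimal sketch of an Inception-style temporal encoder (illustrative, not the
# published TIM): parallel 1-D convolutions with different kernel sizes over
# observation windows of different lengths, concatenated into one embedding.
import torch
import torch.nn as nn

class TemporalInceptionSketch(nn.Module):
    def __init__(self, in_dim, out_dim=32,
                 windows=((10, 3), (25, 5), (50, 7))):  # (frames, kernel) pairs, assumed
        super().__init__()
        self.windows = windows
        self.branches = nn.ModuleList(
            [nn.Conv1d(in_dim, out_dim, kernel_size=k, padding=k // 2)
             for _, k in windows]
        )

    def forward(self, x):
        # x: (batch, time, in_dim) pose or trajectory features
        x = x.transpose(1, 2)                        # -> (batch, in_dim, time)
        feats = []
        for (length, _), conv in zip(self.windows, self.branches):
            chunk = x[..., -length:]                 # most recent `length` frames
            feats.append(conv(chunk).mean(dim=-1))   # temporal average pooling
        return torch.cat(feats, dim=-1)              # (batch, out_dim * branches)

# Example: embed a batch of 50-frame observations of 2-D positions.
emb = TemporalInceptionSketch(in_dim=2)(torch.randn(4, 50, 2))
print(emb.shape)  # torch.Size([4, 96])
```

Under this reading, short windows with small kernels capture recent fast motion, while longer windows with larger kernels summarise slower context.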