OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising
- URL: http://arxiv.org/abs/2404.02227v1
- Date: Tue, 2 Apr 2024 18:30:29 GMT
- Title: OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising
- Authors: Haichao Zhang, Yi Xu, Hongsheng Lu, Takayuki Shimizu, Yun Fu
- Abstract summary: Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
- Score: 49.86409475232849
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Trajectory prediction is fundamental in computer vision and autonomous driving, particularly for understanding pedestrian behavior and enabling proactive decision-making. Existing approaches in this field often assume precise and complete observational data, neglecting the challenges associated with out-of-view objects and the noise inherent in sensor data due to limited camera range, physical obstructions, and the absence of ground truth for denoised sensor data. Such oversights are critical safety concerns, as they can result in missing essential, non-visible objects. To bridge this gap, we present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique. Our approach denoises noisy sensor observations in an unsupervised manner and precisely maps sensor-based trajectories of out-of-sight objects into visual trajectories. This method has demonstrated state-of-the-art performance in out-of-sight noisy sensor trajectory denoising and prediction on the Vi-Fi and JRDB datasets. By enhancing trajectory prediction accuracy and addressing the challenges of out-of-sight objects, our work significantly contributes to improving the safety and reliability of autonomous driving in complex environments. Our work represents the first initiative towards Out-Of-Sight Trajectory prediction (OOSTraj), setting a new benchmark for future research. The code is available at \url{https://github.com/Hai-chao-Zhang/OOSTraj}.
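The abstract's two-stage idea, first denoising noisy sensor observations and then mapping them into the camera's visual frame, can be illustrated with a toy sketch. This is not the paper's model: the moving-average smoother below is a crude stand-in for the authors' unsupervised learned denoiser, and the pinhole projection with an assumed intrinsic matrix `K` is a stand-in for their learned vision-positioning mapping.

```python
import numpy as np

def denoise_trajectory(traj, window=5):
    """Unsupervised smoothing via a centered moving average
    (placeholder for the paper's learned denoiser)."""
    kernel = np.ones(window) / window
    return np.stack(
        [np.convolve(traj[:, d], kernel, mode="same") for d in range(traj.shape[1])],
        axis=1,
    )

def project_to_image(traj_xyz, K):
    """Map 3-D sensor positions to 2-D pixel coordinates with an
    assumed pinhole camera model (intrinsic matrix K)."""
    homog = (K @ traj_xyz.T).T            # (N, 3) homogeneous image points
    return homog[:, :2] / homog[:, 2:3]   # perspective divide

rng = np.random.default_rng(0)
# Synthetic out-of-sight object: moves laterally ~8 m in front of the camera.
clean = np.stack([np.linspace(0.0, 10.0, 50),
                  np.linspace(0.0, 2.0, 50),
                  np.full(50, 8.0)], axis=1)
noisy = clean + rng.normal(0.0, 0.3, clean.shape)  # simulated sensor noise

K = np.array([[800.0, 0.0, 320.0],                 # assumed intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

denoised = denoise_trajectory(noisy)
pixels = project_to_image(denoised, K)             # visual trajectory
print(pixels.shape)  # (50, 2)
```

The denoised visual trajectory would then feed a downstream predictor; in the paper this whole mapping is learned end to end rather than fixed as above.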
Related papers
- Navigation under uncertainty: Trajectory prediction and occlusion reasoning with switching dynamical systems [36.18758962312406]
We propose a conceptual framework unifying trajectory prediction and occlusion reasoning under the same class of structured probabilistic generative model.
We then present some initial experiments illustrating its capabilities using the open dataset.
arXiv Detail & Related papers (2024-10-14T16:03:41Z)
- HEADS-UP: Head-Mounted Egocentric Dataset for Trajectory Prediction in Blind Assistance Systems [47.37573198723305]
HEADS-UP is the first egocentric dataset collected from head-mounted cameras.
We propose a semi-local trajectory prediction approach to assess collision risks between blind individuals and pedestrians.
arXiv Detail & Related papers (2024-09-30T14:26:09Z) - Layout Sequence Prediction From Noisy Mobile Modality [53.49649231056857]
Trajectory prediction plays a vital role in understanding pedestrian movement for applications such as autonomous driving and robotics.
Current trajectory prediction models depend on long, complete, and accurately observed sequences from visual modalities.
We propose LTrajDiff, a novel approach that treats objects obstructed or out of sight as equally important as those with fully visible trajectories.
arXiv Detail & Related papers (2023-10-09T20:32:49Z) - Implicit Occupancy Flow Fields for Perception and Prediction in
Self-Driving [68.95178518732965]
A self-driving vehicle (SDV) must be able to perceive its surroundings and predict the future behavior of other traffic participants.
Existing works either perform object detection followed by trajectory prediction for the detected objects, or predict dense occupancy and flow grids for the whole scene.
This motivates our unified approach to perception and future prediction that implicitly represents occupancy and flow over time with a single neural network.
arXiv Detail & Related papers (2023-08-02T23:39:24Z) - Trajectory Forecasting from Detection with Uncertainty-Aware Motion
Encoding [121.66374635092097]
Trajectories obtained from object detection and tracking are inevitably noisy.
We propose a trajectory predictor directly based on detection results without relying on explicitly formed trajectories.
arXiv Detail & Related papers (2022-02-03T09:09:56Z) - Uncertainty-Aware Vehicle Orientation Estimation for Joint
Detection-Prediction Models [12.56249869551208]
Orientation is an important property for downstream modules of an autonomous system.
We present a method that extends the existing models that perform joint object detection and motion prediction.
In addition, the approach is able to quantify prediction uncertainty, outputting the probability that the inferred orientation is flipped.
arXiv Detail & Related papers (2020-11-05T21:59:44Z) - Towards robust sensing for Autonomous Vehicles: An adversarial
perspective [82.83630604517249]
It is of primary importance that the decisions made by an autonomous vehicle's sensing pipeline are robust to perturbations.
Adversarial perturbations are purposefully crafted alterations of the environment or of the sensory measurements.
A careful evaluation of the vulnerabilities of their sensing system(s) is necessary in order to build and deploy safer systems.
arXiv Detail & Related papers (2020-07-14T05:25:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.