Mars Rover Localization Based on A2G Obstacle Distribution Pattern Matching
- URL: http://arxiv.org/abs/2210.03398v1
- Date: Fri, 7 Oct 2022 08:29:48 GMT
- Title: Mars Rover Localization Based on A2G Obstacle Distribution Pattern Matching
- Authors: Lang Zhou (1), Zhitai Zhang (1), Hongliang Wang (1) ((1) College of
Surveying and Geo-Informatics, Tongji University)
- Abstract summary: In NASA's Mars 2020 mission, the Ingenuity helicopter is carried together with the rover.
Traditional image matching methods will struggle to obtain valid image correspondence.
An algorithm combining image-based rock detection and rock distribution pattern matching is used to acquire A2G imagery correspondence.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Rover localization is one of the prerequisites for large-scale rover
exploration. In NASA's Mars 2020 mission, the Ingenuity helicopter is carried
together with the rover, which is capable of obtaining high-resolution imagery
of Mars terrain, and it is possible to perform localization based on
aerial-to-ground (A2G) imagery correspondence. However, considering the
low-texture nature of the Mars terrain, and large perspective changes between
UAV and rover imagery, traditional image matching methods will struggle to
obtain valid image correspondence. In this paper we propose a novel pipeline
for Mars rover localization. An algorithm combining image-based rock detection
and rock distribution pattern matching is used to acquire A2G imagery
correspondence, thus establishing the rover position in a UAV-generated ground
map. Feasibility of this method is evaluated on sample data from a Mars
analogue environment. The proposed method can serve as a reliable aid in
future Mars missions.
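The core step described in the abstract, fixing the rover's position in the UAV-generated map from matched rock patterns, can be sketched as a rigid 2-D alignment between rock positions seen from the rover and rocks detected in the aerial map. The snippet below is a minimal illustration, not the paper's algorithm: it assumes rock correspondences have already been established (the pattern-matching step) and recovers the rotation and translation with a standard Kabsch/Procrustes fit; all point values and names are hypothetical.

```python
import numpy as np

def estimate_rigid_transform(ground_pts, aerial_pts):
    """Least-squares rotation R and translation t mapping ground-frame
    rock positions onto their aerial-map counterparts (Kabsch algorithm).
    Assumes row i of ground_pts corresponds to row i of aerial_pts."""
    gc = ground_pts.mean(axis=0)                  # centroids
    ac = aerial_pts.mean(axis=0)
    H = (ground_pts - gc).T @ (aerial_pts - ac)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = ac - R @ gc
    return R, t

# Toy data: aerial rock map is the ground pattern rotated 30 deg and shifted.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([5.0, -2.0])
ground = np.array([[0, 0], [2, 1], [1, 3], [4, 2], [3, 5]], dtype=float)
aerial = ground @ R_true.T + t_true

R, t = estimate_rigid_transform(ground, aerial)
rover_ground = np.array([2.5, 2.5])   # rover position in its own frame
rover_in_map = R @ rover_ground + t   # localized in the UAV-generated map
print(np.round(rover_in_map, 3))      # → [5.915 1.415]
```

With noisy detections and unknown correspondences, a real pipeline would first match rock constellations (e.g. by their relative-distance patterns) and wrap this fit in an outlier-rejection loop such as RANSAC.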
Related papers
- Structure-Invariant Range-Visual-Inertial Odometry [17.47284320862407]
This work introduces a novel range-visual-inertial odometry system tailored for the Mars Science Helicopter mission.
Our system extends the state-of-the-art xVIO framework by fusing consistent range information with visual and inertial measurements.
We demonstrate that our range-VIO approach estimates terrain-relative velocity meeting the stringent mission requirements.
arXiv Detail & Related papers (2024-09-06T21:49:10Z)
- Mapping "Brain Terrain" Regions on Mars using Deep Learning [0.0]
A set of critical areas may have seen cycles of ice thawing in the relatively recent past in response to periodic changes in the obliquity of Mars.
In this work, we use convolutional neural networks to detect surface regions containing "Brain Coral" terrain.
We use large images (100-1000 megapixels) from the Mars Reconnaissance Orbiter to search for these landforms at resolutions close to a few tens of centimeters per pixel.
arXiv Detail & Related papers (2023-11-21T02:24:52Z)
- Boosting 3-DoF Ground-to-Satellite Camera Localization Accuracy via Geometry-Guided Cross-View Transformer [66.82008165644892]
We propose a method to increase the accuracy of a ground camera's location and orientation by estimating the relative rotation and translation between the ground-level image and its matched/retrieved satellite image.
Experimental results demonstrate that our method significantly outperforms the state-of-the-art.
arXiv Detail & Related papers (2023-07-16T11:52:27Z)
- MaRF: Representing Mars as Neural Radiance Fields [1.4680035572775534]
MaRF is a framework able to synthesize the Martian environment using several collections of images from rover cameras.
It addresses key challenges in planetary surface exploration, such as planetary geology, simulated navigation, and shape analysis.
In the experimental section, we demonstrate the environments created from actual Mars datasets captured by Curiosity rover, Perseverance rover and Ingenuity helicopter.
arXiv Detail & Related papers (2022-12-03T18:58:00Z)
- Satellite Image Based Cross-view Localization for Autonomous Vehicle [59.72040418584396]
This paper shows that by using an off-the-shelf high-definition satellite image as a ready-to-use map, we are able to achieve cross-view vehicle localization with satisfactory accuracy.
Our method is validated on KITTI and Ford Multi-AV Seasonal datasets as ground view and Google Maps as the satellite view.
arXiv Detail & Related papers (2022-07-27T13:16:39Z)
- 6D Camera Relocalization in Visually Ambiguous Extreme Environments [79.68352435957266]
We propose a novel method to reliably estimate the pose of a camera given a sequence of images acquired in extreme environments such as deep seas or extraterrestrial terrains.
Our method achieves comparable performance with state-of-the-art methods on the indoor benchmark (7-Scenes dataset) using only 20% training data.
arXiv Detail & Related papers (2022-07-13T16:40:02Z)
- CroCo: Cross-Modal Contrastive learning for localization of Earth Observation data [62.96337162094726]
It is of interest to localize a ground-based LiDAR point cloud on remote sensing imagery.
We propose a contrastive learning-based method that trains on DEM and high-resolution optical imagery.
In the best scenario, the Top-1 score of 0.71 and Top-5 score of 0.81 are obtained.
arXiv Detail & Related papers (2022-04-14T15:55:00Z)
- Towards Robust Monocular Visual Odometry for Flying Robots on Planetary Missions [49.79068659889639]
Ingenuity, which just landed on Mars, will mark the beginning of a new era of exploration unhindered by traversability constraints.
We present an advanced robust monocular odometry algorithm that uses efficient optical flow tracking.
We also present a novel approach to estimate the current risk of scale drift based on a principal component analysis of the relative translation information matrix.
arXiv Detail & Related papers (2021-09-12T12:52:20Z)
- Rover Relocalization for Mars Sample Return by Virtual Template Synthesis and Matching [48.0956967976633]
We consider the problem of rover relocalization in the context of the notional Mars Sample Return campaign.
In this campaign, a rover (R1) needs to be capable of autonomously navigating and localizing itself within an area of approximately 50 x 50 m.
We propose a visual localizer that exhibits robustness to the relatively barren terrain that we expect to find in relevant areas.
arXiv Detail & Related papers (2021-03-05T00:18:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.