Event Camera-based Visual Odometry for Dynamic Motion Tracking of a
Legged Robot Using Adaptive Time Surface
- URL: http://arxiv.org/abs/2305.08962v1
- Date: Mon, 15 May 2023 19:03:45 GMT
- Authors: Shifan Zhu, Zhipeng Tang, Michael Yang, Erik Learned-Miller, Donghyun
Kim
- Abstract summary: Event cameras offer high temporal resolution and dynamic range, which can eliminate the issue of blurred RGB images during fast movements.
We introduce an adaptive time surface (ATS) method that addresses the whiteout and blackout issues in conventional time surfaces.
Lastly, we propose a nonlinear pose optimization formulation that simultaneously performs 3D-2D alignment on both RGB-based and event-based maps and images.
- Score: 5.341864681049579
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Our paper proposes a direct sparse visual odometry method that combines event
and RGB-D data to estimate the pose of agile legged robots during dynamic
locomotion and acrobatic behaviors. Event cameras offer high temporal
resolution and dynamic range, which can eliminate the problem of blurred RGB
images during fast movements. This unique strength holds potential for
accurate pose estimation of agile legged robots, which has been a challenging
problem to tackle. Our framework leverages the benefits of both RGB-D and event
cameras to achieve robust and accurate pose estimation, even during dynamic
maneuvers such as the jumping and landing of a quadruped robot, the
Mini-Cheetah. Our major contributions are threefold. Firstly, we introduce an
adaptive time surface (ATS) method that addresses the whiteout and blackout
issues in conventional time surfaces by formulating pixel-wise decay rates
based on scene complexity and motion speed. Secondly, we develop an effective
pixel selection method that samples directly from event data and filters the
samples through the ATS, enabling us to pick pixels on distinct features.
Lastly, we propose a nonlinear pose optimization formulation that
simultaneously performs 3D-2D alignment on both RGB-based and event-based maps
and images, allowing the algorithm to fully exploit the benefits of both data
streams. We extensively evaluate the performance of our framework on both
public datasets and our own quadruped robot dataset, demonstrating its
effectiveness in accurately estimating the pose of agile robots during dynamic
movements.
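The core of the ATS idea is to replace the single global decay constant of a conventional time surface with a per-pixel decay rate. The sketch below, in Python with NumPy, illustrates one plausible reading of that idea; the event-density proxy for scene complexity and motion speed, the clipping bounds, and all parameter names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def adaptive_time_surface(last_event_ts, t_now, event_density,
                          tau_min=5e-3, tau_max=5e-2):
    """Render an 8-bit time-surface image from per-pixel last-event times.

    last_event_ts : HxW array, timestamp (s) of the most recent event per pixel
    t_now         : time (s) at which the surface is rendered
    event_density : HxW array, recent event count per pixel, used here as a
                    hypothetical proxy for local scene complexity/motion speed
    """
    # Busy, fast-moving regions fire many events; give them a short decay so
    # edges stay sharp (mitigating whiteout). Quiet regions get a long decay
    # so older edges remain visible (mitigating blackout).
    density = event_density / (event_density.max() + 1e-9)
    tau = tau_max - (tau_max - tau_min) * density  # pixel-wise decay rate

    age = np.clip(t_now - last_event_ts, 0.0, None)
    surface = np.exp(-age / tau)  # values in (0, 1]
    return (255.0 * surface).astype(np.uint8)
```

Setting tau_min equal to tau_max recovers a conventional fixed-decay time surface. For the third contribution, a hedged form of the joint objective (the notation is ours, not the paper's) is a robust sum of direct 3D-2D alignment residuals over both streams: given map points $p_i$ with reference intensities $r_i$ and time-surface values $s_j$, the pose $T \in SE(3)$ is estimated by minimizing

$$E(T) = \sum_{i} \rho\big(I(\pi(T p_i)) - r_i\big) + \lambda \sum_{j} \rho\big(S(\pi(T p_j)) - s_j\big),$$

where $\pi$ is the camera projection, $S$ the current ATS image, $\rho$ a robust kernel, and $\lambda$ a weight balancing the RGB and event terms.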
Related papers
- Redundancy-Aware Camera Selection for Indoor Scene Neural Rendering [54.468355408388675]
We build a similarity matrix that incorporates both the spatial diversity of the cameras and the semantic variation of the images.
We apply a diversity-based sampling algorithm to optimize the camera selection.
We also develop a new dataset, IndoorTraj, which includes long and complex camera movements captured by humans in virtual indoor environments.
arXiv Detail & Related papers (2024-09-11T08:36:49Z)
- Line-based 6-DoF Object Pose Estimation and Tracking With an Event Camera [19.204896246140155]
Event cameras possess remarkable attributes such as high dynamic range, low latency, and resilience against motion blur.
We propose a line-based robust pose estimation and tracking method for planar or non-planar objects using an event camera.
arXiv Detail & Related papers (2024-08-06T14:36:43Z)
- Camera Motion Estimation from RGB-D-Inertial Scene Flow [9.192660643226372]
We introduce a novel formulation for camera motion estimation that integrates RGB-D images and inertial data through scene flow.
Our goal is to accurately estimate the camera motion in a rigid 3D environment, along with the state of the inertial measurement unit (IMU).
arXiv Detail & Related papers (2024-04-26T08:42:59Z)
- DynaMoN: Motion-Aware Fast and Robust Camera Localization for Dynamic Neural Radiance Fields [71.94156412354054]
We propose Dynamic Motion-Aware Fast and Robust Camera Localization for Dynamic Neural Radiance Fields (DynaMoN).
DynaMoN handles dynamic content during initial camera pose estimation and uses statics-focused ray sampling for fast and accurate novel-view synthesis.
We extensively evaluate our approach on two real-world dynamic datasets, the TUM RGB-D dataset and the BONN RGB-D Dynamic dataset.
arXiv Detail & Related papers (2023-09-16T08:46:59Z)
- EventTransAct: A video transformer-based framework for event-camera based action recognition [52.537021302246664]
Event cameras offer new opportunities for action recognition compared to standard RGB videos.
In this study, we employ a computationally efficient model, namely the video transformer network (VTN), which initially acquires spatial embeddings per event-frame.
In order to better adapt the VTN to the sparse and fine-grained nature of event data, we design an Event-Contrastive Loss ($\mathcal{L}_{EC}$) and event-specific augmentations.
arXiv Detail & Related papers (2023-08-25T23:51:07Z)
- Self-Supervised Scene Dynamic Recovery from Rolling Shutter Images and Events [63.984927609545856]
An Event-based Inter/intra-frame Compensator (E-IC) is proposed to predict the per-pixel dynamics between arbitrary time intervals.
We show that the proposed method achieves state-of-the-art results, with remarkable performance for event-based RS2GS inversion in real-world scenarios.
arXiv Detail & Related papers (2023-04-14T05:30:02Z)
- PUCK: Parallel Surface and Convolution-kernel Tracking for Event-Based Cameras [4.110120522045467]
Event cameras can guarantee fast visual sensing in dynamic environments, but require a tracking algorithm that can keep up with the high data rate induced by the robot's ego-motion.
We introduce a novel tracking method that leverages the Exponential Reduced Ordinal Surface (EROS) data representation to decouple event-by-event processing and tracking.
We propose the task of tracking the air hockey puck sliding on a surface, with the future aim of controlling the iCub robot to reach the target precisely and on time.
arXiv Detail & Related papers (2022-05-16T13:23:52Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Motion-from-Blur: 3D Shape and Motion Estimation of Motion-blurred Objects in Videos [115.71874459429381]
We propose a method for jointly estimating the 3D motion, 3D shape, and appearance of highly motion-blurred objects from a video.
Experiments on benchmark datasets demonstrate that our method outperforms previous methods for fast moving object deblurring and 3D reconstruction.
arXiv Detail & Related papers (2021-11-29T11:25:14Z)
- Event-based Stereo Visual Odometry [42.77238738150496]
We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the temporal consistency of stereo event-based data while using a simple and efficient representation.
arXiv Detail & Related papers (2020-07-30T15:53:28Z)
- Exploiting Event Cameras for Spatio-Temporal Prediction of Fast-Changing Trajectories [7.13400854198045]
This paper investigates trajectory prediction for robotics to improve the interaction of robots with moving targets.
We apply state-of-the-art machine learning, specifically Long Short-Term Memory (LSTM) architectures.
arXiv Detail & Related papers (2020-01-05T14:37:28Z)