Wuhan Univ. J. Nat. Sci.
Volume 28, Number 1, February 2023
Page(s): 20-28
DOI: https://doi.org/10.1051/wujns/2023281020
Published online: 17 March 2023
Computer Science
CLC number: TP 391
RRVPE: A Robust and Real-Time Visual-Inertial-GNSS Pose Estimator for Aerial Robot Navigation
1 College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, Jiangsu, China
2 School of Mathematics & Physics, Anhui University of Technology, Maanshan 243000, Anhui, China
3 Electric Power Research Institute of Guizhou Power Grid Co., Ltd., Guiyang 550002, Guizhou, China
† To whom correspondence should be addressed. E-mail: YangZhong@nuaa.edu.cn
Received: 25 July 2022
Self-localization and orientation estimation are essential capabilities for mobile robot navigation. This article presents a robust and real-time visual-inertial-GNSS (Global Navigation Satellite System) tightly coupled pose estimation (RRVPE) method for aerial robot navigation. The aerial robot carries a front-facing stereo camera for self-localization and an RGB-D camera to generate a 3D voxel map. In addition, a GNSS receiver continuously provides pseudorange, Doppler frequency shift, and coordinated universal time (UTC) pulse signals to the pose estimator. The proposed system leverages the Lucas-Kanade algorithm to track Shi-Tomasi features in each video frame, and the local factor graph is solved within a bounded, fixed-size window, which greatly reduces the computational cost of the nonlinear optimization procedure. The proposed robot pose estimator achieves camera-rate (30 Hz) performance on the aerial robot's companion computer. We thoroughly evaluated the RRVPE system in both simulated and real-world conditions, and the results demonstrate significant advantages over state-of-the-art robot pose estimators.
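The Lucas-Kanade tracking of Shi-Tomasi corners mentioned above can be illustrated with a minimal, single-level, single-iteration sketch. This is not the authors' implementation (which would typically use a pyramidal, iterative tracker); the window size, the synthetic Gaussian-bump image, and the function name `lucas_kanade_step` are illustrative assumptions.

```python
import numpy as np

def lucas_kanade_step(img1, img2, pt, win=9):
    """Estimate the displacement of one feature point between two grayscale
    frames by solving the Lucas-Kanade least-squares system over a local
    window (single pyramid level, single iteration)."""
    x, y = int(round(pt[0])), int(round(pt[1]))
    r = win // 2
    # Spatial gradients of frame 1 (central differences).
    Ix = (np.roll(img1, -1, axis=1) - np.roll(img1, 1, axis=1)) / 2.0
    Iy = (np.roll(img1, -1, axis=0) - np.roll(img1, 1, axis=0)) / 2.0
    It = img2 - img1  # temporal gradient
    sl = (slice(y - r, y + r + 1), slice(x - r, x + r + 1))
    # Stack per-pixel gradients into the over-determined system A d = -It.
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d  # estimated (dx, dy)

# Synthetic check: a smooth Gaussian bump translated by (0.5, 0.2) pixels.
X, Y = np.meshgrid(np.arange(64, dtype=float), np.arange(64, dtype=float))
bump = lambda x, y: np.exp(-((x - 32.0) ** 2 + (y - 30.0) ** 2) / 50.0)
img1 = bump(X, Y)
img2 = bump(X - 0.5, Y - 0.2)  # content shifted by +0.5 in x, +0.2 in y
d = lucas_kanade_step(img1, img2, (30.0, 28.0))  # d ~ (0.5, 0.2)
```

For sub-pixel motion of a smooth patch, one least-squares solve already recovers the displacement closely; larger motions are what motivate the pyramidal, iterative variant used in practice.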
Key words: computer vision / visual-inertial-GNSS (Global Navigation Satellite System) pose estimation / real-time autonomous navigation / sensor fusion / robotics
Biography: ZHANG Chi, male, Ph.D. candidate, research direction: robot navigation. E-mail: laozhang@nuaa.edu.cn
Foundation item: Supported by the Guizhou Provincial Science and Technology Projects ([2020]2Y044), the Science and Technology Projects of China Southern Power Grid Co. Ltd. (066600KK52170074), and the National Natural Science Foundation of China (61473144)
© Wuhan University 2023
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.