Title: Stereo Vision-Based Simultaneous Localization and Mapping with Ranging Aid
Author(s): Young-Hee Lee, Chen Zhu, Gabriele Giorgi, Christoph Guenther
Published in: Proceedings of IEEE/ION PLANS 2018
April 23 - 26, 2018
Hyatt Regency Hotel
Monterey, CA
Pages: 404 - 409
Cite this article: Lee, Young-Hee, Zhu, Chen, Giorgi, Gabriele, Guenther, Christoph, "Stereo Vision-Based Simultaneous Localization and Mapping with Ranging Aid," Proceedings of IEEE/ION PLANS 2018, Monterey, CA, April 2018, pp. 404-409.
Abstract: We propose fusing stereo visual odometry with ranging measurements for Simultaneous Localization And Mapping (SLAM). The algorithm runs the two basic processes of feature- and keyframe-based stereo visual odometry (tracking and local mapping) while saving all local keyframes, map points, and visual constraints in a global map database, together with the ranging constraints available for the keyframes. At the end of the sequence, the visual odometry state estimates are fused with the ranging measurements to mitigate the errors that inherently accumulate in the process and to achieve global consistency. We formulate the fusion as a simple graphical model and perform least-squares estimation with the sparse Levenberg-Marquardt algorithm, minimizing the sum of the squared re-projection and distance errors over all constraints defined in the global graph. The proposed algorithm is evaluated both qualitatively and quantitatively on a real stereo image dataset with synthetically generated distance measurements corrupted by additive white Gaussian noise. The experimental results show that the proposed SLAM algorithm effectively compensates the bias that accumulates in visual odometry. Furthermore, the global accuracy of the trajectory estimate is comparable to that of stereo vision-only SLAM with loop closure.
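The fusion described in the abstract can be illustrated with a toy analogue: keyframe positions estimated by least squares from drifting odometry increments plus range measurements to a known anchor. This is only a sketch of the idea, not the paper's method; the paper optimizes full 6-DOF keyframe poses and map points against stereo re-projection errors with a sparse Levenberg-Marquardt solver, whereas the example below uses 2D positions, simple displacement constraints, and SciPy's dense Levenberg-Marquardt implementation. All variable names and noise levels are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
anchor = np.array([5.0, 5.0])  # ranging-station position (assumed known)

# Ground-truth keyframe trajectory (20 keyframes in 2D).
true = np.cumsum(rng.normal(1.0, 0.1, size=(20, 2)), axis=0)

# Synthetic measurements: noisy odometry increments between consecutive
# keyframes, and noisy ranges from each keyframe to the anchor.
odom = np.diff(true, axis=0) + rng.normal(0, 0.05, size=(19, 2))
ranges = np.linalg.norm(true - anchor, axis=1) + rng.normal(0, 0.02, size=20)

def residuals(x):
    """Stacked residuals: odometry (visual) and ranging constraints."""
    p = x.reshape(-1, 2)
    r_odom = (np.diff(p, axis=0) - odom).ravel()         # visual constraints
    r_rng = np.linalg.norm(p - anchor, axis=1) - ranges  # ranging constraints
    return np.concatenate([r_odom, r_rng])

# Dead-reckoned initial guess: accumulates the odometry drift,
# analogous to running visual odometry alone.
x0 = np.vstack([true[0], true[0] + np.cumsum(odom, axis=0)]).ravel()

sol = least_squares(residuals, x0, method="lm")  # Levenberg-Marquardt
est = sol.x.reshape(-1, 2)
```

Jointly minimizing both residual types pulls the dead-reckoned trajectory back toward the ranging constraints, mirroring how the paper's global graph optimization compensates the drift of vision-only estimation.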