Enhancing Accuracy in Visual SLAM by Tightly Coupling Sparse Ranging Measurements Between Two Rovers

Chen Zhu, Gabriele Giorgi, Young-Hee Lee, Christoph Günther

Abstract: Compared with stand-alone rovers, cooperative swarms of robots equipped with cameras enable a more efficient exploration of the environment and are more robust against malfunctions of an individual platform. VSLAM (Visual Simultaneous Localization and Mapping) techniques have been developed in recent years to estimate the trajectory of vehicles and to simultaneously reconstruct a map of the surroundings using visual cues. This work proposes a tightly coupled sensor-fusion approach based on the combined use of stereo cameras and sparse ranging measurements between two dynamic rovers in planar motion. The Cramér-Rao lower bound (CRLB) of the rover pose estimator using the fusion algorithm is calculated. Both the lower bound and the simulation results show to what extent the proposed fusion method outperforms the vision-only approach.
Published in: 2018 IEEE/ION Position, Location and Navigation Symposium (PLANS)
April 23 - 26, 2018
Hyatt Regency Hotel
Monterey, CA
Pages: 440 - 446
Cite this article: Zhu, Chen, Giorgi, Gabriele, Lee, Young-Hee, Günther, Christoph, "Enhancing Accuracy in Visual SLAM by Tightly Coupling Sparse Ranging Measurements Between Two Rovers," 2018 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, April 2018, pp. 440-446.