Joint GPS and Vision Direct Position Estimation

Yuting Ng and Grace Xingxin Gao

Abstract: GPS and vision position sensing are complementary. In open-sky environments, GPS sensing is superior, with strong signal reception, while vision sensing is degraded by a lack of unique features. In urban environments, GPS sensing is degraded by obstruction and multipath, while vision sensing is superior, with many unique urban features. To better leverage the complementary nature of the two sensing modes, we propose Joint GPS and Vision Direct Positioning (GPS+V DP). GPS+V DP achieves deep integration between the two sensing modes, directly estimating the receiver position from the entire raw GPS signal and from vision features extracted from the raw image. GPS+V DP consists of two synchronous lines of processing: GPS DP and Vision DP. GPS DP searches for the composite signal replica that gives the highest correlation against the observed GPS signal. This best-matched composite signal replica is most likely generated from the optimal receiver parameters of 3D position, clock bias, 3D velocity and clock drift. Vision DP searches for the geo-tagged reference image that gives the lowest composite feature distance against the observed image. This best-matched image is most likely generated from the optimal receiver 3D position and attitude. The measurements from both GPS DP and Vision DP are concatenated and used to directly estimate and track the receiver parameters. We implemented GPS+V DP using our research platform (PyGNSS) and an open-source computer vision library (OpenCV). We tested our GPS+V DP receiver architecture on experimental data collected on campus. We demonstrate the functionality of our algorithm through our experimental results.
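The joint search the abstract describes can be illustrated with a minimal sketch: score each candidate receiver state by its GPS signal-replica correlation (maximized) and its vision feature distance against a geo-tagged reference (minimized), then pick the jointly best candidate. All function names, the weighting scheme, and the toy data below are illustrative assumptions, not the paper's PyGNSS/OpenCV implementation.

```python
# Hypothetical sketch of joint GPS+Vision direct positioning: evaluate each
# candidate receiver state against both sensing modes and return the candidate
# with the best combined score. Toy data only; not the authors' implementation.

def gps_correlation(replica, observed):
    """Correlation of a candidate composite signal replica against the
    observed GPS signal (higher = better match)."""
    return sum(r * o for r, o in zip(replica, observed))

def vision_distance(ref_features, observed_features):
    """Composite feature distance between a geo-tagged reference image's
    features and the observed image's features (lower = better match)."""
    return sum(abs(c - o) for c, o in zip(ref_features, observed_features))

def joint_dp(candidates, observed_signal, observed_features, w=1.0):
    """Direct position estimation over a discrete candidate set.

    `candidates` maps a candidate position label to a tuple of
    (signal replica, reference-image features). The weight `w` (an assumed
    tuning knob) trades off the two sensing modes."""
    best, best_score = None, float("-inf")
    for pos, (replica, ref_features) in candidates.items():
        score = (gps_correlation(replica, observed_signal)
                 - w * vision_distance(ref_features, observed_features))
        if score > best_score:
            best, best_score = pos, score
    return best

# Toy example: candidate "B" matches both the observed signal and the image.
candidates = {
    "A": ([1, -1, 1], [0.9, 0.1]),
    "B": ([1, 1, -1], [0.2, 0.8]),
}
print(joint_dp(candidates, [1, 1, -1], [0.25, 0.75]))  # prints: B
```

In the paper the search is over continuous receiver parameters (3D position, clock bias, velocity, clock drift, attitude) and the two measurement streams are concatenated for tracking; the discrete argmax above only conveys the direct-positioning idea of scoring raw observations against candidate-generated predictions.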
Published in: Proceedings of IEEE/ION PLANS 2016
April 11 - 14, 2016
Hyatt Regency Hotel
Savannah, GA
Pages: 380 - 385
Cite this article: Ng, Yuting, Gao, Grace Xingxin, "Joint GPS and Vision Direct Position Estimation," Proceedings of IEEE/ION PLANS 2016, Savannah, GA, April 2016, pp. 380-385.