Quantifying Feature Association Error in Camera-based Positioning

Chen Zhu, Mathieu Joerger, Michael Meurer

Peer Reviewed

Abstract: Camera-based visual navigation techniques can provide six degrees-of-freedom estimates of position and orientation (or pose), and can be implemented at low cost in applications including autonomous driving, indoor positioning, and drone landing. However, feature matching errors may occur when associating measured features in camera images with mapped features in a landmark database, especially when repetitive patterns are in view. A typical example of repetitive patterns is regularly spaced windows on building walls. Quantifying the data association risk and its impact on navigation system integrity is essential in safety-critical applications, yet the literature on vision-based navigation integrity is sparse. This work aims to quantify and bound the integrity risk caused by incorrect associations in visual navigation using extended Kalman filters.
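To illustrate the association problem the abstract describes, the following minimal sketch (not taken from the paper; the function, thresholds, and data are hypothetical) matches detected image features to reprojected map landmarks by nearest neighbor and flags matches that become ambiguous when landmarks are regularly spaced, as with windows on a facade.

```python
# Hypothetical illustration of nearest-neighbor feature-to-landmark association.
# Not the authors' method: a generic sketch showing how regularly spaced
# landmarks plus a small pose error can make the wrong landmark nearly as
# close as the correct one, which is the source of association risk.
import numpy as np

def associate_nearest(features, projected_landmarks, gate=25.0, ambiguity_margin=5.0):
    """Match each detected feature (pixel coordinates) to the closest projected
    landmark within a gating radius; flag the match as ambiguous when the
    second-best candidate is within `ambiguity_margin` pixels of the best."""
    matches = []
    for i, f in enumerate(features):
        d = np.linalg.norm(projected_landmarks - f, axis=1)
        order = np.argsort(d)
        best, second = order[0], order[1]
        if d[best] < gate:
            # A runner-up almost as close as the winner means the association
            # could easily be wrong, which is what drives integrity risk.
            ambiguous = (d[second] - d[best]) < ambiguity_margin
            matches.append((i, int(best), bool(ambiguous)))
    return matches

# Regularly spaced "windows" along a wall, reprojected into the image plane.
landmarks_px = np.array([[100.0 + 40.0 * k, 200.0] for k in range(5)])
# Detected features shifted by a pose error that pushes them toward neighbors.
features_px = landmarks_px + np.array([18.0, 0.5])

for feat_idx, lm_idx, ambiguous in associate_nearest(features_px, landmarks_px):
    print(f"feature {feat_idx} -> landmark {lm_idx}, ambiguous={ambiguous}")
```

With the illustrative 40-pixel landmark spacing and 18-pixel offset above, every interior feature has a runner-up landmark only a few pixels farther than its nearest one, so the matches are flagged as ambiguous; a larger pose error would flip them to outright wrong associations.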
Published in: 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS)
April 20 - 23, 2020
Hilton Portland Downtown
Portland, Oregon
Pages: 967 - 972
Cite this article: Zhu, Chen, Joerger, Mathieu, Meurer, Michael, "Quantifying Feature Association Error in Camera-based Positioning," 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS), Portland, Oregon, April 2020, pp. 967-972. https://doi.org/10.1109/PLANS46316.2020.9109919