Abstract: | The foot-mounted inertial measurement unit (IMU) has been used for inertial-based pedestrian navigation in the recent literature, where zero velocity updates (ZUPT) are applied each time the user takes a step. As only the velocity error is directly observed through ZUPT, the position error still accumulates over time, which limits the wide use of this method. According to previous research, the position error exceeds 10 m within about 10 minutes. In this research, a floor-plan-based vision aiding method is proposed, from which position updates can be derived to correct the position errors in inertial-based pedestrian navigation. The proposed algorithm integrates two-dimensional camera images with the geodetic position and real scale of the floor plan to derive the three-dimensional camera position and orientation, and hence to limit the accumulated error in inertial-based pedestrian navigation. The indoor field test results indicate that the pedestrian position solution is significantly improved with the vision aiding, and the maximum error is reduced from over 15 m to less than 4 m over about 14 minutes. |
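The ZUPT mechanism summarized in the abstract can be illustrated with a minimal error-state Kalman update: the zero-velocity pseudo-measurement observes only the velocity error, which is why the position error keeps drifting until an absolute update (such as the vision/floor-plan position) is applied. The sketch below is illustrative only; the state layout, noise values, and function names are assumptions and are not taken from the paper.

    import numpy as np

    # Illustrative zero-velocity update (ZUPT) for a small error-state filter.
    # State: [position error (3), velocity error (3)]. The measurement matrix
    # maps to the velocity error only, so the position error is corrected only
    # indirectly through its correlation with velocity in the covariance P.

    def zupt_update(x, P, vel_est, sigma_zupt=0.01):
        """Apply a zero-velocity pseudo-measurement at a detected foot-down.

        x       : (6,) error-state vector [dp (3), dv (3)]
        P       : (6, 6) error-state covariance
        vel_est : (3,) current INS velocity estimate (should be ~0 when stationary)
        """
        H = np.hstack([np.zeros((3, 3)), np.eye(3)])   # observes velocity error only
        R = (sigma_zupt ** 2) * np.eye(3)               # ZUPT measurement noise
        z = np.zeros(3) - vel_est                       # innovation: 0 - v_INS

        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(6) - K @ H) @ P
        return x, P

    # Example: stationary foot with a small residual INS velocity.
    x0 = np.zeros(6)
    P0 = np.diag([1.0, 1.0, 1.0, 0.1, 0.1, 0.1])
    x1, P1 = zupt_update(x0, P0, vel_est=np.array([0.05, -0.02, 0.01]))

A vision/floor-plan aiding step of the kind proposed in the paper would add a further measurement that observes the position error directly, bounding the drift that ZUPT alone cannot remove.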
Published in: |
Proceedings of the ION 2013 Pacific PNT Meeting, April 23 - 25, 2013, Marriott Waikiki Beach Resort & Spa, Honolulu, Hawaii |
Pages: | 526 - 531 |
Cite this article: | Du, S., Huang, B., Gao, Y., "Integration of Floor Plan, Vision and Inertial Sensors for Pedestrian Navigation in Indoor Environments," Proceedings of the ION 2013 Pacific PNT Meeting, Honolulu, Hawaii, April 2013, pp. 526-531. |