Fusion of 3D GIS, Vision, Inertial and Magnetic Data for Improved Urban Pedestrian Navigation and Augmented Reality Applications

Nicolas Antigny, Myriam Servières, Valérie Renaudin

Peer Reviewed

Abstract: In the context of pedestrian navigation and Augmented Reality applications in urban environments, we propose to fuse the pose estimated by a vision process, based on a precisely known 3D model, with inertial and magnetic measurements. First, this fusion updates a Pedestrian Dead-Reckoning process and improves positioning accuracy. Second, a trusted pose estimate allows 3D Geographical Information System content to be reprojected in Augmented Reality with qualified confidence. Because 3D Geographical Information System data come from many sources with inhomogeneous precision and quality, being able to qualify this 3D content is important to validate its relevance before use. An experiment was conducted along a 3 km pedestrian path in an urban environment with a sparsely known 3D model of urban furniture. It validated the contribution of sensor fusion, which improves positioning accuracy and allows the 3D Geographical Information System content to be characterized directly on-site using Augmented Reality. Performance is presented in terms of positioning accuracy in urban spaces.
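The idea of correcting a drifting Pedestrian Dead-Reckoning trajectory with an occasional absolute pose from the vision process can be illustrated with a minimal 2-D sketch. The function names, the fixed step length, and the simple weighted blend below are illustrative assumptions, not the paper's actual estimator, which fuses full poses with inertial and magnetic data.

```python
import math

def pdr_step(pos, heading_rad, step_length_m):
    """Advance a 2-D Pedestrian Dead-Reckoning position by one step."""
    x, y = pos
    return (x + step_length_m * math.cos(heading_rad),
            y + step_length_m * math.sin(heading_rad))

def fuse_vision_fix(pdr_pos, vision_pos, trust):
    """Blend the drifting PDR estimate with an absolute vision-based fix.
    trust in [0, 1]: 1 adopts the vision pose fully, 0 keeps pure PDR."""
    return tuple(trust * v + (1.0 - trust) * p
                 for p, v in zip(pdr_pos, vision_pos))

# Walk 10 steps of 0.7 m heading north-east (45 degrees) ...
pos = (0.0, 0.0)
for _ in range(10):
    pos = pdr_step(pos, math.pi / 4, 0.7)
# ... then pull the estimate toward an absolute vision-based pose
pos = fuse_vision_fix(pos, (5.0, 5.0), 0.8)
```

In the real system the vision pose is only available when a precisely modeled 3D object is observed, so PDR carries the trajectory between such updates.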
Published in: NAVIGATION: Journal of the Institute of Navigation, Volume 65, Number 3
Pages: 431 - 447