Abstract: With the rapid development and popularity of smartphones and the mobile internet, there has been explosive growth in demand for precise location-based services in areas such as public safety, emergency rescue, the Internet of Things, and smart city construction. In outdoor environments, these services are underpinned by the Global Navigation Satellite System (GNSS). Indoor environments, however, are the main scenario for pedestrians, and there GNSS is significantly degraded or unavailable due to signal fading and multipath effects. To accomplish positioning and navigation in indoor areas, a variety of technologies have been widely studied. This paper presents a visual-inertial localization algorithm that uses the off-the-shelf sensors in smartphones, taking full advantage of visual and inertial measurements to obtain a high-precision positioning result. First, the vision part adopts a sparse direct algorithm that estimates the 6-degree-of-freedom ego-motion by minimizing the photometric error, in contrast to most standard visual odometry methods, which rely on image features and minimize the reprojection error. Then, an extended Kalman filter (EKF) framework integrates the inertial information into the estimation process to improve the stability of the localization system, and the visual scale is also recovered using the inertial information. Finally, to verify the performance of the proposed visual-inertial navigation algorithm, several indoor tests with smartphones were conducted. The results show that the proposed method achieves position accuracy comparable to state-of-the-art indoor pedestrian navigation technologies and validate the feasibility of high-precision indoor navigation using the low-cost inertial sensors and rolling-shutter camera embedded in smartphones.
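To make the contrast drawn in the abstract concrete, the following is a minimal sketch of the photometric-error objective that a sparse direct method minimizes. The symbols (relative pose T, sparse pixel set P, intensity images I_{k-1} and I_k, projection function π, and per-pixel depth d_p) are introduced here for illustration and are not taken from the paper:

\[
T^{*} = \arg\min_{T} \sum_{p \in \mathcal{P}} \left\| I_{k}\!\left( \pi\!\left( T \, \pi^{-1}(p, d_{p}) \right) \right) - I_{k-1}(p) \right\|^{2}
\]

By comparison, a feature-based visual odometry method matches image features u_i to 3-D landmarks X_i and minimizes the reprojection error instead:

\[
T^{*} = \arg\min_{T} \sum_{i} \left\| u_{i} - \pi\!\left( T \, X_{i} \right) \right\|^{2}
\]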
Published in:
Proceedings of the 30th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2017), September 25-29, 2017, Oregon Convention Center, Portland, Oregon
Pages: 3311-3320
Cite this article:
Wang, Zhaosheng, Qian, Jiuchao, Wang, Yuze, Liu, Peilin, Yu, Wenxian, "A Sparse Direct Visual-Inertial Method for Pedestrian Navigation Using Smartphone Sensors," Proceedings of the 30th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2017), Portland, Oregon, September 2017, pp. 3311-3320.
https://doi.org/10.33012/2017.15239