Vision-based Real-time Estimation of Smartphone Heading and Misalignment

B. Kazemipur, Z. Syed, J. Georgy, N. El-Sheimy

Abstract: Modern mobile phones are powerful devices equipped with an array of technologies of interest to the world of pedestrian navigation. On-board inertial sensors allow phones to be used for navigation in areas with limited or no GPS signal availability. An inherent problem with inertial navigation in the absence of GPS signals, i.e., in indoor environments, is that the low-cost MEMS sensors used to calculate the user's position suffer from errors that accumulate over time. These accumulated errors cause the navigation solution to drift with time even with appropriate error modeling. This drift exists in all scenarios where MEMS sensors are used for navigation, and without external updates it renders the navigation solution useless over long periods of indoor navigation. Another problem with smartphone-based navigation is that a user does not keep a phone precisely fixed in a specific orientation. As users move through their environment and interact with the device, the device orientation and user orientation are free to change with respect to one another; common examples include texting, on ear, in pocket, and on a belt. Estimating the device heading and the user heading, especially when the device heading is changing with respect to the user's heading, is a challenging task, and errors here can significantly reduce the accuracy of the navigation solution.

Traditionally, accelerometers, gyroscopes, and magnetometers have been used to estimate the heading needed to generate a navigation solution. Nowadays nearly all phones are equipped with cameras, some forward facing, some rear facing, and many with both. As a result of this seemingly ubiquitous feature, there has been increasing research into vision-aided inertial navigation systems. One promising area of research is whether image/video processing can be used to aid heading estimation. So far, no work has been done on estimating misalignment using vision-based techniques; this is a major part of this paper.

This paper presents a novel method of estimating a device's absolute heading and the heading misalignment between the device and the person using time-synchronized images. The algorithm relies on edge detection and the calculation of vanishing points and lines in successive images. The algorithm is implemented on Android smartphones, where the orientation information is provided in real time as updates to the Trusted Portable Navigator (T-PN) Free Motion developed by Trusted Positioning Inc. The T-PN provides a navigation solution regardless of changes in the phone's orientation by calculating these orientation angles with patented algorithms in real time and using them as corrections for the user's attitude and position. Different image processing techniques are used for different scenarios. For scenarios in which the phone is held in landscape/portrait mode, the vanishing point method is implemented to detect absolute heading changes. Additionally, the heading change estimation in different modes, such as portrait to landscape, translates to different misalignments with respect to the fixed axes of the camera.
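As a rough, hypothetical illustration of the vanishing-point idea described above (this is a minimal sketch under assumed parameters, not the paper's T-PN implementation), the following Python/OpenCV code extracts line segments with Canny and probabilistic Hough transforms, votes for a dominant vanishing point from pairwise segment intersections, and converts its horizontal offset into a yaw angle relative to the corridor axis. The thresholds, the pinhole-camera model, and the parameters cx and focal_px are assumptions chosen for illustration.

    # Sketch only: corridor vanishing point from Hough segments, then camera yaw.
    import cv2
    import numpy as np

    def vanishing_point(gray):
        """Return the (u, v) point most consistent with the detected line segments."""
        edges = cv2.Canny(gray, 50, 150)
        segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=40, maxLineGap=10)
        if segs is None or len(segs) < 2:
            return None
        # Represent each segment as an infinite line in homogeneous coordinates.
        lines = [np.cross([x1, y1, 1.0], [x2, y2, 1.0])
                 for x1, y1, x2, y2 in segs[:, 0, :]]
        # Intersect all line pairs and take the median as a crude robust estimate.
        pts = []
        for i in range(len(lines)):
            for j in range(i + 1, len(lines)):
                p = np.cross(lines[i], lines[j])
                if abs(p[2]) > 1e-6:      # skip (nearly) parallel pairs
                    pts.append(p[:2] / p[2])
        return np.median(np.array(pts), axis=0) if pts else None

    def heading_from_vp(vp, cx, focal_px):
        """Yaw of the optical axis relative to the corridor direction (radians),
        assuming a pinhole camera with principal point cx and focal length in pixels."""
        return np.arctan2(vp[0] - cx, focal_px)

Tracking how this yaw changes between successive frames gives the absolute heading change used as an update; the median-of-intersections step is only one simple way to make the vanishing-point estimate robust to spurious edges.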
For other scenarios, images from the front and/or rear facing camera are examined for prominent edges and other features that can be tracked as the user moves. For example, walking through a hallway with the edge of the wall and floor or wall and ceiling visible (e.g., when the phone is in an “on ear” orientation) should yield a series of images with a prominent edge at a fixed angle. The orientation of these lines with respect to the fixed camera axes indicates the phone’s orientation, which can be used as the misalignment of the device with respect to the moving platform. This is one example, but similar techniques are used to determine phone orientation in other common scenarios. The detected misalignments and any estimated values of absolute heading are then fed into the T-PN Free Motion as external updates.

Results are presented for the following cases: (1) phone in portrait/landscape (“texting mode”), (2) phone on ear (“calling mode”), and (3) phone attached to a belt clip. For each of these use cases, misalignment estimation is performed while walking indoors in both wide and narrow hallways. In calling mode, the phone is first kept in an orientation where neither floor-wall nor wall-ceiling edges are visible, then tilted so that the floor-wall edge is visible, and the tests are repeated with the wall-ceiling edge visible. The belt case simply shows the results of vision aiding when the phone is vertically attached to a belt clip. All cases are assessed in dim and bright lighting conditions to show the robustness of the proposed technique to such varying conditions. Additionally, open areas (with no clearly definable edges) are included in the testing procedure to determine the conditions under which vision-based misalignment estimation fails.

The performance of the misalignment estimation module is demonstrated by comparing the navigation solution that uses orientation and misalignment updates from the presented vision-based technique with a navigation solution without these updates. The latter is provided by the original T-PN code, while the former is provided by the vision-based techniques using the smartphone cameras to update the T-PN. The device used in real-time testing is the Samsung Galaxy SIII running Android 4.0, which features tri-axial gyroscopes, tri-axial accelerometers, tri-axial magnetometers, a barometer, a GNSS chipset, and forward-facing and rear-facing cameras. The paper concludes by comparing the T-PN-only heading and misalignment estimation with the heading and misalignment estimation obtained when the vision sensors update the T-PN.
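For the hallway-edge misalignment idea described above, a similarly hypothetical sketch (again an illustration under assumed Canny/Hough thresholds, not the paper's code) estimates the angle of the most prominent floor-wall or wall-ceiling edge with respect to the camera's horizontal axis; that angle is the kind of cue that could serve as a device-to-platform misalignment observation.

    # Sketch only: length-weighted angle of the dominant straight edge.
    import cv2
    import numpy as np

    def dominant_edge_angle(gray):
        """Angle (radians) of the most prominent straight edge relative to the
        camera x-axis; None if no reliable edge is found."""
        edges = cv2.Canny(gray, 50, 150)
        segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=5)
        if segs is None:
            return None
        angles, weights = [], []
        for x1, y1, x2, y2 in segs[:, 0, :]:
            angles.append(np.arctan2(y2 - y1, x2 - x1))
            weights.append(np.hypot(x2 - x1, y2 - y1))   # longer edges count more
        angles = np.array(angles)
        weights = np.array(weights)
        # Doubled-angle circular mean handles the 180-degree ambiguity of
        # undirected lines and keeps long, prominent edges dominant.
        return 0.5 * np.arctan2(np.sum(weights * np.sin(2 * angles)),
                                np.sum(weights * np.cos(2 * angles)))

A near-constant returned angle over successive frames would indicate a stable device orientation relative to the walking direction, while no detected segments (e.g., in open areas or dim lighting) corresponds to the failure conditions examined in the tests above.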
Published in: Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013)
September 16 - 20, 2013
Nashville Convention Center, Nashville, Tennessee
Pages: 505 - 510
Cite this article: Kazemipur, B., Syed, Z., Georgy, J., El-Sheimy, N., "Vision-based Real-time Estimation of Smartphone Heading and Misalignment," Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013), Nashville, TN, September 2013, pp. 505-510.