Abstract: Obtaining a long-term accurate positioning solution in indoor environments has been an active topic in industry and academia in recent years. In the absence of information from the Global Positioning System (GPS), the inertial sensors within a smartphone can provide a relative navigation solution. However, these onboard Micro Electro Mechanical Systems (MEMS) based sensors suffer from various sensor errors that cause the inertial-only solution to deteriorate rapidly, so the inertial positioning solution must be constrained when long-term navigation is needed. GPS positions and velocities and WiFi positions are the most important forms of updates available to the inertial solution, but updates from these two sources depend on external signals and infrastructure that may not always be available. Another problem specific to smartphone-based navigation is that a user does not hold the device in one fixed orientation: the device is free to change orientation with respect to both the user and the direction of motion. To overcome these limitations, researchers are looking at other means of constraining the inertial-only solution. One commercial product that attempts to fill the niche of a truly portable, low-cost, long-term accurate indoor positioning solution is the InvenSense Positioning App (IPA) developed by InvenSense Canada. The IPA is an inertial-based system running on smartphones that provides a 3D navigation solution even in the absence of GPS information. It uses proprietary and patented algorithms to estimate the misalignment of the device with respect to the moving platform, making it agnostic to any specific device orientation. A rich source of information about the outside world is the device's camera: nearly all devices have at least one, yet it has thus far been largely neglected as a navigation aid. Parameters extracted from the stream of images are used to aid several modules within the IPA. The vision aiding module performs context classification to provide the device angle with respect to the platform, height change information in different scenarios, and static period and fidgeting detection. This information is used by the IPA in the form of misalignment, external height, and zero velocity updates. The results of integrating the vision aiding module with the IPA show significant improvement in many cases. This work is patent pending.
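To make the "zero velocity updates" mentioned in the abstract concrete, the sketch below shows one common way a static period can be detected from raw accelerometer data before a zero-velocity update is fed to a navigation filter. This is a minimal illustration assuming a simple accelerometer-magnitude variance test; the window length and thresholds are illustrative assumptions, not the IPA's proprietary or patented method.

```python
# Minimal static-period (zero-velocity) detector sketch. Assumes a level-ish,
# quiet device reads ~1 g of specific force with low variance. Thresholds are
# hypothetical tuning values, not taken from the paper.
import numpy as np

GRAVITY = 9.80665  # m/s^2

def is_static(accel_window: np.ndarray,
              var_thresh: float = 0.05,
              mean_thresh: float = 0.3) -> bool:
    """accel_window: (N, 3) accelerometer samples in m/s^2."""
    mag = np.linalg.norm(accel_window, axis=1)
    near_gravity = abs(mag.mean() - GRAVITY) < mean_thresh  # ~1 g on average
    low_variance = mag.var() < var_thresh                   # little motion
    return near_gravity and low_variance

# Usage: slide a short window (e.g. 0.5 s at 100 Hz) over the IMU stream and,
# when is_static() holds, apply a zero-velocity update in the filter.
samples = np.random.normal([0.0, 0.0, GRAVITY], 0.05, size=(50, 3))
print(is_static(samples))  # True for a quiet, level device
```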
Published in: Proceedings of the 27th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2014), September 8-12, 2014, Tampa Convention Center, Tampa, Florida
Pages: 2118-2131
Cite this article: Kazemipur, Bashir, Syed, Zainab, Georgy, Jacques, El-Sheimy, Naser, "Real-time Vision-aiding for Reliable 3D Indoor Location," Proceedings of the 27th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2014), Tampa, Florida, September 2014, pp. 2118-2131.