Abstract: | Navigation using IMU and GPS sensors is well documented. Numerous papers further describe the use of imagery to assist navigation. Other papers describe using GPS and IMU to aid the preparation of georegistered imagery. Finally, there are various procedures that use only planar imagery to recreate 3D rendered scenes that can be viewed from any geometry. All of these methods mix and match sensor data to generate user spatial awareness through the computation of position and attitude (pose). This paper presents a unified theory for merging these spatial sensing modalities based upon a Markov process representation of the pose evolution. This approach is a direct extension of IMU and GPS navigation solutions that define a transition matrix between observations related to pose. This paper describes how overlapped image sequences can be cast into a similar stochastic Markov representation that merges all 3D information from camera sensors. A sequence of image frames forms a Markov process for evolving pose that is easily merged with traditional IMU and GPS stochastic models. Out-of-sequence overlapped images act as observations to the underlying Markov process, so that the complete overlapped image set is integrated in an optimal manner. Kalman filter/smoother procedures are applied so that the navigation and scene geospatial content are generated conditioned on all IMU, GPS, and image measurements. As a side benefit, the statistical accuracy of all geometry information is provided. This paper demonstrates this new approach for aircraft platforms that include a digital camera, GPS, and a tactical-grade MEMS IMU rigidly affixed to the camera. Examples of pose computation that use time-synchronized measurements from these sensors, individually and together, are presented. All processing is handled by a single recursive algorithm with selectable parameters to de-weight the various measurements. Covariance analysis is used to suggest the relative benefits of the contributing sensors. Datasets are taken from both parallel-swath mapping missions and encircling missions. |
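The fusion idea summarized in the abstract can be illustrated with a minimal sketch: a Markov (here, random-walk) pose state propagated by a transition model and updated by measurements from different sensors, each de-weighted through its own noise variance R. This scalar example is purely illustrative; the function name, noise values, and one-dimensional state are assumptions for exposition and are not the authors' implementation.

```python
# Minimal scalar Kalman filter sketch: a random-walk Markov state updated by
# measurements whose influence is controlled by their noise variance R.
# All names and numeric values are illustrative assumptions.

def kalman_step(x, P, z, R, Q=0.01):
    """One predict/update cycle for a scalar random-walk state.

    x, P : prior state estimate and its variance
    z, R : measurement and its noise variance (large R de-weights z)
    Q    : process noise added during the Markov transition
    """
    # Predict: identity transition; process noise inflates uncertainty.
    P = P + Q
    # Update: the Kalman gain balances prior variance against measurement noise.
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1.0 - K) * P
    return x, P

# Fuse a GPS-like measurement (small R, trusted) and an image-derived
# measurement (large R, de-weighted), mirroring the selectable de-weighting
# parameters described in the abstract.
x, P = 0.0, 1.0                          # initial pose estimate and variance
x, P = kalman_step(x, P, z=1.0, R=0.05)  # GPS: heavily weighted
x, P = kalman_step(x, P, z=2.0, R=5.0)   # image: lightly weighted
```

After both updates the estimate sits close to the trusted GPS measurement, with only a small pull toward the de-weighted image measurement, and the posterior variance P quantifies the accuracy of the fused result.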
Published in: |
Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013), September 16 - 20, 2013, Nashville Convention Center, Nashville, TN |
Pages: | 782 - 791 |
Cite this article: | Kain, J.E., Summerville, J., "Integration of Vision and Navigation," Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013), Nashville, TN, September 2013, pp. 782-791. |