Real-time Implementation of Visual-aided Inertial Navigation Using Epipolar Constraints

J-O. Nilsson, D. Zachariah, M. Jansson, and P. Händel

Abstract: A real-time implementation and the related theory of a visual-aided inertial navigation system are presented. The entire system runs on a standard laptop with off-the-shelf sensory equipment connected via standard interfaces. The visual-aiding is based on epipolar constraints derived from a finite visual memory. The navigational states are estimated with a square root sigma-point Kalman filter. An adaptive visual memory based on statistical coupling is presented and used to store and discard images selectively. Timing and temporal ordering of sensory data are estimated recursively. The computational cost and complexity of the system are described, and the implementation is discussed in terms of code structure, external libraries, and important parameters. Finally, results from a limited performance evaluation of the system are presented.
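
For context, the classical epipolar constraint underlying such visual aiding relates corresponding normalized image points x_1 and x_2 in two camera views through the relative rotation R and translation t between the views. This is standard background, not the paper's specific formulation, which is derived from the finite visual memory described above:

    x_2^\top [t]_\times R \, x_1 = 0,
    \qquad
    [t]_\times =
    \begin{pmatrix}
        0    & -t_3 &  t_2 \\
        t_3  &  0   & -t_1 \\
       -t_2  &  t_1 &  0
    \end{pmatrix}

The essential matrix E = [t]_\times R collects the relative pose, and deviations from this constraint can, under suitable assumptions, serve as measurement residuals in a Kalman-type filter such as the one described in the paper.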
Published in: Proceedings of IEEE/ION PLANS 2012
April 24 - 26, 2012
Myrtle Beach Marriott Resort & Spa
Myrtle Beach, South Carolina
Pages: 711 - 718
Cite this article: Nilsson, J-O., Zachariah, D., Jansson, M., Händel, P., "Real-time Implementation of Visual-aided Inertial Navigation Using Epipolar Constraints," Proceedings of IEEE/ION PLANS 2012, Myrtle Beach, South Carolina, April 2012, pp. 711-718. https://doi.org/10.1109/PLANS.2012.6236948