Performance Assessment of an Ultra-Tightly Coupled Vision-Aided INS/GNSS Navigation System

B. Priot, C. Peillon, V. Calmettes, M. Sahmoudi

Abstract: In this paper, we propose a multi-sensor navigation system for harsh GNSS environments that deeply integrates a MEMS-based INS, GNSS signals, and a vision system consisting of a single camera used for inertial sensor calibration. The images delivered by the camera are exploited to calibrate the INS sensors, which is necessary to aid the GNSS receiver tracking loops efficiently by providing a consistent estimate of the acceleration along the satellite-receiver lines of sight (LOS). The camera acts as an optical odometer, achieving INS calibration in a GNSS-denied environment by tracking a set of points (i.e., landmarks) in the images. The key idea of the proposed method lies in the way the different sensor measurements are combined and the covariance matrices are managed, taking into account the prior knowledge available on the system. A second contribution is an improved image processing methodology. The adopted method, based on the SURF algorithm, exploits a set of landmarks by relating their positions in the image to the platform position and attitude. First, we optimize the initial detection of landmarks when the navigation system operates in harsh GNSS environments. The direction and distance of these landmarks are initialized and then estimated throughout the tracking process. When new points are acquired, the state model is constructed from the vehicle position and attitude estimates available at the acquisition epoch. Moreover, by exploiting the a priori covariance of the position and attitude estimates, derived from the INS solution, the algorithm is able to bound the search area of each landmark and to eliminate outliers. Another important contribution of this study is the experimental performance assessment of the system, based on a measurement campaign.
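The outlier elimination described above, bounding each landmark's search area using the a priori covariance, is commonly implemented as a chi-square (Mahalanobis) gate on the innovation between the predicted and measured image positions. The following is a minimal Python sketch of that idea; the function name `gate_landmark` and the way the 2x2 pixel-space covariance `P_px` is obtained are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def gate_landmark(predicted_px, measured_px, P_px, threshold=9.21):
    """Chi-square gate (2 DOF, ~99% confidence) on a landmark match.

    predicted_px : (2,) pixel coordinates predicted from the INS-propagated
                   position/attitude estimate and the camera model
    measured_px  : (2,) pixel coordinates of the matched SURF feature
    P_px         : (2, 2) covariance of the predicted pixel position, obtained
                   by projecting the a priori position/attitude covariance
                   through the camera model (assumed given here)
    Returns True if the match is accepted, False if flagged as an outlier.
    """
    innovation = measured_px - predicted_px
    # Squared Mahalanobis distance of the innovation
    d2 = innovation @ np.linalg.solve(P_px, innovation)
    return bool(d2 <= threshold)
```

A match whose Mahalanobis distance exceeds the gate is discarded before the filter update, which is what bounds the admissible area of each landmark in the image.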
The data was collected using a terrestrial vehicle equipped with time-synchronized systems including:
- A Synchronized Position Attitude Navigation (SPAN) system, based on a DGPS receiver combined with a tactical-grade IMU, used as the reference navigation platform;
- A camera mounted to view the main forward direction of the vehicle.
The recorded data is post-processed using a Matlab toolbox developed by the Navigation team of the SCAN Lab at ISAE, Toulouse, France. The GNSS signal is modeled considering 7 satellites in an open environment. A MEMS-based IMU is simulated by degrading the tactical-grade IMU measurements. A substantial effort is devoted to using each sensor measurement (GNSS, IMU, image sensor) efficiently and to assessing the performance of the system under realistic conditions. The algorithms were initially validated by simulations; the goal of this work is to ensure coherent interaction and synchronization between the modules of the system and to test its functionality on experimental data. The algorithm shows good performance on both simulated and processed data. When the carrier-to-noise density ratio (C/N0) drops to low levels, the combined use of the MEMS-based INS and vision provides the vehicle attitude and position with good accuracy. The main benefit of this system is that it enables inertial sensor calibration and consequently facilitates the tracking of weak GNSS signals.
Published in: Proceedings of the 2011 International Technical Meeting of The Institute of Navigation
January 24 - 26, 2011
Catamaran Resort Hotel
San Diego, CA
Pages: 652 - 661
Cite this article: Priot, B., Peillon, C., Calmettes, V., Sahmoudi, M., "Performance Assessment of an Ultra-Tightly Coupled Vision-Aided INS/GNSS Navigation System," Proceedings of the 2011 International Technical Meeting of The Institute of Navigation, San Diego, CA, January 2011, pp. 652-661.