Title: Analysis of a Navigation System based on Partially Tight Integration of IMU-Visual Odometry with Loosely Coupled GPS
Author(s): M. Sahmoudi, N. Ramuni
Published in: Proceedings of the 29th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+ 2016)
September 12 - 16, 2016
Oregon Convention Center
Portland, Oregon
Pages: 1322 - 1329
Cite this article: Sahmoudi, M., Ramuni, N., "Analysis of a Navigation System based on Partially Tight Integration of IMU-Visual Odometry with Loosely Coupled GPS," Proceedings of the 29th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+ 2016), Portland, Oregon, September 2016, pp. 1322-1329.
Abstract: Reliable pose estimation is essential to make a UAV autonomous and practical in real-life applications such as infrastructure surveying, safety and rescue, and agriculture. An accurate navigation capability is required that is robust and not solely dependent on GNSS. The goal of our proposed method is to implement a multi-sensor fusion architecture on the drone using a low-cost GNSS receiver, a low-cost MEMS IMU, and a monocular camera. The paper presents a partially tight fusion architecture of the IMU with the pose from monocular visual odometry (VO) using an extended Kalman filter (EKF): the IMU drives the state transition model, coupled with linear measurements from the VO. We further propose a loosely coupled architecture that fuses the pose estimated by the partially tight stage with GPS using a linear Kalman filter. Compared with the traditional ways of fusing inertial-vision or inertial-GPS data, the proposed method balances both approaches: it involves a two-step Kalman filter and is capable of reliable pose estimation in GNSS-harsh environments (deep urban and indoor) and during vision outages, thereby reducing the drift of the IMU. The adopted VO method does not need to estimate the 3D coordinates of the features explicitly. Instead, we exploit the epipolar geometry and the trifocal tensor between each set of three successive camera views as geometric constraints injected into the measurement model of a linear Kalman filter. This formulation is implemented over a sliding window to avoid a full SLAM solution with high complexity.
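The two-step cascade described in the abstract can be sketched in miniature. The toy below is an assumption-laden illustration, not the authors' implementation: the full state (3D pose, attitude, IMU biases) is collapsed to a 1-axis [position, velocity] state, all noise values are invented, and the EKF of the first stage reduces to a linear filter because the toy model is linear. Stage 1 corrects IMU propagation with a VO position measurement; stage 2 loosely couples the stage-1 pose with a GPS position fix.

```python
import numpy as np

# Hypothetical 1-axis constant-velocity model; all numeric values assumed.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # IMU-driven state transition
B = np.array([0.5 * dt**2, dt])           # control input: IMU acceleration
Q = 1e-4 * np.eye(2)                      # process noise (assumed)

def kf_step(x, P, z, H, R, u=0.0):
    """One predict/update cycle of a linear Kalman filter."""
    x = F @ x + B * u                     # predict using the IMU input
    P = F @ P @ F.T + Q
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Stage 1 (partially tight): VO position measurement corrects IMU propagation.
H1, R1 = np.array([[1.0, 0.0]]), np.array([[0.05]])
# Stage 2 (loosely coupled): stage-1 pose and GPS position fused together.
H2, R2 = np.array([[1.0, 0.0], [1.0, 0.0]]), np.diag([0.05, 1.0])

rng = np.random.default_rng(0)
x1, P1 = np.zeros(2), np.eye(2)
x2, P2 = np.zeros(2), np.eye(2)
pos, vel, acc = 0.0, 1.0, 0.0             # simulated straight-line motion
for _ in range(200):
    pos += vel * dt
    z_vo = pos + rng.normal(0.0, R1[0, 0] ** 0.5)
    z_gps = pos + rng.normal(0.0, R2[1, 1] ** 0.5)
    x1, P1 = kf_step(x1, P1, np.array([z_vo]), H1, R1, u=acc)
    x2, P2 = kf_step(x2, P2, np.array([x1[0], z_gps]), H2, R2, u=acc)

print(abs(x2[0] - pos))  # cascaded estimate stays close to the true position
```

The cascade mirrors the paper's rationale: if GPS drops out, stage 1 keeps producing an inertial-visual pose on its own; if VO fails, the stage-2 filter can still be driven by GPS, so neither outage leaves the IMU drifting uncorrected.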