
Session C2: Vision/Integrated Navigation Systems

Research on SINS/Vision Non-linear Navigation Algorithm with Gaussian Particle Filter
Yu Yong-jun, Zheng Kan, Zhang Xiang, Chen Shuai, Xu Rui, Nanjing University of Science and Technology (NJUST), China
Location: Windjammer

1. Introduction
In recent years, flying robots have received significant attention from the robotics community. However, fully autonomous operation requires flying robots to operate at low altitude or indoors, where GPS signals are often shadowed. With advances in feature tracking and processing algorithms, and with the reduced cost and size of cameras, visual SLAM has been investigated extensively in recent decades as a tool for robotic navigation and mapping. Vehicle platforms equipped with cameras have been widely used for autonomous navigation in GPS-denied environments.
However, all visual SLAM methods share several key weaknesses.
First, a key step of feature-based visual SLAM is estimating the relative camera pose between each frame pair. A series of "n-point" (n-correspondence) algorithms has been proposed for this purpose. The 5-point algorithm is the most commonly used for a calibrated camera, but it is not always guaranteed to yield a stable estimate. Furthermore, detecting and removing outliers with Random Sample Consensus (RANSAC) is expensive when the minimal sample contains five points.
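To make the cost argument concrete, the following minimal Python sketch (not taken from the paper) uses the standard RANSAC iteration bound N = log(1 - p) / log(1 - w^s) to compare 5-point and 4-point minimal samples under an assumed inlier ratio w:

```python
import math

def ransac_iterations(inlier_ratio: float, sample_size: int, confidence: float = 0.99) -> int:
    """Standard RANSAC bound on the number of random minimal samples needed."""
    return math.ceil(math.log(1.0 - confidence) /
                     math.log(1.0 - inlier_ratio ** sample_size))

w = 0.5  # assumed inlier ratio of 50%
print("5-point minimal sample:", ransac_iterations(w, 5))  # 146 iterations
print("4-point minimal sample:", ransac_iterations(w, 4))  # 72 iterations
```

Fewer points per hypothesis means fewer hypotheses are needed for the same confidence, which is one motivation for the 4-point formulation presented later.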
Second, visual SLAM systems usually run at low sample rates, which limits their usability in fast, real-time applications. Moreover, the current pose is computed by integrating all previous poses, so the error grows without bound.
Third, combining visual and inertial measurements with filters such as the Kalman Filter (KF), EKF, and UKF has long been a popular means of addressing common robotics tasks such as visual odometry and SLAM. However, because a flying robot is a highly nonlinear system, KF-based methods cannot effectively solve the nonlinear filtering problem that arises from the dynamic characteristics of the vehicle and its external environment.
In this paper, we propose an information-fusion autonomous navigation algorithm based on a strapdown inertial navigation system (SINS) and a binocular visual (BV) navigation system. First, a 4-point algorithm is derived that uses four feature-point correspondences from an image pair together with the roll and pitch angles from the SINS. Second, a new BV-SINS integrated localization algorithm using a Gaussian Particle Filter (GPF) is presented, in which the position from the binocular visual navigation system is used as the measurement. Simulation results show that the algorithm significantly improves the accuracy of the attitude and pose estimates.
2. SINS/Vision Non-linear Navigation Algorithm Using the 4-Point Algorithm and GPF
First, the scheme of the BV-SINS integrated navigation system is presented.
2.1 4-point Algorithm
In this section, the formulation of the 4-point algorithm using two known orientation angles from the SINS is presented. We show that the relative camera pose can be estimated from only four image feature correspondences.
In our system, the roll and pitch angles are known, so the total number of degrees of freedom is reduced from 5 to 3. Given four or more point correspondences, the pose estimation problem is expressed as an optimization problem, which is essentially equivalent to an eigenvalue problem. However, the smallest eigenvalue is difficult to compute directly, and complex eigenvalues may arise.
To avoid the eigenvalue computation, this paper proposes a new cost function. Since the determinant of a square matrix equals the product of all its eigenvalues, the optimization problem can be solved by the resultant-based method, also known as the hidden variable method.
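The abstract does not give the resultant-based solver itself. Purely as a hedged illustration of the 3-DoF reduction, the sketch below compensates the known roll and pitch and then fits the remaining yaw and translation direction by ordinary nonlinear least squares on the algebraic epipolar error; the rotation conventions and helper names (gravity_align, epipolar_residuals) are assumptions for illustration, not the paper's method.

```python
import numpy as np
from scipy.optimize import least_squares

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def gravity_align(bearings, roll, pitch):
    """Rotate unit bearing vectors with the SINS roll/pitch so only yaw remains unknown.
    The rotation order (pitch about y, then roll about x) is an assumed convention."""
    return (rot_y(pitch) @ rot_x(roll) @ bearings.T).T

def skew(t):
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

def epipolar_residuals(params, p, q):
    """Algebraic epipolar error q_i^T E p_i with E = [t]_x R_z(yaw), ||t|| = 1."""
    yaw, theta, phi = params  # yaw plus spherical angles of the translation direction
    t = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    E = skew(t) @ rot_z(yaw)
    return np.einsum('ij,jk,ik->i', q, E, p)

# p, q: (N, 3) gravity-aligned unit bearings in the two frames, N >= 4
# sol = least_squares(epipolar_residuals, x0=np.array([0.0, np.pi / 2, 0.0]), args=(p, q))
```

With roll and pitch removed, only three parameters remain (yaw plus the two angles of the translation direction), so four correspondences already give an over-determined least-squares problem.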
2.2 SINS/Vision Navigation Algorithm Using GPF
2.2.1 State Equation
The navigation frame is the north-east-down (NED) local-level reference frame. The state variables are the platform misalignment angles; the velocity errors in the north, east, and down directions; the position errors; the Markov process errors of the gyros; and the Markov process errors of the accelerometers.
The non-linear state equation of the SINS/Vision integrated navigation model is presented, and the nonlinear model and filter algorithm are designed.
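As a concrete indexing of the 15 error states just listed, a layout like the following could be used; the ordering is an illustrative assumption, not one stated in the abstract.

```python
import numpy as np

# Assumed ordering of the 15-dimensional error state:
STATE_SLICES = {
    "misalignment": slice(0, 3),   # platform misalignment angles (rad)
    "velocity_err": slice(3, 6),   # velocity errors, north/east/down (m/s)
    "position_err": slice(6, 9),   # latitude, longitude, height errors
    "gyro_markov":  slice(9, 12),  # Markov process errors of the gyros
    "acc_markov":   slice(12, 15), # Markov process errors of the accelerometers
}

x = np.zeros(15)  # full error-state vector of the SINS/Vision filter
```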
2.2.2 Measurement Equation
The longitude, latitude, and height errors between the SINS and the vision system are chosen as the measurement variables.
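Under the state ordering assumed above, the measurement model amounts to selecting the position-error block of the state; the sketch below is illustrative only.

```python
import numpy as np

# Linear measurement matrix: z = H x + v selects the position-error states.
H = np.zeros((3, 15))
H[:, 6:9] = np.eye(3)  # position (latitude, longitude, height) error states

def position_difference(sins_llh, vision_llh):
    """Measurement z: SINS-minus-vision position difference."""
    return np.asarray(sins_llh) - np.asarray(vision_llh)
```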
2.2.3 GPF Algorithm
The state equation of the SINS/Vision integrated navigation model is non-linear, but its measurement equation is linear. The GPF is therefore simplified: the time update uses the non-linear GPF, while the measurement update uses a linear Kalman Filter.
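A minimal sketch of that hybrid update is given below, assuming a generic nonlinear propagation function f with process noise Q and the linear measurement model (H, R) from above; all names are placeholders rather than the paper's actual model.

```python
import numpy as np

def gpf_time_update(mean, cov, f, Q, n_particles=1000, rng=None):
    """GPF time update: propagate particles through the nonlinear state
    equation f, then refit a single Gaussian to the propagated cloud."""
    rng = rng or np.random.default_rng()
    particles = rng.multivariate_normal(mean, cov, size=n_particles)
    propagated = np.array([f(p) for p in particles])
    propagated += rng.multivariate_normal(np.zeros(len(mean)), Q, size=n_particles)
    return propagated.mean(axis=0), np.cov(propagated, rowvar=False)

def kf_measurement_update(mean, cov, z, H, R):
    """Because the measurement equation is linear, the measurement update is the
    standard Kalman filter correction applied to the Gaussian approximation."""
    S = H @ cov @ H.T + R
    K = cov @ H.T @ np.linalg.inv(S)
    mean = mean + K @ (z - H @ mean)
    cov = (np.eye(len(mean)) - K @ H) @ cov
    return mean, cov
```

Restricting the particle approximation to the time update keeps the filter tractable: the expensive sampling is only spent where the model is actually nonlinear.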
2.3 Simulation
A car-based system using an MTi-100 IMU and a FLIR Bumblebee2 stereo camera is designed, and the proposed algorithm is validated on the KITTI datasets and on the car-based system.
The test results indicate that the proposed algorithm effectively improves the precision of the navigation system.
3. Conclusions
In this paper, the theory of a SINS/Vision integrated navigation system has been presented. A 4-point robust least-squares solution using two orientation angles from the SINS is proposed, and the model of SINS/Vision information fusion is designed. Following this, the SINS/Vision integrated algorithm using the GPF is presented. The SINS/Vision algorithm is of significant value for engineering applications.


