Abstract: Applications of Micro Aerial Vehicles (MAVs) in reconnaissance, mapping, and even recreation have flourished in recent years, raising the demand for sensing accuracy. To overcome accuracy limitations and to avoid degradation during GPS signal dropouts caused by the environment (indoor situations, for example), innovative sensor types and data fusion strategies and techniques are employed. While one of the most commonly used strategies is to blend an optical flow camera with a GPS/IMU integration through an extended Kalman filter (EKF), vision-based methods such as visual SLAM have also become prevalent. Compatibility conflicts among different categories of sensors, with distinct data formats and report rates, remain the main drawback of current fusion algorithms. In this paper, we propose a graph-based fusion approach that estimates the optimal state from all sensors in a smoother and more compatible way. Inspired by graph-based SLAM, a novel factor graph structure is introduced to encode sensor measurements. Experimental results demonstrate the performance of the graph-based fusion compared with two other classic methods.

Fig. 1: Our quad-rotor MAV platform with 9-axis IMU, barometer, ultrasound, GPS, and optical flow sensors. Control of the drone is shared between the Pixhawk, an open-source flight stack, and the oDroid, a card-sized computer with a Samsung Exynos5422 processor.
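To make the factor-graph idea concrete, the sketch below fuses a relative sensor (e.g. optical-flow odometry) with an absolute sensor (e.g. GPS) for a 1-D trajectory. This is an illustrative toy only; the paper's actual graph structure, sensors, and noise models are not specified here, and the measurement values and sigmas are invented for the example. Each sensor reading becomes a factor (a weighted residual row), and the whole graph is solved jointly by linear least squares:

```python
# Toy factor-graph fusion sketch (assumed example, not the paper's method):
# estimate poses x0..x3 from relative "odometry" factors and absolute "GPS"
# factors, each whitened by its assumed measurement precision.
import numpy as np

n = 4                                            # pose variables x0..x3
odom = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)]   # (i, j, measured x_j - x_i)
gps = [(0, 0.0), (3, 3.2)]                       # (i, measured x_i)
sigma_odom, sigma_gps = 0.1, 0.5                 # assumed noise std-devs

rows, rhs, weights = [], [], []
for i, j, z in odom:                    # relative factor: x_j - x_i = z
    r = np.zeros(n); r[i], r[j] = -1.0, 1.0
    rows.append(r); rhs.append(z); weights.append(1.0 / sigma_odom)
for i, z in gps:                        # absolute factor: x_i = z
    r = np.zeros(n); r[i] = 1.0
    rows.append(r); rhs.append(z); weights.append(1.0 / sigma_gps)

A = np.array(rows) * np.array(weights)[:, None]  # whiten rows by precision
b = np.array(rhs) * np.array(weights)
x, *_ = np.linalg.lstsq(A, b, rcond=None)        # joint MAP estimate
print(np.round(x, 3))
```

Because all factors are optimized jointly, a low-rate absolute sensor corrects the drift of a high-rate relative sensor without any hand-tuned gating, which is the compatibility benefit the abstract alludes to; adding a new sensor only adds rows to the system.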
Proceedings of the 29th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2016)
September 12 - 16, 2016
Oregon Convention Center
Pages: 1485 - 1491
Cite this article:
Gong, Zheng, Pei, Ling, Zou, Danping, Miao, Ruihang, Liu, Peilin, Yu, Wenxian, "Graphical Approach for MAV Sensors Fusion," Proceedings of the 29th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2016), Portland, Oregon, September 2016, pp. 1485-1491.