Monocular Vision Localization Using a Gimbaled Laser Range Sensor

D. J. Yates, M.J. Veth

Abstract: There have been great advances in recent years in the area of indoor navigation. Many of these new navigation systems rely on digital images to aid inertial navigation estimates. The Air Force Institute of Technology (AFIT) has been conducting research in this area for a number of years. Its image-aiding techniques are centered on tracking stationary features in order to improve inertial navigation estimates. Previous research has used stereo vision systems, or terrain constraints with monocular systems, to estimate feature locations. While these methods have shown good results, they have drawbacks. First, as unmanned exploration vehicles become smaller, the baseline available between two cameras shrinks, reducing ranging accuracy. Second, a monocular system requires terrain data that may not be available in an unexplored environment. This research explores the use of a small gimbaled laser range sensor and a monocular camera to estimate feature locations, which are then used to aid a navigation filter. The gimbaled system consists of a commercial off-the-shelf range sensor, a pair of lightweight servos, and a microcontroller that accepts azimuth and elevation commands. The system measures approximately 15x8x12 cm and weighs less than 200 grams. This novel approach, called laser-aided image inertial navigation, provides precise depth measurements to key features, whose locations are then calculated from the current state estimates of an extended Kalman filter. This method of estimating feature locations is tested with both simulated and real-world imagery. Navigation experiments are presented which compare this method with previous image-aided filters. While only a limited number of tests were conducted, simulated and real-world flight tests show that the monocular laser-aided filter can accurately estimate the trajectory of a vehicle to within a few tenths of a meter. This is done without terrain constraints or any prior knowledge of the operational area.
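The core geometric step the abstract describes can be illustrated with a short sketch: given the gimbal's azimuth and elevation commands, the laser range measurement, and the filter's current estimates of vehicle position and attitude, the feature location follows by scaling the line-of-sight vector and rotating it into the navigation frame. The function name, argument layout, and body-frame axis convention (x forward, y right, z down) below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def feature_position_nav(p_nav, C_nav_body, az, el, r):
    """Hypothetical sketch of laser-aided feature localization.

    p_nav      : 3-vector, estimated vehicle position in the navigation frame
    C_nav_body : 3x3 body-to-nav rotation from the current attitude estimate
    az, el     : gimbal azimuth and elevation angles (radians)
    r          : laser range measurement (meters)
    Returns the estimated feature position in the navigation frame.
    """
    # Unit line-of-sight vector in the body frame for the commanded
    # azimuth/elevation (x forward, y right, z down convention assumed).
    los_body = np.array([
        np.cos(el) * np.cos(az),
        np.cos(el) * np.sin(az),
        -np.sin(el),
    ])
    # Scale by the measured range, rotate into the nav frame, and offset
    # by the vehicle position estimate.
    return p_nav + C_nav_body @ (r * los_body)
```

In an extended Kalman filter, the resulting feature position would inherit uncertainty from the pose estimate and the range/gimbal measurements, which is what allows tracked features to feed back corrections to the navigation state.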
Published in: Proceedings of the 23rd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2010)
September 21 - 24, 2010
Oregon Convention Center, Portland, Oregon
Pages: 2251 - 2261
Cite this article: Yates, D. J., Veth, M.J., "Monocular Vision Localization Using a Gimbaled Laser Range Sensor," Proceedings of the 23rd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2010), Portland, OR, September 2010, pp. 2251-2261.