Chameleon v2: Improved Imaging-Inertial Indoor Navigation

J. Rydell, E. Emilsson

Abstract: We present a positioning system for soldiers and first responders based on fusion of inertial measurements, visual stereoscopic images, and thermal infrared stereoscopic images. We also present loop closure based on automatically generated 3D point-cloud models of visited environments. Results from experiments in relevant scenarios and environments are also included.

Today, soldiers, firefighters, and other first responders use maps together with GPS (Global Positioning System) receivers to localize vehicles and dismounted personnel. In urban environments, however, multipath and attenuation of the GPS signals can cause large positioning errors. The availability of GPS signals is even worse indoors, yet some GPS receivers still deliver position estimates even though these estimates are incorrect. An alternative positioning solution is therefore preferred during indoor operations and in urban canyons. We aim to find techniques for positioning in environments where GPS is unreliable.

We have developed a SLAM (simultaneous localization and mapping) system called CHAMELEON. The system consists of visual stereoscopic imaging sensors collocated with an IMU (inertial measurement unit). By using stereoscopic cameras, 3D point-cloud models of, e.g., visited buildings can be created. Projecting these point clouds onto the horizontal plane provides overview maps. We have also developed and evaluated sensor fusion algorithms for integrating foot-mounted IMUs with imaging sensors. Results show that sensor fusion improves the navigation solution considerably in scenarios where either the foot-mounted or the camera-based system is unable to navigate on its own. More details about the systems are available in [1,2].

In both soldier and first responder applications, there are requirements on the positioning accuracy and on the size, weight, and power usage of the system. In most firefighter scenarios, room-level accuracy would be acceptable.
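The projection of a 3D point cloud onto the horizontal plane to form an overview map can be sketched as follows. This is an illustrative implementation, not the authors' code: points are flattened to their (x, y) coordinates and binned into a grid, and cells receiving enough points are marked occupied (the `cell` size and `min_hits` threshold are assumed parameters).

```python
import numpy as np

def overview_map(points, cell=0.1, min_hits=3):
    """Project a 3D point cloud (N x 3, metres, z up) onto the
    horizontal plane and bin it into a 2D grid; cells hit by at
    least `min_hits` points are marked occupied."""
    xy = points[:, :2]
    lo = xy.min(axis=0)
    idx = np.floor((xy - lo) / cell).astype(int)
    h, w = idx.max(axis=0) + 1
    grid = np.zeros((h, w), dtype=int)
    np.add.at(grid, (idx[:, 0], idx[:, 1]), 1)   # count points per cell
    return grid >= min_hits

# Example: points sampled along a vertical wall segment project to a line
pts = np.column_stack([np.linspace(0, 1, 100),       # x along the wall
                       np.zeros(100),                # y fixed
                       np.random.uniform(0, 2, 100)])  # z (height) ignored
m = overview_map(pts, cell=0.5)
```

Walls and furniture then appear as dense occupied cells, while open floor space stays empty, which is what makes the projection usable as a floor-plan-style overview.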
The system must not slow down the operation by adding too much extra weight or size to the existing equipment. Nevertheless, it must perform during the entire operation. A positioning system for firefighters must also be able to handle dark and smoke-filled environments. Visual cameras can handle variations in lighting, but cannot see through smoke or be used to navigate in darkness. We have therefore recently added thermal infrared cameras to the system. These cameras, however, do not perform well in environments with very small temperature variations. To handle a broad range of situations where navigation is difficult with only one sensor, we use both visual and thermal infrared cameras.

A wide field of view is desirable in most indoor environments, in order to find a sufficient number of landmarks and to avoid losing all landmarks during rapid rotation or sideways translation. The visual stereoscopic camera is a Point Grey Bumblebee2 with a horizontal field of view of approximately 100 degrees and a resolution of 640 by 480 pixels. This camera is a complete stereo pair, which delivers internally rectified and synchronized images. The infrared stereo pair consists of two FLIR A35 units, which have been calibrated manually. The infrared cameras provide images at a resolution of 320 by 256 pixels and have a horizontal field of view of approximately 50 degrees. The IMU (an Xsens MTi-G) and all cameras are synchronized using trigger pulses, thereby eliminating any timing uncertainty from the data acquisition.

The system we currently use is a prototype, and its sensors are too large and heavy to be permanently mounted on a helmet. However, smaller and more lightweight sensors, more suitable for integration with an end-user system, exist. By integrating cameras and inertial navigation, very slow error accumulation (less than 1% of the travelled distance) is obtained in most situations.
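For a rectified stereo pair such as those described above, the depth of a matched point follows from its disparity as Z = f·b/d, with the focal length f in pixels and the baseline b in metres. A minimal sketch; the focal length below is derived from the quoted 640-pixel width and 100-degree field of view, while the 0.12 m baseline and 10-pixel disparity are illustrative values, not quoted specifications:

```python
import math

def depth_from_disparity(d_px, f_px, baseline_m):
    """Depth of a point seen by a rectified stereo pair: Z = f * b / d,
    with focal length in pixels and baseline in metres."""
    return f_px * baseline_m / d_px

# A 640-px-wide camera with a 100-degree horizontal field of view
# has a focal length of (640/2) / tan(50 deg) pixels.
f = (640 / 2) / math.tan(math.radians(50))   # ~268.5 px
z = depth_from_disparity(10.0, f, 0.12)      # ~3.2 m at 10 px disparity
```

The inverse relation between depth and disparity also explains why wide-baseline, high-resolution pairs resolve distant landmarks better: at large Z a one-pixel disparity error corresponds to a large depth error.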
Still, unless the system can recognize when specific locations are revisited, the error will grow over time, and eventually the precision of the position estimate will no longer be sufficient. Many SLAM systems, including ours, navigate by tracking landmarks observed in the images. In visual SLAM, landmarks are typically corners or other well-defined and recognizable points. Some systems keep track of previously observed landmarks for long periods and use recognized landmarks for "loop closure", i.e., for updating the position estimate when a region is revisited. Other methods store previously seen views (camera images) and use these for closing loops. These techniques reduce the error growth if areas are revisited and the landmarks or images are recognized.

As an alternative to the above solutions, we keep track of landmarks only for a short period of time (while the landmarks are continuously visible). For loop closure we instead match the three-dimensional point cloud created from all previous views against the point cloud the camera currently sees. The advantage of this approach is that we are less dependent on returning to the same position and orientation to be able to associate the current position estimate with a previous one.

Adding point cloud matching and thermal infrared cameras to our system improves the positioning of soldiers and first responders. This paper presents these recent additions to our positioning system.

References
1. Rydell, J., Emilsson, E., "CHAMELEON: Visual-inertial indoor navigation," Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS) 2012, Myrtle Beach, South Carolina, USA.
2. Emilsson, E., Rydell, J., "Sensor fusion for improved indoor navigation," Proc. SPIE 8542, Electro-Optical Remote Sensing, Photonic Technologies, and Applications VI, 85420M (2012).
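Point-cloud matching of the kind described in the abstract is commonly performed with variants of the ICP (iterative closest point) algorithm. The abstract does not specify the matching method used; the following is a generic point-to-point ICP sketch (2D for brevity; the Kabsch step generalizes directly to 3D), not the authors' algorithm:

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Minimal point-to-point ICP sketch: align `src` to `dst` by
    alternating nearest-neighbour matching with a closed-form SVD
    rigid-transform estimate (Kabsch). Returns (R, t) such that
    src @ R.T + t approximates dst."""
    src = src.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # Nearest neighbour in dst for every src point (brute force, O(N^2))
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # Closed-form rigid transform between the matched sets
        mu_s, mu_d = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        src = src @ R.T + t               # apply the increment
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Usage: recover a small rigid offset between two copies of a cloud
rng = np.random.default_rng(0)
dst = rng.uniform(0, 1, (50, 2))
th = np.radians(2.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
src = dst @ R_true.T + np.array([0.02, -0.01])
Rr, tr = icp_2d(src, dst)
aligned = src @ Rr.T + tr
```

Because the match is between full clouds rather than individual recognized landmarks, a loop can be closed even when the current viewpoint differs from the original one, which is the advantage the abstract describes.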
Published in: Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013)
September 16 - 20, 2013
Nashville Convention Center, Nashville, Tennessee
Pages: 737 - 745
Cite this article: Rydell, J., Emilsson, E., "Chameleon v2: Improved Imaging-Inertial Indoor Navigation," Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013), Nashville, TN, September 2013, pp. 737-745.