Abstract: In this research, the task of object recognition and relative navigation is accomplished by fusing visible-spectrum and infrared images. The appearance matching technique is briefly explained, and it is shown how it can be extended to infrared images. A series of tests is performed to demonstrate the object recognition and pose estimation capabilities of the system in the visible and infrared spectra. It is also shown how fusing both types of images provides greater accuracy and robustness in relative navigation than either visible or infrared images alone. Additionally, a simulation environment software tool has been developed to facilitate the creation of training images and to perform software-in-the-loop verification.
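Appearance matching is commonly formulated as an eigenspace (PCA) method: training images of the target at known poses are projected into a low-dimensional subspace, and a query image is matched to its nearest neighbor in that subspace. The sketch below illustrates this general formulation, not necessarily the exact pipeline used in the paper; the function names and the choice of a plain nearest-neighbor match are illustrative assumptions.

```python
import numpy as np

def build_eigenspace(train_images, k=4):
    """Flatten and stack the training images, subtract the mean image,
    and keep the top-k principal directions via SVD."""
    X = np.stack([img.ravel().astype(float) for img in train_images])
    mean = X.mean(axis=0)
    # Rows of Vt are the principal directions of the centered data.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:k]                      # (k, n_pixels)
    coords = (X - mean) @ basis.T       # each training image in eigenspace
    return mean, basis, coords

def match(query, mean, basis, coords):
    """Project the query image into the eigenspace and return the index
    of the closest training image (nearest neighbor in subspace)."""
    q = (query.ravel().astype(float) - mean) @ basis.T
    return int(np.argmin(np.linalg.norm(coords - q, axis=1)))
```

Because each training image is captured at a known pose, the index returned by `match` doubles as a coarse pose estimate; infrared images can be handled by building a second eigenspace from infrared training imagery in the same way.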
Proceedings of the 2017 International Technical Meeting of The Institute of Navigation
January 30 - February 2, 2017
Hyatt Regency Monterey
Pages: 301 - 312
Cite this article: McBryde, Christopher R., and Lightsey, E. Glenn, "Spacecraft Relative Navigation Using Appearance Matching and Sensor Fusion," Proceedings of the 2017 International Technical Meeting of The Institute of Navigation, Monterey, California, January 2017, pp. 301-312.