Spacecraft Relative Navigation Using Appearance Matching and Sensor Fusion

Christopher R. McBryde and E. Glenn Lightsey

Peer Reviewed

Abstract: In this research, the task of object recognition and relative navigation is accomplished by fusing visible-spectrum and infrared images. The appearance matching technique is briefly explained, and it is shown how it can be extended to infrared images. A series of tests is performed to demonstrate the object recognition and pose estimation capabilities of the system in the visible and infrared spectra. It is also shown how fusing both types of images provides greater accuracy and robustness in relative navigation than either visible or infrared images alone. Additionally, a simulation environment software tool has been developed to facilitate the creation of training images and to perform software-in-the-loop verification.
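As context for the approach summarized above, the sketch below illustrates one plausible form of appearance matching with visible/infrared fusion: a PCA eigenspace is built from fused training image pairs rendered at known relative poses, and a query pair is matched to the nearest training sample to recover a pose estimate. The class and function names, the fusion-by-concatenation step, and the nearest-neighbor pose lookup are illustrative assumptions, not the authors' implementation.

# Minimal appearance-matching sketch (hypothetical; not the paper's code).
# Visible and infrared training images of the target at known relative poses
# are flattened, normalized, concatenated ("fused"), and projected into a PCA
# eigenspace; a query image pair is matched to the nearest training sample.
import numpy as np

def fuse(visible, infrared):
    """Concatenate flattened, normalized visible and IR images into one vector."""
    v = visible.ravel().astype(float)
    i = infrared.ravel().astype(float)
    v /= (np.linalg.norm(v) + 1e-12)
    i /= (np.linalg.norm(i) + 1e-12)
    return np.concatenate([v, i])

class AppearanceMatcher:
    def __init__(self, n_components=20):
        self.n_components = n_components

    def fit(self, image_pairs, poses):
        """image_pairs: list of (visible, infrared) arrays; poses: known relative poses."""
        X = np.stack([fuse(v, ir) for v, ir in image_pairs])  # (N, D) fused training set
        self.mean_ = X.mean(axis=0)
        Xc = X - self.mean_
        # Eigenimages via SVD of the mean-centered data; rows of Vt are principal directions.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        self.basis_ = Vt[: self.n_components]                 # (k, D) eigenspace basis
        self.coords_ = Xc @ self.basis_.T                      # training points on the manifold
        self.poses_ = np.asarray(poses)
        return self

    def estimate_pose(self, visible, infrared):
        """Project a query pair into the eigenspace and return the nearest training pose."""
        q = (fuse(visible, infrared) - self.mean_) @ self.basis_.T
        idx = np.argmin(np.linalg.norm(self.coords_ - q, axis=1))
        return self.poses_[idx]

In practice, the training image pairs for such a matcher would come from a tool like the simulation environment described in the abstract, rendered over a grid of candidate relative poses.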
Published in: Proceedings of the 2017 International Technical Meeting of The Institute of Navigation
January 30 - February 2, 2017
Hyatt Regency Monterey
Monterey, California
Pages: 301 - 312
Cite this article: McBryde, Christopher R., Lightsey, E. Glenn, "Spacecraft Relative Navigation Using Appearance Matching and Sensor Fusion," Proceedings of the 2017 International Technical Meeting of The Institute of Navigation, Monterey, California, January 2017, pp. 301-312. https://doi.org/10.33012/2017.14889