Abstract: Image-based positioning is a well-known research area with several trusted navigation solutions, but such systems have not yet been certified for aviation applications. To cover a wide range of environmental conditions, multiple optical sensors must be installed. This paper presents an approach for fusing the image data of two complementary cameras with different spectral ranges. Using two image sensors, one operating in the visible light spectrum and one in the infrared spectrum, increases availability and accuracy, so that reliability can meet the requirements for augmenting state-of-the-art GNSS-based landing systems. This investigation presents real flight data processed by means of the proposed method. Challenges and benefits are discussed in contrast to a former approach that validated online the position solutions extracted from both cameras separately. Flight trials have demonstrated the potential of hyperspectral image sensors for runway recognition, especially under difficult visual conditions such as low solar altitude, haze, mist, or a combination thereof.
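The abstract names the fusion method only at a high level. As a rough illustration of what a linear blend in the image domain can look like, the sketch below combines two co-registered single-channel frames with a convex weight; the function name, the weight `alpha`, and the toy arrays are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def linear_blend(visible, infrared, alpha=0.5):
    """Pixel-wise convex combination of two co-registered images.

    alpha weights the visible-spectrum frame; (1 - alpha) weights the
    infrared frame. Both inputs must share the same shape.
    NOTE: illustrative sketch only, not the method from the paper.
    """
    visible = np.asarray(visible, dtype=np.float64)
    infrared = np.asarray(infrared, dtype=np.float64)
    if visible.shape != infrared.shape:
        raise ValueError("images must be co-registered to the same shape")
    return alpha * visible + (1.0 - alpha) * infrared

# Toy 2x2 frames standing in for visible and infrared imagery.
vis = np.array([[0.0, 1.0], [0.2, 0.8]])
ir = np.array([[1.0, 0.0], [0.6, 0.4]])
fused = linear_blend(vis, ir, alpha=0.5)
```

With `alpha = 0.5` the blend is a simple average, so `fused` is `[[0.5, 0.5], [0.4, 0.6]]`; in practice the weight could be tuned per lighting condition, which is presumably where the availability gain under haze or low sun comes from.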
Proceedings of the ION 2019 Pacific PNT Meeting
April 8 - 11, 2019
Hilton Waikiki Beach
Pages: 752 - 766
Cite this article:
Angermann, M., Wolkow, S., Dekiert, A., Bestmann, U., Hecker, P., "Linear Blend: Data Fusion in the Image Domain for Image-based Aircraft Positioning during Landing Approach," Proceedings of the ION 2019 Pacific PNT Meeting, Honolulu, Hawaii, April 2019, pp. 752-766.