Performance Risk of GBAS Integrity Monitor Algorithms due to Reference Receiver Clock Adjustment Effect

Youngsun Yun, Jeongho Cho, Moon-Beom Heo

Abstract: For aircraft precision approach and landing, pilots need their own accurate positions with high reliability. GBAS (Ground Based Augmentation System) for GNSS (Global Navigation Satellite System) is designed to provide correction and integrity information to airborne receivers so that they can estimate their positions accurately and safely using GNSS measurements. The correction information includes pseudorange corrections (PRCs) and range rate corrections (RRCs), which are the rates of change of the PRCs, for visible GNSS satellite measurements. The integrity information consists of σ_pr_gnd and B-values, which represent the uncertainty of the PRCs and the discrepancies between the individual reference receiver PRCs and the broadcast PRC, respectively. When GBAS generates the PRCs, it must adjust the reference receiver clock offset to keep the PRC values within the range specified by the international standards, so the effect of the clock adjustment propagates into the PRCs and B-values. In previous research [1], we showed that, depending on the system's σ_pr_gnd determination algorithm, the integrity or continuity risk could be higher than required. If the system estimates σ_pr_gnd without considering the clock adjustment effect, the sigma can be inflated or deflated relative to its true value, which causes a higher missed detection rate or false alarm rate, respectively. On the other hand, if the system computes σ_pr_gnd with the clock adjustment effect taken into account but neglects the resulting correlation between different satellites' sigmas, it can suffer from protection level integrity risk. This paper focuses on the first case and investigates the effect of clock adjustment on GBAS integrity monitor algorithms, especially Sigma-Mean Monitor algorithms including MRCC (Multiple Redundancy Consistency Check) and the sigma estimation and mean estimation methods. MRCC compares B-values against pre-defined thresholds based on σ_pr_gnd values to detect a faulty receiver measurement, while sigma-mean estimation monitors B-values normalized by σ_pr_gnd and the number of reference receivers to ensure that the broadcast sigma bounds the true sigma. Therefore, an inflated or deflated sigma can result in missed detections or fault-free alarms, leading to a longer time-to-alert and a higher continuity risk than expected. This paper theoretically analyzes the sensitivity of the missed detection and fault-free detection rates of the integrity monitor algorithms and demonstrates the results by simulation. The results show the importance of the clock adjustment effect of the GBAS reference station on system continuity and integrity performance.
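To make the mechanics summarized above concrete, the following minimal Python sketch illustrates how a receiver clock adjustment can propagate into PRCs and B-values, and how an MRCC-style threshold test on B-values works. It is not the paper's implementation: the clock-adjustment choice (subtracting each receiver's across-satellite mean correction), the function names, the k_ffd multiplier, and the toy numbers are all illustrative assumptions.

```python
import numpy as np

def clock_adjust(prc):
    """Remove a per-receiver clock-like common mode from raw corrections.

    prc : (M receivers x N satellites) array of smoothed corrections.
    Here the adjustment is simply each receiver's mean correction over the
    satellites it tracks; real ground stations use standardized algorithms.
    """
    return prc - prc.mean(axis=1, keepdims=True)

def b_values(prc):
    """B-values per receiver and satellite: broadcast PRC (average over
    receivers) minus the average computed with that receiver excluded."""
    M = prc.shape[0]
    broadcast = prc.mean(axis=0)                   # broadcast PRC per satellite
    excl_mean = (prc.sum(axis=0) - prc) / (M - 1)  # leave-one-out averages
    return broadcast - excl_mean

def mrcc_flags(b, sigma_pr_gnd, k_ffd=5.0):
    """Flag B-values exceeding a sigma-scaled threshold (illustrative k_ffd;
    real thresholds follow from the allocated false-alarm probability)."""
    return np.abs(b) > k_ffd * sigma_pr_gnd

# Toy example: 4 reference receivers, 6 satellites, one receiver with a
# 2 m fault on satellite index 3. Print which B-values exceed the threshold.
rng = np.random.default_rng(1)
prc_raw = 0.05 * rng.standard_normal((4, 6))
prc_raw[2, 3] += 2.0
b = b_values(clock_adjust(prc_raw))
print(mrcc_flags(b, sigma_pr_gnd=0.05))
```

In this sketch, because the clock adjustment is a common-mode term computed from all tracked satellites, a fault on one satellite leaks into that receiver's corrections and B-values for every other satellite, which is the kind of cross-satellite coupling the abstract refers to.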
Published in: Proceedings of the 24th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2011)
September 20 - 23, 2011
Oregon Convention Center, Portland, Oregon
Pages: 3052 - 3060
Cite this article: Yun, Youngsun, Cho, Jeongho, Heo, Moon-Beom, "Performance Risk of GBAS Integrity Monitor Algorithms due to Reference Receiver Clock Adjustment Effect," Proceedings of the 24th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2011), Portland, OR, September 2011, pp. 3052-3060.