Characterization of Feature Matching Errors for Consistent Estimation in Vision-Aided Navigation
Chun Yang, Ananth Vadlamani, Andrey Soloviev, Michael Veth, QuNav, LLC; Clark Taylor, AFRL/RYAR
Location: Grand Ballroom F
Date/Time: Wednesday, Jan. 31, 8:35 a.m.
In vision-aided navigation, images of natural scenes are processed by a chain of complex algorithms, from feature extraction and feature matching to frame-to-frame tracking, to produce vision measurements for navigation aiding. Such vision measurements are subject to non-Gaussian errors and, in particular, outliers, which, if not properly accounted for, lead to inconsistent and usually optimistic estimation. The Assured Vision-Aided Inertial Localization (AVAIL) mechanization was recently set forth; it augments inertial navigation with a probabilistic data association filter (PDAF) that adaptively computes the probability that an outlier has gone undetected and weights the vision measurements accordingly. Under a rather general assumption about the outlier distribution, the AVAIL mechanization has been shown to be consistent in the presence of real-world image processing errors. In this paper, we study feature matching errors and, in particular, the circumstances under which outliers occur. We do so by checking the extracted features and detected outliers against the original images of real-world environments, thereby verifying the spatial and temporal assumptions about the feature errors and their distributions. The study provides evidence that lends strong support to the AVAIL mechanization.
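The idea of adaptively weighting a vision measurement by the probability that it is a valid feature match rather than an undetected outlier can be sketched as a simplified, two-hypothesis PDAF update. The sketch below is illustrative only: the function name, the parameter values, and the reduced association weight (a full PDAF would also involve gate probabilities and a clutter model) are assumptions, not the AVAIL mechanization itself.

```python
import numpy as np

def pdaf_weighted_update(x, P, z, H, R, clutter_density=1e-3, P_D=0.9):
    """One PDAF-style update with a single candidate vision measurement.

    The Kalman correction is scaled by beta1, the probability that the
    measurement is a valid feature match rather than an outlier. All
    names and default values here are illustrative assumptions.
    """
    nu = z - H @ x                       # innovation (measurement residual)
    S = H @ P @ H.T + R                  # innovation covariance
    n = len(z)
    # Gaussian likelihood of the innovation under the "valid match" hypothesis
    norm = np.sqrt((2.0 * np.pi) ** n * np.linalg.det(S))
    lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu)) / norm
    # Simplified two-hypothesis association weight: valid match vs. outlier
    beta1 = P_D * lik / (P_D * lik + (1.0 - P_D) * clutter_density)
    # Standard Kalman gain, correction scaled by the association probability
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + beta1 * (K @ nu)
    # PDAF covariance: retain extra uncertainty when the match may be an outlier
    P_c = P - K @ S @ K.T                # covariance if the match were certain
    spread = beta1 * (1.0 - beta1) * np.outer(K @ nu, K @ nu)
    P_new = beta1 * P_c + (1.0 - beta1) * P + spread
    return x_new, P_new, beta1
```

With this weighting, an innovation consistent with the predicted feature location yields beta1 near one (a near-standard Kalman update), while a large innovation drives beta1 toward zero, so the outlier barely perturbs the state and the covariance is not overly shrunk, which is how optimistic, inconsistent estimation is avoided.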