
Session A5: Alternative Sensors for Aiding INSs and Precision Timing

Coarse Geo-Registration Over A Large Area
Sean Lantto, West Virginia University; Clark Taylor, Don Venable, Air Force Research Laboratory; Jason Gross, West Virginia University
Location: Big Sur

Many aircraft use a coupled inertial navigation and GNSS system to provide attitude, position, and velocity estimates. The measurements from the inertial navigation system (INS) suffer from an unbounded error that accumulates over time. This "drift" is corrected using measurements from the GNSS, integrated through a Bayesian estimator such as a Kalman filter [1]. An aircraft that has been denied GNSS positioning for an extended period will suffer a very large position error due to INS drift. Cameras on board the aircraft provide a means to localize the aircraft and correct for INS drift, and methods to do so have been developed [2,3,4]. This work presents a georegistration-based approach to localize an aircraft within a large geographic region using a particle filter, where a large geographic region is defined as hundreds to thousands of square kilometers. This is done by building a database of georegistered Scale Invariant Feature Transform (SIFT) [5] features from orthorectified or satellite imagery and matching them to features in the query image from the aircraft. These feature matches serve as the observables for the measurement update of the particle filter.
Each SIFT feature in the georegistered database is given a descriptor, which is used for sorting the database, as well as a WGS-84 location. The WGS-84 location is determined using the image's geographic metadata and a digital elevation model (DEM). For this work, the database of SIFT features is created using orthorectified imagery of Dugway Proving Grounds (a ~60x60 km area), and the DEM is from the Shuttle Radar Topography Mission (SRTM) [6]. The SIFT features are generated using the OpenCV Python library [7], and the database is sorted by the SIFT feature descriptor response value.
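As a concrete illustration of the geolocation step, the sketch below maps a pixel in an orthorectified image to an approximate WGS-84 location using a simplified north-up geotransform and a nearest-neighbor DEM lookup. The function name, the four-parameter geotransform (no rotation terms), and the toy values are illustrative assumptions, not the authors' actual pipeline.

```python
def pixel_to_wgs84(col, row, geotransform, dem):
    """Map a pixel (col, row) in an orthorectified image to an
    approximate (lat, lon, alt) WGS-84 location.

    geotransform: (lon0, dlon, lat0, dlat), the longitude/latitude of
    the upper-left corner and the per-pixel spacing (a simplified,
    north-up form of the usual six-parameter geotransform).
    dem: 2-D grid of terrain heights (m) on the same pixel grid.
    """
    lon0, dlon, lat0, dlat = geotransform
    lon = lon0 + col * dlon
    lat = lat0 + row * dlat
    # Nearest-neighbour DEM lookup for the altitude
    alt = dem[int(round(row))][int(round(col))]
    return lat, lon, alt

# Toy example: 100x100 px tile, ~1e-4 deg/px spacing, flat 1300 m terrain
gt = (-113.0, 1e-4, 40.2, -1e-4)
dem = [[1300.0] * 100 for _ in range(100)]
lat, lon, alt = pixel_to_wgs84(50, 50, gt, dem)
```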
SIFT features are generated for each image taken by the aircraft and are sorted by descriptor response value, just as the database features were. The Fast Library for Approximate Nearest Neighbors (FLANN) [8] matching algorithm is used to compare each SIFT feature in the query image against every feature in the database. This returns a list of nearest neighbors, i.e., the database features closest in appearance to the query image feature being compared. The L2 norm of the difference between the query image feature and each of the two nearest database features gives a distance to each neighbor. Taking the complement of the ratio of these distances creates a weight that is assigned to the database feature closest in appearance to the query image feature. These weighted georegistered database features are saved for each image and are used as the observations in the measurement model of the particle filter. Unfortunately, the feature matching process produces a large number of false positives, because features from the confined query image are compared against the entire large search area. A particle filter [9] is chosen to reduce the impact of these false positives. The particle filter is initialized with a uniform particle distribution over the entire geographic region. Each particle has a center WGS-84 location and a radius, which correspond to the location and field of view, respectively.
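The nearest-neighbor ratio weighting described above can be sketched as follows. A brute-force search stands in for FLANN, and `ratio_test_weights` is a hypothetical helper, but the weight w = 1 - d1/d2 (the complement of the distance ratio between the two nearest neighbors) follows the text.

```python
import math

def ratio_test_weights(query_desc, db_desc):
    """For each query descriptor, find its two nearest database
    descriptors by L2 distance and assign the best match a weight
    equal to the complement of the distance ratio, w = 1 - d1/d2.
    Returns a list of (database index, weight) pairs.
    """
    matches = []
    for q in query_desc:
        # Distance to every database feature, smallest first
        ranked = sorted((math.dist(q, f), i) for i, f in enumerate(db_desc))
        (d1, i1), (d2, _) = ranked[0], ranked[1]
        w = 1.0 - d1 / d2          # complement of the distance ratio
        matches.append((i1, w))
    return matches

# Ambiguous matches (d1 close to d2) get weights near zero, while
# distinctive ones get weights near one.
matches = ratio_test_weights([[0.1, 0.0]],
                             [[0.0, 0.0], [1.0, 0.0], [10.0, 0.0]])
```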
The particle radius is allowed to vary between 25 meters and 1000 meters. Each particle is propagated between images using measurements from the aircraft's INS. The estimated position is a weighted average of the particle centers, where each particle is weighted based on the weights of the database features found within it and on the radius of the particle.
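A minimal version of the particle weighting just described might look like the following. Working in local planar coordinates and penalizing by dividing the summed feature weights by the radius are illustrative choices, since the abstract does not specify the exact weighting function.

```python
import numpy as np

def weight_particles(particles, feat_xy, feat_w):
    """Weight each particle (cx, cy, r) by the summed weights of the
    matched database features that fall within its radius, divided by
    the radius so a tight particle with strong support outweighs a
    loose one.  The 1/r penalty is an assumed form.
    """
    weights = np.empty(len(particles))
    for k, (cx, cy, r) in enumerate(particles):
        d = np.hypot(feat_xy[:, 0] - cx, feat_xy[:, 1] - cy)
        weights[k] = feat_w[d <= r].sum() / r
    total = weights.sum()
    if total == 0.0:
        # No supporting features anywhere: fall back to uniform weights
        return np.full(len(particles), 1.0 / len(particles))
    return weights / total
```

The estimated position is then the weighted average of the particle centers, e.g. `np.average(particles[:, :2], axis=0, weights=w)`.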
During resampling, some of the particles are redistributed uniformly across the search area. This allows the particle filter to find the correct observations if it has converged on false positive feature matches. The number of particles redistributed is a percentage of the number of images elapsed since the last resample. Using flight data and imagery from Dugway Proving Grounds, the particle filter was able to successfully estimate the aircraft's location. Because there is no prior knowledge of the aircraft's location and the initial particle distribution is uniform, the error in the estimated location is very large until the filter converges. Unfortunately, the first part of the flight (from t = 0 s to t = 600 s) produced few features and few good matches, also contributing to the large initial error. This method could also be applied to aerial full-motion video data with unreliable metadata. Moving forward, the particle filter will be extended to include altitude, velocity, and heading states. The filter will also be changed to perform the feature matching "on the fly": instead of determining the matches across the search area prior to running the filter, features will be matched just before the measurement update, searching only within the areas defined by the particles. This is more representative of in-flight operation and, once the particles converge, is more computationally efficient, since the query image features no longer have to be compared against every feature in the entire search area. The use of an algorithm such as RANSAC for forming homographies (pattern matching) will be investigated to further reduce the impact of false positives.
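The resampling scheme with uniform reinjection could be sketched as below. Systematic resampling and the specific bounds handling are assumptions made for illustration, while the 25-1000 m radius range comes from the text.

```python
import numpy as np

def resample_with_reinjection(particles, weights, n_uniform, bounds, rng):
    """Systematic resampling of (cx, cy, r) particles, after which
    `n_uniform` of the drawn particles are replaced by fresh ones
    spread uniformly over the search area, so the filter can escape
    convergence on false positive matches.
    """
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n        # systematic draw
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.minimum(idx, n - 1)                         # guard float rounding
    new = particles[idx].copy()
    xmin, xmax, ymin, ymax = bounds
    fresh = rng.choice(n, size=n_uniform, replace=False) # who to reinject
    new[fresh, 0] = rng.uniform(xmin, xmax, n_uniform)
    new[fresh, 1] = rng.uniform(ymin, ymax, n_uniform)
    new[fresh, 2] = rng.uniform(25.0, 1000.0, n_uniform) # radius limits (m)
    return new
```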
[1] E. Kaplan and C. Hegarty, Understanding GPS: Principles and Applications. Artech House, 2005.
[2] H.-P. Chiu, A. Das, P. Miller, S. Samarasekera, and R. Kumar, “Precise vision-aided aerial navigation,” in Intelligent Robots and Systems (IROS 2014), 2014 IEEE/RSJ International Conference on. IEEE, 2014, pp. 688–695.
[3] M. M. Veth and J. Raquet, “Fusing low-cost image and inertial sensors for passive navigation,” Navigation, vol. 54, no. 1, pp. 11–20, 2007.
[4] D. T. Venable and J. F. Raquet, “Large scale image aided navigation,” IEEE Transactions on Aerospace and Electronic Systems, vol. 52, no. 6, pp. 2849–2860, 2016.
[5] T. Lindeberg, “Scale invariant feature transform,” Scholarpedia, vol. 7, no. 5, p. 10491, 2012.
[6] T. G. Farr, P. A. Rosen, E. Caro, R. Crippen, R. Duren, S. Hensley, M. Kobrick, M. Paller, E. Rodriguez, L. Roth et al., “The shuttle radar topography mission,” Reviews of Geophysics, vol. 45, no. 2, 2007.
[7] R. Laganière, OpenCV 3 Computer Vision Application Programming Cookbook. Packt Publishing Ltd, 2017.
[8] M. Muja and D. Lowe, “FLANN: Fast library for approximate nearest neighbors user manual,” Computer Science Department, University of British Columbia, Vancouver, BC, Canada, 2009.
[9] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics. MIT Press, 2005.


