
Session C4: Vision-based Navigation Systems

Addressing UAS Effects on ORB SLAM2 Localization Quality
Gordon Keller, University of California, Santa Cruz; Nicholas Cramer, NASA Ames Research Center; Mircea Teodorescu, University of California, Santa Cruz
Location: Atrium Ballroom
Alternate Number 4

Localization resolution and accuracy are key limiting factors in the generalized use and regulation of small UASs in commercial airspace. Inexpensive GPS modules serve as the primary localization element for these aircraft, but as standalone units they achieve only approximately one-meter resolution. Further, concentrations of large obstructions (especially in urban settings) significantly diminish GPS reliability, which limits the freedom to operate autonomously where desired. Sensor fusion with onboard IMUs and other sensors has improved geopositioning accuracy, but continued tracking after loss of GPS (e.g. tracking via odometry alone) is insufficient for confident autonomy. Onboard sensors and algorithms for environment mapping can reinforce confidence in vehicle positioning, and Visual Simultaneous Localization and Mapping (SLAM) is a strong candidate for this role. ORB SLAM2, a popular implementation of Visual SLAM, has been validated in the literature against datasets dissimilar to UAS operation. Our work addresses this gap by subjecting a hexacopter running ORB SLAM2 online, with a rigidly mounted monocular camera (i.e. without a stabilizing gimbal), to common maneuvers and situations a multirotor system is likely to encounter in the field. The primary goals of this work are to (a) identify optimal and adequate operating conditions for systems similar to ours, expediting system development, and (b) investigate where improvements to Visual SLAM may best serve UASs.
ORB SLAM2 limits computational complexity by performing global bundle adjustment (BA) only when necessary (e.g. during map initialization). Otherwise, it relies on local BA and motion-only BA within the SLAM pipeline for efficiency. Our experimentation reveals undesirably large reprojection error, leading to significant vehicle pose inaccuracy, when the system is subjected to multirotor motion. When vision-based pose estimation serves as the primary localization source within Extended Kalman Filter (EKF2) fusion, this limits confidence in accuracy, resulting in poorer control and a higher likelihood of collision within the environment. We explore improvements specifically targeting UAV pose accuracy that is robust to vehicle maneuvers by tailoring the BA implementation to UAS-specific operation.
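As a point of reference, the minimal Python/NumPy sketch below shows the reprojection error that motion-only BA minimizes over the camera pose while holding mapped landmarks fixed; variable names and the synthetic example values are ours for illustration, not drawn from the ORB SLAM2 codebase.

```python
import numpy as np

def reprojection_error(K, R, t, points_3d, observations):
    """Mean reprojection error (pixels) for a pinhole camera model.

    K            : 3x3 camera intrinsics
    R, t         : world-to-camera rotation (3x3) and translation (3,)
    points_3d    : Nx3 mapped landmark positions (world frame)
    observations : Nx2 matched keypoint locations (pixels)
    """
    # Transform landmarks into the camera frame, then project.
    cam = points_3d @ R.T + t            # Nx3, camera frame
    proj = cam @ K.T                     # apply intrinsics
    proj = proj[:, :2] / proj[:, 2:3]    # perspective divide -> pixel coords
    # Per-point pixel residuals; motion-only BA minimizes their squared sum
    # over (R, t) with the landmark positions held fixed.
    residuals = np.linalg.norm(proj - observations, axis=1)
    return residuals.mean()

# Illustrative check with one synthetic point directly ahead of the camera.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
pts = np.array([[0., 0., 5.]])              # projects to (320, 240)
obs = np.array([[322., 241.]])              # observed a few pixels off
print(reprojection_error(K, np.eye(3), np.zeros(3), pts, obs))  # ~2.24 px
```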
Beyond changes to ORB SLAM2 itself, changes to the UAS control scheme can enhance localization quality when using Visual SLAM. Reprojection error minimization (over the corner and edge features ORB SLAM2 tracks) is the basis of our approach. Abrupt direction changes, rotation effects, and motion blur are treated as tangible "UAS effects" on SLAM quality, and they extend the error functions into a means of error prediction. Minimizing predicted error subject to operational criteria (e.g. minimum horizontal velocity, maximum mission time) then dictates vehicle behavior that mitigates ORB SLAM2's sensitivity to these UAS effects, as sketched below. By studying combined metrics including reprojection error, ORB match count, and vehicle inertial measurements in a well-structured environment, we demonstrate the effectiveness of our approach and identify room for improvement in deploying ORB SLAM2 (and Visual SLAM more broadly) aboard UASs.
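The sketch below illustrates the shape of this constrained selection, assuming a hypothetical predicted-error model in which error grows with speed (motion blur) and yaw rate (rotation effects); the coefficients and constraint values are illustrative placeholders, not fitted results from our experiments.

```python
import numpy as np

# Hypothetical predicted-error model: reprojection error grows with speed
# (motion blur) and yaw rate (rotation effects). Coefficients are
# illustrative placeholders, not fitted values.
def predicted_error(speed, yaw_rate, k_blur=0.8, k_rot=2.5):
    return k_blur * speed**2 + k_rot * abs(yaw_rate)

def select_maneuver(candidates, v_min=1.0, t_max=120.0, path_len=100.0):
    """Pick the (speed, yaw_rate) candidate with the lowest predicted error,
    subject to a minimum horizontal velocity and a maximum mission time."""
    feasible = [
        (speed, yaw) for speed, yaw in candidates
        if speed >= v_min and path_len / speed <= t_max
    ]
    return min(feasible, key=lambda c: predicted_error(*c))

# Example: the selector prefers slower, straighter flight when constraints
# allow it, trading mission time for SLAM tracking quality.
candidates = [(s, y) for s in np.arange(1.0, 5.0, 0.5)
                     for y in (0.0, 0.2, 0.5)]
print(select_maneuver(candidates))  # -> (1.0, 0.0) under these placeholders
```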


