
Session D3: Aerial Vehicle Navigation

Relative Visual-Inertial Odometry for Fixed-Wing Aircraft in GPS-Denied Environments
Gary J. Ellingson, Brigham Young University; Kevin Brink, Air Force Research Laboratory; Timothy W. McLain, Brigham Young University
Location: Spyglass

In recent years, researchers in the Brigham Young University MAGICC Lab have been developing a unique approach for operating autonomous aircraft in GPS-denied environments, called relative navigation. It is a software framework that includes a relative front-end filter and a global back-end graph optimization, and it has been shown to offer advantages over similar approaches. The front end includes an extended Kalman filter that fuses visual odometry with inertial measurements from acceleration and angular-rate sensors. A significant component of the relative navigation architecture is the method used to account for global position and heading, since these states are unobservable without global measurements, such as GPS. The front-end filter includes a relative-reset step that zeros the absolute position and heading angle. It sends the transformation accumulated just before the reset to the global back end as an edge in a graph optimization. After the reset, the filter continues to update the vehicle state relative to a new local origin. In prior work, the relative navigation framework has been demonstrated both in simulation and in flight on multi-rotor aircraft flying in cluttered environments. These demonstrations used depth-sensor information from nearby objects in the environment to produce odometry measurements.
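A minimal sketch of the relative-reset step is shown below, assuming a simple state vector of position, attitude, and velocity. The class and method names, the state layout, and the covariance handling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class RelativeFilter:
    """Illustrative relative front-end filter (assumed interface)."""

    def __init__(self):
        # state: [px, py, pz, roll, pitch, yaw, vx, vy, vz] in the local frame
        self.x = np.zeros(9)
        self.P = np.eye(9) * 1e-3
        self.edges = []  # transformations sent to the back-end graph

    def relative_reset(self):
        """Zero the unobservable states (position, heading) and record the
        transformation since the last reset as an edge for the back end."""
        # Edge = relative translation and heading change accumulated in the
        # current local frame since the previous reset.
        edge = {"translation": self.x[0:3].copy(), "yaw": self.x[5]}
        self.edges.append(edge)

        # Zero position and heading; roll and pitch are left untouched because
        # they remain observable through the gravity vector.
        self.x[0:3] = 0.0
        self.x[5] = 0.0

        # Reset the corresponding covariance blocks so the uncertainty of the
        # zeroed states starts fresh relative to the new local origin.
        for i in (0, 1, 2, 5):
            self.P[i, :] = 0.0
            self.P[:, i] = 0.0
```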
The objective of this work is to construct a relative front-end odometry filter that enables relative navigation on fixed-wing aircraft. Since depth sensors, such as laser scanners and RGB-D cameras, are impractical for small, unmanned, fixed-wing aircraft that fly high above the ground, a different approach is required. This work presents the construction of a tightly-coupled, monocular, visual-inertial odometry that is capable of working with the relative navigation framework by incorporating the relative-reset step described above.
This filter was created by starting from the previously developed, loosely-coupled odometry filter and replacing the measurement-update step with one appropriate for tightly-coupled, visual-inertial odometry. The multi-state constraint Kalman filter (MSCKF) is a monocular, visual-inertial odometry approach that has been demonstrated to be both accurate and consistent. The MSCKF is also appropriate for fixed-wing aircraft because it makes no assumption about the distance to observed features. The MSCKF functions by continuously adding past camera poses to the state vector. As tracked features leave the camera frame, a residual is created by first performing a least-squares optimization to estimate the feature location; the optimization includes all the camera poses from which the feature was observed. The state vector is then updated and the old camera poses are discarded. The proposed paper will present the required visual feature tracker, feature-point optimization, and new measurement-update step.
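The core of this measurement step can be illustrated with a short sketch: when a feature's track ends, its position is estimated by least squares from every camera pose that observed it, and the reprojection residuals feed the filter update. The conventions below (world-to-camera poses, normalized image coordinates) and the function names are assumptions for illustration; a full MSCKF additionally projects the stacked residuals onto the null space of the feature-position Jacobian before updating the state.

```python
import numpy as np

def triangulate(poses, obs):
    """Linear least-squares estimate of a feature's 3-D position.

    poses: list of (R, t) with p_cam = R @ p_world + t (assumed convention)
    obs:   list of normalized image measurements (u, v), one per pose
    """
    A, b = [], []
    for (R, t), (u, v) in zip(poses, obs):
        # u * (row3 . p + t3) = row1 . p + t1, and similarly for v
        A.append(R[0] - u * R[2]); b.append(u * t[2] - t[0])
        A.append(R[1] - v * R[2]); b.append(v * t[2] - t[1])
    p_world, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return p_world

def reprojection_residuals(poses, obs, p_world):
    """Stacked residuals (measured minus predicted) used in the update."""
    r = []
    for (R, t), (u, v) in zip(poses, obs):
        p_cam = R @ p_world + t
        r.extend([u - p_cam[0] / p_cam[2], v - p_cam[1] / p_cam[2]])
    return np.array(r)
```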
In this work, the first camera pose added to the state vector is from a keyframe image. The keyframe image is at the local origin with its heading aligned with the local coordinate frame, but it may be pitched and rolled. All feature-point optimizations and measurement Jacobians are computed relative to the keyframe image. A new keyframe is declared, and the relative update step is performed, when a minimum number of feature tracks can no longer be maintained with the previous keyframe. When a new keyframe is declared, all previous camera poses are removed from the state vector and feature tracks are reinitialized. Relative transformations between keyframes are used in a global back end to produce an estimate of the global state.
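As a rough illustration of the keyframe test, the snippet below counts how many of the keyframe's features are still being tracked and declares a new keyframe when that count falls below a threshold; the feature-ID bookkeeping and the threshold value are hypothetical and not taken from the paper.

```python
def should_declare_keyframe(active_track_ids, keyframe_feature_ids, min_shared=20):
    """Return True when too few of the keyframe's features are still tracked."""
    shared = len(set(active_track_ids) & set(keyframe_feature_ids))
    return shared < min_shared

# Example: only 2 of the keyframe's 5 features remain in the current image,
# so a new keyframe would be declared and the relative reset performed.
print(should_declare_keyframe({7, 9}, {3, 5, 7, 9, 11}, min_shared=5))  # True
```

When this test fires, the relative reset described earlier is performed, the stored camera poses are dropped from the state vector, and the feature tracker is reinitialized against the new keyframe.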
The approach is demonstrated using a Gazebo simulation of a small fixed-wing aircraft with a simulated camera and inertial measurement unit (IMU). The simulated aircraft dynamics and sensor-noise characteristics are representative of those from an actual small, unmanned fixed-wing aircraft. The accuracy and consistency of the relative odometry are presented. This work is significant because it incorporates the advantages of relative navigation into an accurate visual-inertial odometry that is capable of operating on an autonomous fixed-wing platform.


