A High-Precision Inertial Model with Spline-Based, Factor-Graph Optimization
Kyle A. Leland, Clark N. Taylor, David A. Woodburn, Autonomy & Navigation Technology Center, Air Force Institute of Technology, Randal Beard, Brigham Young University
Location: Beacon A
Precise and reliable platform localization is a core component for autonomous and unmanned navigation. Global navigation satellite systems (GNSS) are a prevailing source for determining a platform’s absolute position. However, GNSS-based positioning may be unreliable due to erroneous measurements or inconsistent availability in challenging environments, such as urban or noncooperative settings. To increase robustness against these challenges, GNSS sensors are commonly integrated with data from other sensors, including inertial measurements. Furthermore, inertial sensors typically form the backbone of any alternative navigation technology. Therefore, this paper introduces a novel technique for integrating inertial sensors with other measurements.
This paper pulls together three unique research threads to explore a novel methodology for using inertial sensors: factor graphs, spline-based trajectory estimation, and high-quality inertial sensing. Factor graphs offer a popular framework for fusing data from multiple sensors, and the literature presents various robust factor-graph formulations specifically evaluated in GNSS-challenged environments [1]. While factor graphs have been shown to perform very well in a variety of estimation scenarios, they can have difficulty with high-rate inertial sensors due to the large number of hidden variables whose estimation this may require. Therefore, alternative approaches such as inertial pre-integration have previously been introduced. These approaches, however, still require precise timing synchronization between the inertial sensors and other sensors: because they estimate discrete samples of the trajectory, all measurements must occur at those samples.
Therefore, several recent papers have departed from the traditional, discrete-time representation of estimated states in the factor graph by using continuous-time, factor-graph formulations based on B-splines [2], [3], [4]. These papers have focused on creating trajectories in both the position and rotational spaces. This paradigm alleviates the difficulties of integrating unsynchronized data and of integrating data from a high-rate sensor, such as an inertial measurement unit (IMU). (Note that the spline-based factor graph was developed specifically to enable navigation with rolling-shutter and neuromorphic cameras, sensors that are very difficult to synchronize precisely using discrete samples of a trajectory.) However, the published spline-based, factor-graph implementations for inertial measurements employ a simplified inertial sensor model that accounts for neither the rotation of the Earth nor a changing gravity vector.
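To make the continuous-time idea concrete, the following minimal sketch evaluates position, velocity, and acceleration of a cubic uniform B-spline trajectory at an arbitrary query time. This is generic B-spline machinery, not the formulation of any of the cited papers; the function name `eval_spline` and the choice of a uniform knot spacing `dt` are our own illustrative assumptions.

```python
import numpy as np

# Standard basis matrix for a cubic uniform B-spline (de Boor form).
M = (1.0 / 6.0) * np.array([
    [ 1.0,  4.0,  1.0, 0.0],
    [-3.0,  0.0,  3.0, 0.0],
    [ 3.0, -6.0,  3.0, 0.0],
    [-1.0,  3.0, -3.0, 1.0],
])

def eval_spline(ctrl_pts, t, dt):
    """Evaluate a cubic uniform B-spline trajectory at time t.

    ctrl_pts: (N, 3) array of control points, one every dt seconds,
              with the first spline segment starting at t = 0.
    Returns (position, velocity, acceleration), each a length-3 vector.
    """
    i = int(np.floor(t / dt))        # index of the active spline segment
    u = t / dt - i                   # normalized time within the segment, in [0, 1)
    P = ctrl_pts[i:i + 4]            # the 4 control points affecting this segment
    # Basis weights for the value and its first two time derivatives.
    b0 = np.array([1.0, u, u**2, u**3]) @ M
    b1 = np.array([0.0, 1.0, 2.0 * u, 3.0 * u**2]) @ M / dt
    b2 = np.array([0.0, 0.0, 2.0, 6.0 * u]) @ M / dt**2
    return b0 @ P, b1 @ P, b2 @ P
```

Because the spline is differentiable in closed form, an inertial residual can be evaluated at the exact timestamp of each IMU sample, with no synchronization to discrete trajectory nodes required.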
In this work, we develop a novel inertial-measurement model for factor graphs. While previous models have assumed that the inertial sensors are control inputs to a dynamics model, we instead make the inertial measurements a function of the trajectory at two discrete time steps. This is a more natural fit with a factor-graph formulation and also enables more complex combinations of inertial sensors. (For example, inertial sensors of differing qualities on different axes can be supported with this formulation.) We model gyroscope measurements as the sum of the angular rate of rotation of the sensor frame with respect to the world frame, the angular rate of rotation of the world frame with respect to the inertial frame, the gyroscope bias, and additive white Gaussian noise, all represented in the sensor frame. We model accelerometer measurements as the sum of the acceleration of the sensor, the Coriolis effect from the rotation of the Earth, gravity as a function of position, and the accelerometer bias, all represented in the sensor frame. Furthermore, we express all locations in an Earth-centered, Earth-fixed (ECEF) coordinate frame, enabling inertial modeling even over the Earth's poles. We then adapt this inertial sensor formulation for use with a B-spline parameterization of the trajectory: the position, velocity, acceleration, and angular velocity are all evaluated from the estimated B-spline, which models the platform's motion in continuous time. We also derive and provide the relevant Jacobian matrices for including these sensor models in a factor-graph formulation.
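As a sketch, the measurement models described above can be written as follows. The notation is our own assumption, not necessarily the paper's: $w$, $i$, and $s$ denote the world (ECEF), inertial, and sensor frames; $\mathbf{R}^{s}_{w}$ rotates vectors from the world to the sensor frame; and, by symmetry with the gyroscope, we assume the accelerometer model also carries an additive white-noise term $\mathbf{n}_a$.

```latex
\begin{align}
\tilde{\boldsymbol{\omega}} &=
  \boldsymbol{\omega}^{s}_{s/w}
  + \mathbf{R}^{s}_{w}\,\boldsymbol{\omega}^{w}_{w/i}
  + \mathbf{b}_g + \mathbf{n}_g \\
\tilde{\mathbf{f}} &=
  \mathbf{R}^{s}_{w}\!\left(
    \ddot{\mathbf{p}}^{w}
    + 2\,\boldsymbol{\omega}^{w}_{w/i}\times\dot{\mathbf{p}}^{w}
    - \mathbf{g}(\mathbf{p}^{w})
  \right)
  + \mathbf{b}_a + \mathbf{n}_a
\end{align}
```

Here $\mathbf{p}^{w}$, $\dot{\mathbf{p}}^{w}$, and $\ddot{\mathbf{p}}^{w}$ are the ECEF position, velocity, and acceleration evaluated from the spline; $\boldsymbol{\omega}^{w}_{w/i}$ is the Earth's rotation rate; $\mathbf{b}_g$, $\mathbf{b}_a$ are the biases; and $\mathbf{g}(\mathbf{p})$ is the position-dependent gravity vector (which, in a rotating ECEF frame, may be taken to absorb the centripetal term).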
In the final paper, we will evaluate the refined sensor model in a continuous-time, factor-graph formulation that uses loosely coupled GNSS and inertial measurements. This formulation will be tested on a dataset of simulated measurements. The simulated scenarios will include both a reliable, open-sky GNSS environment and intermittent GNSS blackouts, representing a GNSS-denied environment. They will also include simulated measurements from IMUs with a range of performance spanning commercial-grade to navigation-grade capabilities. The primary purpose of the experiment is to compare the positioning accuracy of inertial sensors in a spline-based factor graph against previous techniques. The range of performance capabilities in the simulated IMU measurements will provide insight into how the improved accuracy relates to the grade of the sensor. While we will only evaluate sensor fusion with GNSS and inertial measurements, additional sensors, such as odometers, magnetometers, and cameras, could also be integrated into a continuous-time, factor-graph formulation with well-established methods from the literature.
References
1. R. M. Watson and J. N. Gross, "Robust Navigation in GNSS Degraded Environment Using Graph Optimization," Proceedings of the 30th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2017), Portland, Oregon, September 2017, pp. 2906-2918, doi: 10.33012/2017.15164.
2. A. Patron-Perez, S. Lovegrove, and G. Sibley, "A Spline-Based Trajectory Representation for Sensor Fusion and Rolling Shutter Cameras," International Journal of Computer Vision, vol. 113, pp. 208-219, 2015, doi: 10.1007/s11263-015-0811-3.
3. D. Hug, P. Bänninger, I. Alzugaray, and M. Chli, "Continuous-Time Stereo-Inertial Odometry," IEEE Robotics and Automation Letters, vol. 7, no. 3, pp. 6455-6462, July 2022, doi: 10.1109/LRA.2022.3173705.
4. G. Cioffi, T. Cieslewski, and D. Scaramuzza, "Continuous-Time vs. Discrete-Time Vision-Based SLAM: A Comparative Study," IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 2399-2406, April 2022, doi: 10.1109/LRA.2022.3143303.