Multi-sensor Fusion for Autonomous Deep Space Navigation

Chengjun Guo, Shiyan Deng, Yalan Xu, Jing He

Abstract: Deep space exploration takes spacecraft far from Earth, where long communication delays hinder the handling of emergencies and onboard failures, placing spacecraft survival at significant risk. In deep space, a spacecraft must therefore rely on its own sensors for positioning and environmental perception. The key problem of autonomous deep space navigation is nonlinear estimation of the spacecraft state in a dynamic 3D environment. A single sensor typically yields large errors; to overcome this limitation, the state can be corrected using observations of multiple known objects. This is referred to as multi-sensor fusion, since information from multiple sensors must be combined to obtain a state estimate. In this paper, we combine observations of stars in deep space with the dynamics of celestial objects to estimate the spacecraft state. We propose a multi-sensor fusion method based on the square-root cubature Kalman filter (SRCKF) to solve the spacecraft multi-sensor fusion problem. Specifically, we apply the improved cubature Kalman filter to sensor data obtained from an interferometer, star tracker, planet sensor, sun sensor, and Earth sensor. To address the high computational complexity of the SRCKF in nonlinear filtering problems, QR decomposition is used to propagate the square-root factor of the covariance matrix, which reduces the computational load, avoids loss of positive definiteness in the covariance, and improves filtering accuracy. We then incorporate the idea of the extended information filter: using the property of statistical linear error propagation, the update step of the cubature Kalman filter is embedded into the extended information filter framework.
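As a minimal illustration of the square-root covariance propagation the abstract describes (not the authors' implementation; the function name and interface are assumptions for this sketch), the SRCKF time update can pass the covariance square-root factor through a QR decomposition instead of forming the covariance explicitly:

```python
import numpy as np

def srckf_predict(x, S, f, SQ):
    """One SRCKF time-update step (illustrative sketch).

    x  : state estimate, shape (n,)
    S  : square-root factor of the covariance, P = S @ S.T
    f  : nonlinear state-transition function
    SQ : square-root factor of the process-noise covariance Q
    """
    n = x.size
    m = 2 * n
    # Cubature points: x +/- sqrt(n) times the columns of S
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # (n, 2n)
    X = x[:, None] + S @ xi
    # Propagate each cubature point through the dynamics
    Xp = np.column_stack([f(X[:, i]) for i in range(m)])
    x_pred = Xp.mean(axis=1)
    # Weighted, centered propagated points
    Xc = (Xp - x_pred[:, None]) / np.sqrt(m)
    # QR decomposition propagates the square-root factor directly,
    # so the predicted covariance R.T @ R stays symmetric positive
    # semi-definite without ever squaring S
    _, R = np.linalg.qr(np.hstack([Xc, SQ]).T)
    S_pred = R.T
    return x_pred, S_pred
```

For linear dynamics this reproduces the standard Kalman prediction P_pred = A P A.T + Q exactly, which is a convenient sanity check on the cubature rule and the QR step.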
Specifically, an information contribution vector and an information contribution matrix are introduced to quantify each sensor's contribution to the filter estimate, and the square root of the covariance in the SRCKF is maintained by updating the square root of the information contribution matrix. The results show that the proposed multi-sensor fusion algorithm outperforms the SRCKF and EKF algorithms in terms of position and velocity estimation accuracy.
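The appeal of the information-filter framework for multi-sensor fusion is that each sensor's contribution enters additively. A minimal sketch, assuming linear (or statistically linearized) measurement models; the function name and tuple layout are illustrative, not from the paper:

```python
import numpy as np

def information_fusion(y, Y, sensors):
    """Fuse sensor measurements additively in information space.

    y, Y    : prior information vector and information matrix
              (Y = inverse covariance, y = Y @ x)
    sensors : iterable of (H, Rinv, z) tuples, one per sensor, where H is
              the measurement matrix, Rinv the inverse measurement-noise
              covariance, and z the measurement vector
    """
    for H, Rinv, z in sensors:
        Y = Y + H.T @ Rinv @ H   # information contribution matrix of this sensor
        y = y + H.T @ Rinv @ z   # information contribution vector of this sensor
    return y, Y
```

Because each sensor adds its own H.T @ Rinv @ H and H.T @ Rinv @ z terms independently, sensors can be fused in any order, and the posterior state is recovered as x = inv(Y) @ y.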
Published in: Proceedings of the 33rd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2020)
September 21 - 25, 2020
Pages: 290 - 299
Cite this article: Guo, Chengjun, Deng, Shiyan, Xu, Yalan, He, Jing, "Multi-sensor Fusion for Autonomous Deep Space Navigation," Proceedings of the 33rd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2020), September 2020, pp. 290-299.