Abstract: | The signal processing used in many digital filter designs and implementations can be modeled as a linear system of equations. Solving this linear system requires an estimate of the autocorrelation matrix and the crosscorrelation vector for a given set of input measurement vectors. The autocorrelation matrix and the crosscorrelation vector are then used to yield the desired solution or state vector, as is the case when performing the Kalman filter update and the multiplier computations. Based on the properties of the autocorrelation matrix and the nature of the application, among other computational techniques, the recursive Cholesky and Modified Gram-Schmidt Orthogonalization (MGSO) methods can yield a fast, efficient, and robust computation of the desired solution, which is desirable because it reduces the system cost. The dense, recursive Cholesky and MGSO methods are developed, explained, and compared in this paper. This comparison shows that the recursive Cholesky method is twice as fast as the recursive MGSO algorithm and 5/3 times (or 67%) faster than the recursive Sherman-Morrison formula. Moreover, the recursive Cholesky and MGSO methods reduce the eigenvalue ratio by its square root, which is very desirable for most applications. For this reason, the recursive Cholesky and MGSO algorithms should be implemented based on the development presented here to yield maximum efficiency in computation time and memory storage. |
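For illustration only, the following is a minimal sketch of the batch (non-recursive) version of the problem the abstract describes: forming sample estimates of the autocorrelation matrix and crosscorrelation vector from a set of measurement vectors and solving the resulting normal equations with a Cholesky factorization. The variable names, the simulated data, and the use of NumPy/SciPy are assumptions for this sketch, not the authors' implementation of the recursive algorithms compared in the paper.

```python
# Illustrative sketch only: batch solution of the normal equations R w = p
# via Cholesky factorization. Names and data are hypothetical.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)

# Simulated input measurement vectors x_k (columns of X) and desired samples d_k.
N, K = 4, 200                      # filter order, number of snapshots
X = rng.standard_normal((N, K))    # measurement vectors
d = rng.standard_normal(K)         # desired response

# Sample estimates of the autocorrelation matrix and crosscorrelation vector.
R_hat = (X @ X.T) / K              # N x N, symmetric positive definite here
p_hat = (X @ d) / K                # N x 1

# Solve R_hat w = p_hat using the Cholesky factorization R_hat = L L^T.
c, low = cho_factor(R_hat)
w = cho_solve((c, low), p_hat)     # desired solution / weight vector

print(w)
```

A recursive formulation, as developed in the paper, would instead update the factorization as each new measurement vector arrives rather than refactoring the full matrix.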
Published in: |
Proceedings of the 2002 National Technical Meeting of The Institute of Navigation, January 28 - 30, 2002, The Catamaran Resort Hotel, San Diego, CA |
Pages: | 655-665 |
Cite this article: | Progri, Ilir F., Michalson, William R., Bromberg, Matthew, "A Comparison Between the Recursive Cholesky and MGSO Algorithms," Proceedings of the 2002 National Technical Meeting of The Institute of Navigation, San Diego, CA, January 2002, pp. 655-665. |