CubeSat-Based Lunar Map Refinement Utilizing Surface Beacons and a Monocular Camera

Tyler Gardner, Michael Hansen, Natalie Wisniewski, and Randall Christensen

Abstract: SpaceX, Blue Origin, NASA, and others have recently proposed autonomous missions in preparation for new manned missions to the Moon. Traditional approaches based solely on inertial navigation are not accurate enough to autonomously land a vehicle on hazardous lunar terrain; therefore, Terrain Relative Navigation (TRN) is being explored to supplement inertial navigation. TRN is a capability that uses images of local terrain captured with a camera and/or imaging LIDAR to estimate the position and/or velocity of a spacecraft. Many TRN methods estimate the craft's absolute position by comparing sensor imagery to a crater/landmark database, global map, or another similar reference set. Because of the limited availability of high-resolution lunar maps, the global accuracy of lunar TRN is currently limited to approximately 100 m. One compelling solution to improving global map resolution is to utilize an array of low-cost CubeSats to image the lunar surface and refine existing maps. This paper explores the effectiveness of such a mission. In particular, the objective of this analysis is to determine the sensitivity of mapping uncertainty to sensor errors.
Published in: 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS)
April 20–23, 2020
Hilton Portland Downtown
Portland, Oregon
Pages: 1536 - 1546
Cite this article: Gardner, Tyler, Hansen, Michael, Wisniewski, Natalie, Christensen, Randall, "CubeSat-Based Lunar Map Refinement Utilizing Surface Beacons and a Monocular Camera," 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS), Portland, Oregon, April 2020, pp. 1536-1546.