
Session A8: Space Applications for Cislunar and Beyond

Simulating Lunar Search and Rescue: Leveraging Unreal Engine and Deep Learning for Autonomous Aid Delivery
Benjamin Johnis, Clint Spesard, Robert Bettinger, Air Force Institute of Technology; Jeremy Correa, Katalyst Space.
Location: Ballroom B
Date/Time: Wednesday, Jun. 5, 9:35 a.m.

ABSTRACT
The vast and unforgiving lunar environment poses unique challenges for NASA’s future LunaSAR program, a technical
plan to employ lunar search and rescue (SAR) tools for astronauts in distress. Autonomous and timely delivery of aid to
distressed personnel demands robust navigation and decision-making capabilities in the absence of readily available human
intervention. This work presents an Unreal Engine simulation framework that integrates real-world measurement data and
deep learning models to create a practical and challenging training environment. The simulation has significant potential
applications in astronaut training, autonomous vehicle navigation algorithm development, and communication infrastructure
evaluation for space environments.
The NASA Exploration Systems Simulation (NExSyS) team at Johnson Space Center developed a graphical environment of
the Lunar South Pole region named the Digital Lunar Exploration Sites Unreal Simulation Tool (DUST) (Bingham et al.,
2023). The simulation landscape allows users to visualize the lunar surface using Digital Elevation Models (DEMs) and optical
imagery obtained from NASA’s Goddard Space Flight Center’s Scientific Visualization Studio. This immersive virtual reality
environment fosters a sense of spatial awareness and familiarity with the lunar landscape, crucial for effective SAR operations.
The lunar DEM is based on over 6.5 billion measurements gathered by the Lunar Orbiter Laser Altimeter (LOLA) on NASA’s
Lunar Reconnaissance Orbiter (LRO) between 2009 and 2013 (Smith et al., 2010; Tooley et al., 2010). The integration of
training data and surface operation simulations enables real-world applications for autonomous space vehicles that may operate
in concert with crewed space missions. This is achieved by integrating a deep learning model trained in a separate
"AirSim" environment (Shah et al., 2017). The model uses an end-to-end Convolutional Neural Network (CNN) architecture,
where visual input from a front-facing camera on the autonomous vehicle is processed to generate steering commands for
navigating the lunar terrain. The AirSim environment allows for efficient and controlled training of the model under diverse
lighting and environmental conditions depicted in the Unreal Engine simulation. To add a layer of complexity and realism, the
simulation incorporates dynamic satellite positioning and error data obtained from our team’s orbital constellation evaluation
tool. This data continuously updates the location and accuracy of communication/navigation satellites, affecting the vehicle’s
ability to receive instructions and maintain positioning solutions for navigation. This feature necessitates adaptive planning and
decision-making on the part of the autonomous vehicle, mimicking the challenges faced by real-world lunar rovers operating in
a limited communication environment.
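The end-to-end CNN approach described above — raw front-camera frames mapped directly to steering commands — can be illustrated with a minimal NumPy forward-pass sketch. The layer shapes, single-filter convolution, activation choices, and 66×200 frame size are illustrative assumptions for exposition, not the authors’ actual architecture.

```python
import numpy as np

def conv2d(img, kernel, stride=2):
    """Valid 2-D cross-correlation with stride and no padding."""
    kh, kw = kernel.shape
    h = (img.shape[0] - kh) // stride + 1
    w = (img.shape[1] - kw) // stride + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            patch = img[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def steering_from_frame(frame, kernel, weights, bias):
    """End-to-end pipeline: camera frame -> conv + ReLU -> dense -> steering."""
    features = np.maximum(conv2d(frame, kernel), 0.0)  # conv layer with ReLU
    flat = features.ravel()                            # flatten for dense head
    return np.tanh(flat @ weights + bias)              # bounded command in [-1, 1]

# Hypothetical untrained weights, seeded for reproducibility.
rng = np.random.default_rng(0)
frame = rng.random((66, 200))                  # grayscale front-camera frame
kernel = rng.standard_normal((5, 5)) * 0.01
feat_size = ((66 - 5)//2 + 1) * ((200 - 5)//2 + 1)
weights = rng.standard_normal(feat_size) * 0.001
angle = steering_from_frame(frame, kernel, weights, bias=0.0)
print(f"steering command: {float(angle):+.3f}")
```

In a trained model the kernel and dense weights would be fit by regressing recorded steering labels against camera frames collected in the AirSim environment; the forward pass itself keeps this same shape.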
Beyond the visual and environmental realism, the simulation also incorporates a concept of operations in which the planning
phase includes crew-deployed breadcrumbs or lit anchor points along the path of the crew’s vehicle. These markers serve as
waypoints for the autonomous vehicle, guiding it along a designated path toward the distressed personnel. This feature allows
for crew intervention and strategic planning, adding another layer of complexity to the SAR scenario. Astronauts can gain
valuable experience navigating the lunar terrain, making critical decisions under pressure, and collaborating with autonomous
vehicles for a variety of scenarios. Similarly, developers of autonomous navigation algorithms can utilize the simulation to
test and refine their models in a challenging environment. Finally, the simulation can be used to evaluate the performance of
communication infrastructure designed to the NASA LunaNet specification for future lunar missions, ensuring reliable and efficient
communication channels even in the face of potential disruptions. This work paves the way for more effective and efficient lunar
SAR operations, ultimately contributing to the safety and well-being of future lunar explorers.
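The breadcrumb-guided navigation concept can be sketched as a simple pursuit loop: the vehicle turns toward the active marker, advances, and switches to the next marker on arrival. The turn-rate limit, speed, arrival radius, and marker coordinates below are illustrative assumptions, not parameters from the simulation.

```python
import math

def follow_breadcrumbs(start, heading, breadcrumbs, speed=1.0, dt=0.5,
                       arrive_radius=2.0, max_turn=math.radians(30),
                       max_steps=5000):
    """Drive a point-mass rover through crew-deployed breadcrumb waypoints.

    Each step the rover turns toward the active breadcrumb (turn rate limited
    to max_turn per step) and moves speed*dt forward; a breadcrumb counts as
    reached inside arrive_radius. Returns the traveled path.
    """
    x, y = start
    path = [(x, y)]
    for wx, wy in breadcrumbs:
        for _ in range(max_steps):  # safety bound against orbiting a marker
            if math.hypot(wx - x, wy - y) <= arrive_radius:
                break               # marker reached; advance to the next one
            desired = math.atan2(wy - y, wx - x)
            # Wrap heading error to [-pi, pi], then clamp to the turn limit.
            err = (desired - heading + math.pi) % (2 * math.pi) - math.pi
            heading += max(-max_turn, min(max_turn, err))
            x += speed * dt * math.cos(heading)
            y += speed * dt * math.sin(heading)
            path.append((x, y))
    return path

# Hypothetical marker trail laid by the crew vehicle en route.
crumbs = [(10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
path = follow_breadcrumbs(start=(0.0, 0.0), heading=0.0, breadcrumbs=crumbs)
print(f"{len(path)} poses, ended near ({path[-1][0]:.1f}, {path[-1][1]:.1f})")
```

In the full scenario this geometric layer would sit beneath the CNN steering model and the satellite-error feed, with degraded positioning solutions perturbing the rover’s estimate of its own (x, y) rather than the breadcrumb locations.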

REFERENCES
Bingham, L., Kincaid, J., Weno, B., Davis, N., Paddock, E., & Foreman, C. (2023). Digital Lunar Exploration Sites Unreal
Simulation Tool (DUST). 2023 IEEE Aerospace Conference, 1–12. https://doi.org/10.1109/AERO55745.2023.10115607
Shah, S., Dey, D., Lovett, C., & Kapoor, A. (2017). AirSim: High-fidelity visual and physical simulation for autonomous
vehicles. Field and Service Robotics. https://arxiv.org/abs/1705.05065
Smith, D. E., Zuber, M. T., Neumann, G. A., Lemoine, F. G., Torrence, M. H., McGarry, J. F., Rowlands, D. D., et al.
(2010). Initial observations from the Lunar Orbiter Laser Altimeter (LOLA). Geophysical Research Letters, 37(18).
https://doi.org/10.1029/2010GL043751
Tooley, C. R., Houghton, M. B., Saylor Jr., S. S., Peddie, C., Everett, D. F., Baker, C. L., & Safdie, K. N. (2010). Lunar
Reconnaissance Orbiter mission and spacecraft design. Space Science Reviews, 150, 23–62.
https://doi.org/10.1007/s11214-009-9624-4


