Title: Low Cost, Standards based EO/IR Payload Simulation for Visual Aided Navigation Applications
Author(s): B. Thompson, C. Newborn, P. Jackson, T. Pitt, G. Reynolds
Published in: Proceedings of IEEE/ION PLANS 2018
April 23 - 26, 2018
Hyatt Regency Hotel
Monterey, CA
Pages: 447 - 455
Cite this article: Thompson, B., Newborn, C., Jackson, P., Pitt, T., Reynolds, G., "Low Cost, Standards based EO/IR Payload Simulation for Visual Aided Navigation Applications," Proceedings of IEEE/ION PLANS 2018, Monterey, CA, April 2018, pp. 447-455.
Abstract: As U.S. Department of Defense (DoD) programs begin to incorporate vision aiding into the navigation systems of Unmanned Aircraft Systems (UAS), it becomes necessary to augment laboratory simulators of the UAS with realistic video simulation. Laboratory simulation for testing of medium to large UAS is attractive for managing overall program expenses and for removing constraints on test execution. The benign dynamics of typical UAS, along with the availability of Commercial off-the-shelf (COTS) scene generation software with good terrain datasets, allow the Gray Eagle Modeling, Navigation, and Integration (GEMNI) laboratory and the U.S. Army Product Office for Medium Altitude Endurance (PM-MAE) to pursue a low-cost approach using a high-end desktop Windows simulator. The capabilities and limitations of this approach will be discussed. A general simulator architecture will be presented for the generation of synthetic video to model typical electro-optical/infrared (EO/IR) fixed or gimbaled payload sources in applications where UAS are flying in standard airspace over mapped regions of land and at distances such that map data resolution limitations are not significant. In the simulator architecture, a high-performance multi-core Windows desktop simulator is used to model the dynamics of the gimbal, emulate the software loaded on the sensor for controls, and host platform communications at rates of up to 512 Hz. Hard real-time simulation constraints and timing are allocated to hardware. The simulator interacts with the aircraft six degrees of freedom (6-DOF) simulator, the GEMNI Position Navigation and Timing (PNT) testbed, and the aircraft avionics data buses.

COTS scene-generation software continues to improve as desktop computing and graphics technologies improve in performance and as the fidelity of data for Geospatial Information Systems (GIS) improves. Innovations such as the U.S. Geological Survey (USGS) National Elevation Dataset (NED) provide substantial improvements in vertical accuracy over the Digital Terrain Elevation Data (DTED), Shuttle Radar Topography Mission (SRTM), and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) datasets. Improvements in desktop computer storage capacity and the use of commercial drones equipped with high-resolution cameras and lidar/radar are also enabling the use of high-resolution terrain maps in GIS. With these improvements in available data and technology, it is possible to design EO/IR sensor simulators that support the operational envelope of UAS for visual navigation applications over a wide geographical area without extensive site-specific GIS data collection and processing.

An error analysis will be presented showing the coupling between the UAS performance and operating envelope and the GIS data quality and simulation timing errors. To accurately predict visual navigation performance in the laboratory environment, the sensor model itself must be accurate. With modern sensors now incorporating inertial sensors and performing master/slave transfer alignment from the aircraft platform to improve their own navigation solutions, these processes must be accurately modeled in the simulation. The error modeling approach will be discussed along with plans to perform model verification and validation (V&V).

Simulation development efforts also benefit from DoD and industry standardization initiatives. UAS designs are increasingly standardizing their datalink and video products around North Atlantic Treaty Organization (NATO), Motion Imagery Standards Board (MISB), and Simulation Interoperability Standards Organization (SISO) standards. The Common Image Generator Interface (CIGI) is a standard that is finding increasing support among COTS/Government off-the-shelf (GOTS) scene generator developers. Thus, any sensor simulation supporting CIGI will have options in its choice of scene generator for graphics rendering.
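The kind of error coupling the abstract describes can be illustrated with a rough back-of-the-envelope calculation. The sketch below is not taken from the paper; the function, its parameter names, and the sample numbers are illustrative assumptions. It combines three independent ground-projection error sources (platform motion during an uncompensated timing error, a small-angle pointing error at slant range, and DEM vertical error at a shallow look angle) in a root-sum-square.

```python
import math

def ground_error_m(speed_mps, timing_err_s, attitude_err_mrad,
                   slant_range_m, dem_vert_err_m, depression_deg):
    """Rough RSS of three independent ground-projection error sources.

    Illustrative only: real error budgets would include sensor model,
    transfer-alignment, and rendering-latency terms.
    """
    # 1. Distance the platform travels during an uncompensated timing error.
    e_timing = speed_mps * timing_err_s
    # 2. Small-angle pointing error projected to the ground at slant range.
    e_attitude = slant_range_m * attitude_err_mrad * 1e-3
    # 3. DEM vertical error projected along a shallow depression angle.
    e_terrain = dem_vert_err_m / math.tan(math.radians(depression_deg))
    return math.hypot(e_timing, e_attitude, e_terrain)

# Example (assumed values): 80 m/s cruise, 1 ms timing error, 1 mrad
# pointing error at 5 km slant range, 1.5 m DEM vertical error, 20 deg
# depression angle -> a ground-projection error of several meters,
# dominated by the pointing and terrain terms.
err = ground_error_m(80.0, 0.001, 1.0, 5000.0, 1.5, 20.0)
```

Even this crude model shows why NED-class vertical accuracy and sub-millisecond timestamping matter: at the shallow depression angles typical of long-range EO/IR staring, the terrain and pointing terms dwarf the timing term for a slow-flying UAS.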