
Session C2: Vision/Integrated Navigation Systems

Low-Cost, Standards-Based EO/IR Payload Simulation for Visual Aided Navigation Applications
Ben Thompson, Dynetics; Gregory Reynolds, Timothy Pitt, Army AMRDEC; Paul Jackson, Peter Duong, Craig Newborn, Dynetics
Location: Windjammer

As U.S. Department of Defense (DoD) programs begin to incorporate vision aiding into the navigation systems of Unmanned Aircraft Systems (UAS), it becomes necessary to augment laboratory simulators of the UAS with realistic video simulation. Laboratory simulation for testing of medium to large UAS is attractive for managing overall program expenses and for removing constraints on test execution. The benign dynamics of typical UAS, along with the availability of Commercial Off The Shelf (COTS) scene generation software with good terrain datasets, allow the Gray Eagle Modeling, Navigation and Integration (GEMNI) laboratory and the U.S. Army Product Office for Medium Altitude Endurance (PM-MAE) to pursue a low-cost approach using a high-end desktop Windows simulator. The capabilities and limitations of this approach will be discussed. A general simulator architecture will be presented for the generation of synthetic video to model typical EO/IR fixed or gimbaled payload sources in applications where UAS are flying in standard airspace over mapped regions of land and at distances such that the map data resolution limitations are not significant.
In the simulator architecture, a high-performance multi-core Windows desktop simulator is used to model the dynamics of the gimbal and emulate the software loaded on the sensor for controls and host platform communications at rates of up to 512 Hz. Hard real-time simulation constraints and timing are allocated to hardware. The simulator interacts with the aircraft 6-DOF simulator, the GEMNI Position Navigation and Timing (PNT) testbed, and with the aircraft avionics data buses.
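The kind of fixed-rate gimbal dynamics update described above can be sketched as follows. This is a minimal illustration only: the second-order servo model, its gains, and the 512 Hz step loop are assumptions for demonstration, not the actual GEMNI implementation.

```python
# Hypothetical sketch: stepping a one-axis gimbal servo model at a fixed
# 512 Hz rate, as a desktop payload simulator might do.
import math

RATE_HZ = 512
DT = 1.0 / RATE_HZ

class GimbalAxis:
    """Second-order servo model: tracks a commanded angle with natural
    frequency wn (rad/s) and damping zeta. Values are illustrative."""
    def __init__(self, wn=20.0, zeta=0.7):
        self.wn, self.zeta = wn, zeta
        self.angle = 0.0   # rad
        self.rate = 0.0    # rad/s

    def step(self, cmd_angle, dt=DT):
        # Standard 2nd-order servo: accel = wn^2*(cmd - angle) - 2*zeta*wn*rate
        accel = (self.wn ** 2) * (cmd_angle - self.angle) \
                - 2.0 * self.zeta * self.wn * self.rate
        self.rate += accel * dt
        self.angle += self.rate * dt
        return self.angle

az = GimbalAxis()
for _ in range(RATE_HZ):          # simulate one second of motion
    az.step(math.radians(10.0))   # slew toward a 10 degree command
print(f"azimuth after 1 s: {math.degrees(az.angle):.2f} deg")
```

In a real payload simulator the loop timing would be enforced by the hardware the abstract allocates hard real-time constraints to, rather than by free-running host code.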
COTS scene generation software continues to improve as desktop computing and graphics technologies improve in performance, and as the fidelity of data for Geospatial Information Systems (GIS) improves. Innovations such as the U.S. Geological Survey (USGS) National Elevation Dataset (NED) provide substantial improvements over DTED and the SRTM/ASTER datasets for vertical accuracy of GIS. Improvements in desktop computer storage capacity and the utilization of commercial drones equipped with high-resolution cameras and LIDAR/RADAR are also allowing the usage of high-resolution terrain maps in GIS.
With these improvements in available data and technology, it is possible to design EO/IR sensor simulators that support the operational envelope of UAS for visual navigation applications over a wide geographical area without extensive site-specific GIS data collection and processing. An error analysis will be presented showing how the UAS performance and operating envelope couple to the GIS data quality and simulation timing errors.
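One term-level view of such a coupling can be sketched as a simple error budget. All of the numbers below (ground speed, timing error, terrain vertical error, depression angle) are assumed for illustration and are not values from the abstract.

```python
# Illustrative error budget: RSS of ground-point error from simulation
# timing latency and from GIS terrain vertical error projected along the
# sensor line of sight. All values are assumed, not from the paper.
import math

v_uas = 80.0      # m/s, UAS ground speed (assumed)
t_err = 0.002     # s, simulation timing error (assumed)
sigma_z = 2.0     # m, terrain vertical error, NED-class data (assumed)
elev_deg = 30.0   # deg, sensor depression angle to the ground point (assumed)

e_timing = v_uas * t_err                                 # along-track smear
e_terrain = sigma_z / math.tan(math.radians(elev_deg))   # horizontal shift
e_total = math.hypot(e_timing, e_terrain)                # root-sum-square
print(f"timing={e_timing:.2f} m, terrain={e_terrain:.2f} m, "
      f"total={e_total:.2f} m")
```

The projection term shows why shallow grazing angles amplify vertical map error into horizontal ground error, which is one way data quality and operating envelope couple.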
To accurately predict visual navigation performance in the laboratory environment, the sensor model itself must be accurate. With modern sensors now incorporating inertial sensors and performing master/slave transfer alignment from the aircraft platform to improve their own navigation solution, these processes must be accurately modeled in the simulation. The error modeling approach will be discussed along with plans to perform model verification and validation (V&V).
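One small piece of such an error model can be sketched as a boresight misalignment between the aircraft "master" attitude and the payload "slave" frame. The small-angle formulation and the misalignment values are assumptions for illustration, not the abstract's actual model.

```python
# Hedged sketch: applying a small-angle boresight misalignment between a
# master (aircraft) attitude DCM and the slave (payload) frame.
# Misalignment values are illustrative only.
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v]x such that skew(v) @ w == cross(v, w)."""
    return np.array([[0.0, -v[2],  v[1]],
                     [v[2],  0.0, -v[0]],
                     [-v[1], v[0],  0.0]])

def apply_misalignment(C_master, mis_rad):
    """Small-angle approximation: C_slave ~= (I - [mis]x) @ C_master."""
    return (np.eye(3) - skew(mis_rad)) @ C_master

mis = np.radians([0.05, -0.03, 0.02])  # deg-level boresight errors (assumed)
C_master = np.eye(3)                   # level, north-pointing for simplicity
C_slave = apply_misalignment(C_master, mis)
```

A fuller transfer-alignment model would also estimate these angles from matched velocity or attitude observations during maneuvers; the sketch only shows how a fixed misalignment perturbs the slave attitude.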
Simulation development efforts also benefit from DoD and industry standardization initiatives. UAS designs are increasingly standardizing their datalink and video products around NATO, MISB, and SISO standards. The Common Image Generator Interface (CIGI) is a proposed standard that is finding increasing support among COTS/GOTS scene generator developers, so any sensor simulation supporting CIGI will have multiple options for which scene generator to use for graphics rendering.
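The decoupling this enables can be sketched with a simple interface abstraction: the sensor simulation drives an abstract scene-generator interface, so any CIGI-capable renderer can be substituted behind it. The class and method names below are hypothetical and are not part of the CIGI standard or any real renderer's API.

```python
# Sketch of decoupling a sensor simulation from its renderer via an
# abstract interface. Names are illustrative, not from CIGI.
from abc import ABC, abstractmethod

class SceneGenerator(ABC):
    """Minimal stand-in for whatever a CIGI-capable renderer exposes."""
    @abstractmethod
    def set_view(self, lat, lon, alt_m, yaw, pitch, roll): ...
    @abstractmethod
    def render_frame(self) -> bytes: ...

class StubRenderer(SceneGenerator):
    """Placeholder renderer; a real one would rasterize terrain imagery."""
    def set_view(self, lat, lon, alt_m, yaw, pitch, roll):
        self.view = (lat, lon, alt_m, yaw, pitch, roll)
    def render_frame(self) -> bytes:
        return b"\x00" * 64  # placeholder for a rendered video frame

def simulate_step(ig: SceneGenerator) -> bytes:
    # Example eyepoint state; in practice this comes from the 6-DOF sim.
    ig.set_view(34.7, -86.6, 1500.0, 0.0, -30.0, 0.0)
    return ig.render_frame()

frame = simulate_step(StubRenderer())
```

Swapping scene generators then means providing another `SceneGenerator` implementation, with the CIGI protocol playing the role of this interface on the wire.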

DISTRIBUTION A. Approved for public release: distribution unlimited.


