Abstract: | This paper introduces a novel self-localization method using passive optical data from stars and satellites captured with event cameras. Unlike conventional frame-based electro-optical (EO) or infrared (IR) cameras, event cameras report changes in brightness asynchronously, enabling precise low-latency tracking of motion. This unique sensing capability offers several advantages, such as low power consumption, high dynamic range, and enhanced sensitivity to dim objects against bright backgrounds, potentially allowing daytime detection of celestial objects. Leveraging these attributes, we develop an algorithm that processes event camera data to identify and track stellar and satellite objects using a random sample consensus (RANSAC)-based technique. The proposed approach exploits known star positions to orient the optical sensor's focal plane and uses the satellite's associated events for precise self-localization. A factor graph framework integrates the sensor's estimated position, satellite ephemeris, and event camera data to refine the observer's latitude and longitude. Our results demonstrate the algorithm's effectiveness in GPS-denied environments, showing promise for accurate self-localization even in challenging lighting conditions. This is the first known application of event cameras for self-localization with optical observations of satellites, extending prior work in celestial navigation and in using event cameras for space-based applications. |
Published in: |
Proceedings of the 2025 International Technical Meeting of The Institute of Navigation, January 27-30, 2025, Hyatt Regency Long Beach, Long Beach, California |
Pages: | 69-79 |
Cite this article: | Nelson, Kaleb, Taylor, Clark, Oliver, Rachel, "Event-Based Vision and Factor Graph-Based Approach for Sensor Geolocation from Observations of an Orbiting Satellite," Proceedings of the 2025 International Technical Meeting of The Institute of Navigation, Long Beach, California, January 2025, pp. 69-79. https://doi.org/10.33012/2025.20002 |