Tight Integration of Vision/Radar with INS/GNSS for Reliable Navigation in Autonomous Applications

Dylan Krupity, Noah Giustini, Igor Popov, Parham Nooralishahi, Hallet Duan, Mohammad Mohammadi Jahromi, Zhengwei Li, Jacques Georgy, and Christopher Goodall

Abstract: The fusion of data from different sensors makes localization more accurate and reliable. A common method is to integrate perception sensors with navigation systems to improve position estimates. The addition of perception sensors helps mitigate the limitations of other technologies, such as GNSS multipath or signal loss in urban areas or tunnels, and INS drift over time. This paper presents AUTO, a real-time navigation system that fuses data from a global navigation satellite system (GNSS) receiver, an inertial navigation system (INS), an odometer, and perception sensors (such as radar and stereo cameras) to estimate a platform's position reliably and accurately. AUTO demonstrates a novel method for positioning that combines stereo vision and radar sensors without requiring a pre-built map. The presented approach avoids many common problems associated with a dependency on High Definition (HD) maps, including high financial cost, large data storage and processing requirements, and the need for frequent map updates in dynamic environments. AUTO uses a tight non-linear integration of vision/radar with INS/GNSS to build maps in-run during the current session, and uses those maps to aid the navigation solution when GNSS is unreliable. The presented methodology shows how this map information is applied as absolute position updates to the solution. The results demonstrate how AUTO uses the in-run maps to achieve a lane-level solution in high-multipath and urban canyon environments. It is also shown that the solution can successfully handle temporary loss of GNSS, demonstrated by simulating GNSS outages and calculating the errors during such outages. AUTO was tested in many different environments, locations, and lighting and weather conditions to ensure the robustness and reliability needed for autonomous applications.
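The absolute position updates mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (the paper describes a tight non-linear integration); it shows only the generic idea of applying a map-matched absolute position fix as a Kalman measurement update to a drifted INS position prior. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def map_aided_update(x, P, z, R):
    """Kalman measurement update applying an absolute position fix
    (e.g., from matching current radar/vision detections against an
    in-run map) to a 2D position state estimate.

    x : (2,) prior position [east, north], metres
    P : (2, 2) prior covariance
    z : (2,) map-matched absolute position measurement
    R : (2, 2) measurement noise covariance
    """
    H = np.eye(2)                      # position is measured directly
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_post = x + K @ (z - H @ x)       # corrected position
    P_post = (np.eye(2) - K @ H) @ P   # corrected covariance
    return x_post, P_post

# Example: an INS prior with 5 m drift, corrected by a 1 m map fix
x_prior = np.array([100.0, 50.0])
P_prior = np.diag([25.0, 25.0])
z_fix = np.array([103.0, 47.0])
R_fix = np.diag([1.0, 1.0])
x_post, P_post = map_aided_update(x_prior, P_prior, z_fix, R_fix)
```

In this toy example the posterior position is pulled strongly toward the accurate map-matched fix, and the posterior variance falls below both the prior and the measurement variance, which is the mechanism that lets in-run map updates bound drift during GNSS outages.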
Published in: Proceedings of the 38th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2025)
September 8 - 12, 2025
Hilton Baltimore Inner Harbor
Baltimore, Maryland
Pages: 576 - 591
Cite this article: Krupity, Dylan, Giustini, Noah, Popov, Igor, Nooralishahi, Parham, Duan, Hallet, Jahromi, Mohammad Mohammadi, Li, Zhengwei, Georgy, Jacques, Goodall, Christopher, "Tight Integration of Vision/Radar with INS/GNSS for Reliable Navigation in Autonomous Applications," Proceedings of the 38th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2025), Baltimore, Maryland, September 2025, pp. 576-591. https://doi.org/10.33012/2025.20398