Abstract: | Global Navigation Satellite System (GNSS) receivers are commonly used sensors for Unmanned Aerial Vehicle (UAV) localization. However, GNSS-based localization degrades at low altitudes due to multipath interference and Non-Line-of-Sight (NLOS) signals, which reduce positioning accuracy. To address these challenges, we propose a sensor fusion framework that integrates Neural Radiance Fields (NeRFs) with Graph Neural Network (GNN)-enhanced GNSS positioning. NeRFs provide a compact, continuous, and detailed 3D representation of urban landscapes, offering photorealistic views that enable precise map-matching and environmental reconstruction. In our framework, we train a NeRF model to accurately represent the environment in which the UAV is flying and use the discrepancy between UAV-captured images and NeRF-rendered scenes to compute visual corrections. These corrections are then integrated with the GNSS corrections learned by the GNN and used to predict the UAV’s absolute position. The proposed approach is validated on real-world UAV datasets collected in urban environments, with emulated GNSS pseudorange measurements. The results demonstrate that our framework improves UAV localization accuracy over traditional GNSS- and camera-based positioning methods under varying measurement noise conditions. |
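The abstract describes a two-stream correction pipeline: a visual correction derived from the discrepancy between UAV-captured images and NeRF renderings, fused with a GNSS correction learned by a GNN. The following minimal Python/PyTorch sketch illustrates that idea only; it is not the authors' implementation, and `nerf_render`, `toy_render`, `gnn_corr`, and the fixed fusion weight are all illustrative assumptions.

```python
import torch

def visual_correction(nerf_render, captured_img, pose_init, steps=100, lr=1e-2):
    """Estimate a position offset by minimizing the photometric discrepancy
    between the UAV-captured image and the NeRF rendering.
    Assumes `nerf_render` is differentiable with respect to the pose."""
    delta = torch.zeros(3, requires_grad=True)   # visual position correction
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(
            nerf_render(pose_init + delta), captured_img)
        loss.backward()
        opt.step()
    return delta.detach()

def fuse(gnss_pos, gnn_corr, vis_corr, w=0.5):
    """Combine the GNN-learned GNSS correction with the NeRF-derived visual
    correction. A fixed weight is used here; the paper's fusion may differ."""
    return gnss_pos + w * gnn_corr + (1.0 - w) * vis_corr

# Toy demo with a stand-in differentiable "renderer": a flat image whose
# brightness encodes the distance of the pose from the origin.
true_pos = torch.tensor([10.0, 5.0, 2.0])
toy_render = lambda pose: torch.ones(8, 8) * pose.norm()
captured = toy_render(true_pos)                     # UAV-captured image
coarse = true_pos + torch.tensor([0.6, -0.4, 0.3])  # noisy GNSS-only fix
vis_corr = visual_correction(toy_render, captured, coarse)
gnn_corr = torch.zeros(3)                           # placeholder GNN output
print(fuse(coarse, gnn_corr, vis_corr))
```

In the actual framework, the renderer would be a trained NeRF of the flight environment and the GNN correction would come from learned pseudorange-based positioning; the sketch only shows how the two corrections can be combined into an absolute position estimate.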
Published in: | Proceedings of the 2024 International Technical Meeting of The Institute of Navigation, January 23-25, 2024, Hyatt Regency Long Beach, Long Beach, California |
Pages: | 604-617 |
Cite this article: | Mohanty, Adyasha, Gao, Grace, "Fusing GNN-enhanced GNSS with Neural Radiance Fields for UAV Navigation," Proceedings of the 2024 International Technical Meeting of The Institute of Navigation, Long Beach, California, January 2024, pp. 604-617. https://doi.org/10.33012/2024.19518 |