Abstract: Relative navigation capability is crucial to autonomous operation of unmanned aircraft systems (UAS) in GPS-degraded or GPS-denied environments. This paper focuses on the relative pose estimation problem using a collaborative stereo vision system based on a UAS formation with a single camera installed on each team member. The relative pose is reconstructed by combining structure-from-motion (SfM) results with inter-vehicle ranging measurements. The concept is validated using simulation data and ground test data from an RGBD camera and an indoor Vicon motion capture system. The stereo vision ranging estimates match the collected RGBD range measurements. A similar concept is also tested using flight data from a single UAS to simulate UAS leader-follower formation flight.
Proceedings of the ION 2017 Pacific PNT Meeting
May 1 - 4, 2017
Marriott Waikiki Beach Resort & Spa
Pages: 1077 - 1081
Cite this article:
Chao, Haiyang, Brink, Kevin, Miller, Mikel, "Collaborative Stereo Vision based Relative Pose Estimation Using UAS Formation Flight," Proceedings of the ION 2017 Pacific PNT Meeting, Honolulu, Hawaii, May 2017, pp. 1077-1081.