This article presents the theoretical foundation and experimental results for a collaborative navigation scenario between manned and unmanned systems. The scenario includes several Unmanned Ground Vehicles (UGVs) and/or dismounted soldiers operating in a GPS-challenged or GPS-denied environment, and Unmanned Aerial Vehicles (UAVs) flying above the environment with access to GPS signals. The goal is a collaborative navigation architecture that provides relative position, velocity, and attitude from the UAV to the ground units. The performance of the collaborative navigation system was evaluated on real-world data. A UAV instrumented with a payload consisting of a GPS receiver, a MEMS IMU, a LiDAR, and an RGB camera was flown over an area where several vehicles and people (features) were operating. The UAV is denoted as the secondary unit. During GPS outages, the vision sensor onboard the UAV was used to detect and track the features. Each tracked feature is denoted as a primary unit, and the relative position obtained from the UAV was used in the relative navigation filter for each primary/secondary pair. Results show improved navigation performance when the relative observations are utilized during GPS outages. Specifically, when GPS is unavailable, the drift of the INS solution is bounded by the external measurements provided by the relative Extended Kalman Filter and the tracking filter, maintaining the desired performance in GPS-adverse conditions.
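The abstract does not detail the relative navigation filter itself. As a rough illustration only, the sketch below shows a generic Extended Kalman Filter measurement update driven by a relative-position observation of the kind the UAV's vision sensor would supply. The state layout, function name `relative_position_update`, and noise values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def relative_position_update(x, P, z_rel, R):
    """One EKF measurement update using a relative-position observation.

    x     : (6,) state [relative position (3), relative velocity (3)] -- assumed layout
    P     : (6,6) state covariance
    z_rel : (3,) relative position of the primary unit measured from the UAV
    R     : (3,3) measurement noise covariance
    """
    # Measurement model: the observation sees only the position part of the state.
    H = np.hstack([np.eye(3), np.zeros((3, 3))])
    y = z_rel - H @ x                   # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y                   # corrected state
    P_new = (np.eye(6) - K @ H) @ P     # corrected covariance
    return x_new, P_new
```

In a setup like the one described, an update of this form would be applied for each primary/secondary pair whenever the tracking filter produces a relative-position fix, which is what bounds the INS drift between fixes.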