
Session D1: Robotic and Indoor Navigation

Vision-aided Inertial Navigation for Head-worn Wearable Devices
Ahmed Arafa, Simon Fraser University; Neda Parnian, Intel Corp.; Edward J. Park, Simon Fraser University
Location: Spyglass
Alternate Number 2

Objective
The aim of this work is to perform real-time vision-aided inertial navigation for head-worn wearable devices. For such devices, the use of an Inertial Measurement Unit (IMU) to compute heading in a Pedestrian Dead Reckoning (PDR) framework is challenging due to the continuous motion of the user’s head. We propose augmenting an IMU with a monocular camera to compensate for the user’s head motion and provide a reliable heading estimate in GPS-denied environments.
Motivation
Earlier work on head-worn PDR assumed that the user's head always points in the direction of motion [1], which is unrealistic. More recently, the authors of [2] and [3] addressed this problem. In [2], the head-worn measurements were augmented with those of a smartwatch to improve PDR heading estimation by finding the direction of arm-swing. This was done by performing Principal Component Analysis (PCA) on the smartwatch's horizontal acceleration expressed in the navigation frame.
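For illustration only, the Python sketch below shows how PCA over navigation-frame horizontal acceleration yields the dominant arm-swing axis, whose direction approximates the walking direction up to a 180-degree ambiguity. It is a simplified approximation of the idea in [2], not the authors' implementation; the function and variable names are illustrative assumptions.

```python
# Minimal sketch (not the implementation in [2]): estimate the walking
# direction as the dominant axis of navigation-frame horizontal acceleration.
import numpy as np

def walking_direction_pca(acc_ne):
    """Heading (rad) of the dominant horizontal-acceleration axis.

    acc_ne : (N, 2) array of [north, east] acceleration samples over one
    analysis window. The result is ambiguous by 180 degrees and must be
    resolved separately (e.g., from the forward-acceleration pattern).
    """
    centered = acc_ne - acc_ne.mean(axis=0)        # remove mean (sensor bias)
    cov = np.cov(centered, rowvar=False)           # 2x2 covariance of N/E accel
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigen-decomposition
    principal = eigvecs[:, np.argmax(eigvals)]     # axis of largest variance
    return np.arctan2(principal[1], principal[0])  # heading: atan2(east, north)
```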
In [3], the authors used only a head-mounted IMU containing a tri-axial accelerometer and a tri-axial gyroscope. A threshold on the gyroscope output was employed to determine whether the user changed their heading direction or whether the measured change was only due to head motion. One drawback is that the system could not distinguish simultaneous head and body rotations (i.e., it could not detect turns).
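The following sketch illustrates the thresholding idea in simplified form. The threshold value and windowing logic are assumptions made for illustration and are not the parameters used in [3].

```python
# Simplified sketch of gyro-threshold classification (illustrative values only):
# a transient head glance integrates back toward zero yaw, while a genuine
# heading change leaves a sustained net rotation over the analysis window.
import numpy as np

def classify_yaw_change(yaw_rate, dt, persist_thresh=0.35):
    """Classify a window of gyro yaw-rate samples (rad/s).

    yaw_rate : array of yaw-rate samples over one window.
    dt       : sample period (s).
    persist_thresh : assumed net-yaw threshold (rad), illustrative only.
    """
    net_yaw = np.sum(yaw_rate) * dt    # net yaw change over the window (rad)
    return "heading_change" if abs(net_yaw) > persist_thresh else "head_motion"
```

As noted above, such a rule cannot separate a turn from a simultaneous head rotation, which motivates adding a camera.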
The work proposed herein fuses the readings of an IMU and a monocular camera onboard a consumer head-worn wearable device to reliably determine the heading direction. A Multi-State Constraint Kalman Filter (MSCKF) [4] is proposed that combines the camera's optical-flow features [5] with the IMU-derived heading (yaw) angle. To validate the algorithm, Recon Instruments' Recon Jet computing goggles were employed as the experimental platform. The Recon Jet contains a tri-axial accelerometer, a tri-axial gyroscope, a tri-axial magnetometer, a monocular camera, and a GPS receiver.
Methodology and Results
The heading estimation problem can be divided into three cases. The first is the trivial case where the user's head points in the direction of travel. In the second case, the head is not aligned with the direction of motion; for instance, the user looks to their right or left while walking straight. In the third case, the user performs an actual turn (simultaneous head and body motion).
Our preliminary results show that the proposed MSCKF, which uses the IMU-derived heading in the prediction phase and corrects it using the camera's optical flow, can reliably distinguish between the three cases above, resulting in an accurate PDR solution.
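For illustration, the scalar sketch below shows the predict/correct structure on the heading angle alone. It is not the full MSCKF of [4], which maintains a sliding window of camera poses and per-feature constraints; the noise values, and the assumption that a camera-derived yaw measurement can be obtained from the rotational component of the optical-flow field, are illustrative only.

```python
# Minimal scalar-state sketch of the predict/correct idea (NOT the MSCKF of [4]):
# the IMU yaw rate drives the prediction, and a camera-derived yaw measurement
# (assumed available from optical flow [5]) corrects it.
import numpy as np

class YawFilter:
    def __init__(self, yaw0=0.0, var0=0.1, q=1e-3, r=5e-2):
        self.yaw, self.var = yaw0, var0   # heading estimate (rad) and its variance
        self.q, self.r = q, r             # process / measurement noise (assumed)

    def predict(self, gyro_yaw_rate, dt):
        # Propagate the heading with the IMU-derived yaw rate.
        self.yaw += gyro_yaw_rate * dt
        self.var += self.q

    def correct(self, cam_yaw):
        # cam_yaw: camera-derived heading, e.g., accumulated rotational
        # component of the optical-flow field (an assumption of this sketch).
        innovation = np.arctan2(np.sin(cam_yaw - self.yaw),
                                np.cos(cam_yaw - self.yaw))  # wrap to [-pi, pi]
        k = self.var / (self.var + self.r)                   # Kalman gain
        self.yaw += k * innovation
        self.var *= (1.0 - k)
        return self.yaw
```

In this simplified view, a head glance produces gyro rotation that the camera measurement pulls back, whereas a true turn is confirmed by both sensors and persists in the heading estimate.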
The accuracy of the proposed MSCKF PDR solution is compared on a map against the GPS track reported by the Recon Jet. The algorithm is also tested on multiple human participants to assess its reliability and robustness.

References
[1] S. Beauregard, "A helmet-mounted pedestrian dead reckoning system," 3rd International Forum on Applied Wearable Computing, pp. 1-11, 2006.
[2] D. Loh, S. Zihajehzadeh, R. Hoskinson, H. Abdollahi, and E. J. Park, "Pedestrian dead reckoning with smartglasses and smartwatch," IEEE Sensors Journal, vol. 16, no. 22, pp. 8132-8141, 2016.
[3] J. Windau and L. Itti, "Walking compass with head-mounted IMU sensor," IEEE International Conference on Robotics and Automation (ICRA), pp. 5542-5547, 2016.
[4] A. I. Mourikis and S. I. Roumeliotis, "A multi-state constraint Kalman filter for vision-aided inertial navigation," IEEE International Conference on Robotics and Automation (ICRA), pp. 3565-3572, 2007.
[5] T. Kroeger, R. Timofte, D. Dai, and L. Van Gool, "Fast optical flow using dense inverse search," European Conference on Computer Vision (ECCV), pp. 471-488, 2016.


