
Session A5b: Integrated Inertial Navigation Systems

A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet
Hanyuan Fu, Univ Gustave Eiffel, AME-GEOLOC; Thomas Bonis, Univ Gustave Eiffel, Univ Paris Est Creteil, CNRS, LAMA; Valerie Renaudin, Univ Gustave Eiffel, AME-GEOLOC; Ni Zhu, Univ Gustave Eiffel, AME-GEOLOC
Location: Spyglass
Date/Time: Thursday, Apr. 27, 10:40 a.m.

Abstract—In this paper, we propose an image-based neural network approach (PatternNet) for walking direction estimation with wearable inertial sensors. Gait event segmentation and projection are used to convert the inertial signals into image-like tabular samples, from which a convolutional neural network (CNN) extracts geometrical features for walking direction inference. To accommodate the diversity of individual walking characteristics and the different ways of carrying the device, tailor-made models are constructed based on each user's gait characteristics and the device-carrying mode. Experimental assessments of the proposed method and a competing method (RoNIN) are carried out in real-life situations over a total walking distance of 3 km, covering indoor and outdoor environments and involving both sighted and visually impaired volunteers carrying the device in three different ways: texting, swinging, and in a jacket pocket. PatternNet estimates the walking direction with a mean accuracy of 7 to 10 degrees for the three test persons and is about 1.5 times more accurate than the RoNIN estimates.
Index Terms—Indoor positioning, inertial sensors, pedestrian navigation, pedestrian dead reckoning, walking direction, deep learning
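
The abstract describes a pipeline of gait-event segmentation, projection into image-like samples, and CNN-based direction regression. The sketch below is a minimal illustration of that kind of pipeline under stated assumptions: the sample shape, resampling scheme, layer layout, and all function names (stride_to_sample, DirectionCNN) are hypothetical choices for illustration and are not the authors' actual PatternNet design.

```python
# Illustrative sketch only: one stride of IMU data is resampled into a
# fixed-size "image-like" sample, and a small CNN regresses the walking
# direction as a (cos, sin) unit vector. All shapes and layers are assumed.
import numpy as np
import torch
import torch.nn as nn


def stride_to_sample(imu_window: np.ndarray, length: int = 64) -> np.ndarray:
    """Resample one stride of IMU data (T x 6: accel + gyro) to a fixed-size
    sample of shape (6, length)."""
    t_src = np.linspace(0.0, 1.0, imu_window.shape[0])
    t_dst = np.linspace(0.0, 1.0, length)
    resampled = np.stack(
        [np.interp(t_dst, t_src, imu_window[:, c]) for c in range(imu_window.shape[1])]
    )
    return resampled.astype(np.float32)


class DirectionCNN(nn.Module):
    """Toy CNN mapping one gait-cycle sample to a 2D unit vector (cos, sin),
    which avoids the wrap-around problem of regressing the angle directly."""

    def __init__(self, channels: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).squeeze(-1)
        v = self.head(z)
        return v / v.norm(dim=-1, keepdim=True)


if __name__ == "__main__":
    stride = np.random.randn(120, 6)               # one stride, 120 IMU samples
    sample = torch.from_numpy(stride_to_sample(stride)).unsqueeze(0)
    direction = DirectionCNN()(sample)             # shape (1, 2) = (cos, sin)
    heading_deg = torch.rad2deg(torch.atan2(direction[0, 1], direction[0, 0]))
    print(f"estimated walking direction: {heading_deg.item():.1f} deg")
```

In practice the paper trains tailor-made models per user and per carrying mode (texting, swinging, jacket pocket); the sketch above only shows the shared sample-construction and regression idea, not that model-selection step.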


