A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet

Hanyuan Fu, Thomas Bonis, Valerie Renaudin, Ni Zhu

Abstract: In this paper, we propose an image-based neural network approach (PatternNet) for walking direction estimation with wearable inertial sensors. Gait event segmentation and projection are used to convert the inertial signals into image-like tabular samples, from which a convolutional neural network (CNN) extracts geometrical features for walking direction inference. To embrace the diversity of individual walking characteristics and the different ways of carrying the device, tailor-made models are constructed based on each user's gait characteristics and the device-carrying mode. Experimental assessments of the proposed method and a competing method (RoNIN) are carried out in real-life situations over 3 km of total walking distance, covering indoor and outdoor environments and involving both sighted and visually impaired volunteers carrying the device in three different ways: texting, swinging and in a jacket pocket. PatternNet estimates the walking direction with a mean accuracy between 7 and 10 degrees for the three test persons, roughly 1.5 times better than the RoNIN estimates.
Index Terms: Indoor positioning, inertial sensors, pedestrian navigation, pedestrian dead reckoning, walking direction, deep learning
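To illustrate the general idea described in the abstract (a CNN regressing the walking direction from image-like samples built from gait-segmented, projected inertial signals), the following is a minimal sketch, not the authors' implementation: the input shape (1 x 32 x 32), the layer sizes and the (cos, sin) output encoding are assumptions made for illustration only.

# Minimal sketch (assumed architecture, not the PatternNet model from the paper):
# a small CNN mapping an image-like inertial sample to a walking-direction estimate.
import torch
import torch.nn as nn

class WalkingDirectionCNN(nn.Module):
    def __init__(self, in_channels: int = 1):
        super().__init__()
        # Two convolutional blocks extract geometrical features from the
        # image-like inertial sample (assumed here to be 1 x 32 x 32).
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Regress (cos, sin) of the walking direction to avoid angle wrap-around.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64),
            nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        cos_sin = self.head(self.features(x))
        # Recover the heading angle in radians from the (cos, sin) pair.
        return torch.atan2(cos_sin[:, 1], cos_sin[:, 0])

if __name__ == "__main__":
    model = WalkingDirectionCNN()
    sample = torch.randn(4, 1, 32, 32)   # batch of image-like inertial samples
    print(model(sample).shape)           # -> torch.Size([4]) heading angles

In the paper's setting, such a model would be trained separately per user and per device-carrying mode (texting, swinging, jacket pocket), which is what the abstract refers to as tailor-made models.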
Published in: 2023 IEEE/ION Position, Location and Navigation Symposium (PLANS)
April 24 - 27, 2023
Hyatt Regency Hotel
Monterey, CA
Pages: 691 - 699
Cite this article: Fu, Hanyuan, Bonis, Thomas, Renaudin, Valerie, Zhu, Ni, "A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet," 2023 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, April 2023, pp. 691-699. https://doi.org/10.1109/PLANS53410.2023.10140028