Abstract: | In the field of micro aerial vehicles (MAVs), positioning and navigation depend on the environment in which the vehicle operates; inherently different techniques and sensor combinations are used in different environments. MAVs often pass through several environments during a single flight task, such as indoor and outdoor, which raises the questions of which navigation method an MAV should adopt and where it should switch between methods. To ensure the continuity and reliability of an autonomous mission, an MAV must be able to detect its environment. This paper proposes an environment recognition method for MAVs' adaptive navigation. MAVs carry multiple navigation sensors that are sensitive to the environment and can therefore provide environmental cues; however, the computing capability of MAV hardware is largely occupied by flight control. To meet the requirements of adaptive navigation and ensure experimental safety, a two-level environment recognition method for MAVs using a smartphone is proposed. Several smartphone sensors are also sensitive to the environment, such as the position sensor (GPS), the environment sensors (magnetometer, barometer, light intensity sensor), and the image sensor (camera), and can be used for environment feature extraction. The sensor measurements are preprocessed and fed into a deep learning framework that classifies the MAV flight into indoor, intermediate, and outdoor phases. The first level of the recognition system uses the position sensor and environment sensors as input to a framework that integrates convolutional neural networks (CNN) and recurrent neural networks (RNN); it provides environment recognition results to the MAV at a high update rate. The second level uses the image sensor as input to a framework built on convolutional neural networks and is triggered when the first level cannot give a satisfactory result. A MAV-Smartphone combined (MSC) platform and an Android app have been developed to collect data from multiple sensors in different environments during MAV flight. Furthermore, the two-level recognition system has been validated stepwise, and the effectiveness of the proposed environment recognition method has been demonstrated. |
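To make the first-level architecture described in the abstract concrete, the following is a minimal sketch (not the authors' code) of a CNN+RNN classifier over windows of smartphone sensor readings (GPS quality, magnetometer, barometer, light intensity) that outputs the indoor / intermediate / outdoor phase. The channel count, window length, layer sizes, and the use of PyTorch are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class SensorEnvClassifier(nn.Module):
    """Hypothetical first-level classifier: 1-D CNN feature extractor + GRU over time."""
    def __init__(self, n_channels=6, n_classes=3, hidden=64):
        super().__init__()
        # 1-D convolutions extract local features along the time axis of each sensor window
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # GRU models temporal dependence across the window of CNN features
        self.rnn = nn.GRU(input_size=32, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, n_channels, window_length) of preprocessed sensor samples
        feats = self.cnn(x)                # (batch, 32, window_length)
        feats = feats.transpose(1, 2)      # (batch, window_length, 32) for the GRU
        _, h = self.rnn(feats)             # final hidden state: (1, batch, hidden)
        return self.head(h.squeeze(0))     # logits over {indoor, intermediate, outdoor}

# Usage: a batch of 8 windows, each with 6 sensor channels and 50 samples
model = SensorEnvClassifier()
logits = model(torch.randn(8, 6, 50))      # shape (8, 3)
```

The second level described in the abstract would be a separate image-only CNN classifier, invoked only when the confidence of this first-level output is insufficient.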
Published in: |
2018 IEEE/ION Position, Location and Navigation Symposium (PLANS), April 23-26, 2018, Hyatt Regency Hotel, Monterey, CA |
Pages: | 1250 - 1255 |
Cite this article: | Wang, Yaning, Fu, Li, Wang, Lingling, Wang, Yandong, "An Environment Recognition Method for MAVs using a Smartphone," 2018 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, April 2018, pp. 1250-1255. https://doi.org/10.1109/PLANS.2018.8373513 |