
Session C3: Multisensor Integrated Systems and Sensor Fusion Technologies

Design of Integrated Navigation System using IMU and Multiple Ranges from In-Flight Rotating Hexacopter System
Byungjin Lee, Gwangsoo Park, Kangsoo Ryu, Young Jae Lee, SangKyung Sung, Konkuk University, Republic of Korea
Location: Windjammer

This study introduces a method to integrate an INS and multiple range sensors into a navigation system for unmanned aerial vehicles. The purpose of this study is the implementation and performance verification of an algorithm for an embedded sensor module providing indoor as well as outdoor navigation under GNSS signal outage.
To develop this system, a typical INS error model is derived with a 10-element state: three-dimensional position, velocity, and quaternion attitude. Line-of-sight (LOS) distances between the range sensors and obstacles, where the obstacles' polygonal plane models and their absolute geographic information are assumed to be known, serve as the measurements of a Kalman filter. Specifically, this paper proposes a novel filter formulation that combines multiple range residual vectors to construct the observation model.
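The predicted LOS distance against a known plane can be written as a ray-plane intersection. The following is a minimal sketch of that idea; the function and variable names (`predicted_los_distance`, `plane_point`, `plane_normal`) are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def predicted_los_distance(p, u, plane_point, plane_normal):
    """Predicted line-of-sight distance from sensor position p along
    unit ray direction u to a known obstacle plane (point + normal).
    Returns np.inf if the ray is parallel to or points away from the plane."""
    denom = np.dot(plane_normal, u)
    if abs(denom) < 1e-9:
        return np.inf                      # ray parallel to the plane
    d = np.dot(plane_normal, plane_point - p) / denom
    return d if d > 0 else np.inf          # plane is behind the sensor

# Example: sensor at the origin, ray along +x, wall at x = 5
p = np.array([0.0, 0.0, 0.0])
u = np.array([1.0, 0.0, 0.0])
dist = predicted_los_distance(p, u,
                              np.array([5.0, 0.0, 0.0]),
                              np.array([-1.0, 0.0, 0.0]))
# The Kalman residual is then (measured distance - predicted distance).
```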
The LOS distances are measured by the range sensors, and their residuals are computed using the proposed vector distance model and a three-dimensional obstacle map. If there were no error in the state, the measured and estimated distances would be equal; in other words, the state error can be corrected by comparing them. This study therefore employs an extended Kalman filter to correct the INS error, where the derived observation model is linearized with respect to each element of the state vector.
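Stacking several range residuals into one measurement vector yields a standard EKF update. The sketch below shows only that generic update step under an assumed independent, equal-variance beam noise model; the names (`ekf_range_update`, `sigma_r`) and the toy Jacobian are illustrative, not the paper's formulation:

```python
import numpy as np

def ekf_range_update(x, P, residuals, H, sigma_r):
    """Generic EKF measurement update using a stacked vector of range
    residuals (measured minus predicted LOS distances).
    x: error-state estimate, P: covariance, H: stacked Jacobian rows,
    sigma_r: per-beam range noise standard deviation (assumed equal)."""
    R = (sigma_r ** 2) * np.eye(len(residuals))   # measurement noise
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ residuals
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: a 2-element state observed directly by two beams
x1, P1 = ekf_range_update(np.zeros(2), np.eye(2),
                          np.array([1.0, -1.0]), np.eye(2), sigma_r=0.1)
```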
For practical and economic reasons, the number of sensors is limited, and one common consequence of this limitation is blind spots between the sensors. If obstacles are present in these spots, the sensors will not detect them, which can cause navigation failures. To solve this problem, a rotating flight mode is developed in this study. With the range sensors fixed on the vehicle, the rotating motion makes them continuously scan the surrounding area like a two-dimensional scanning range sensor (Lidar). Since the vehicle in this study is a conventional multirotor, it can rotate freely about the body-frame down axis during flight. The controller for this motion is developed using a quaternion-based geometric method and verified in simulation. This flight mode is named 'cyclic heading', and it could help realize a 'flying long-range Lidar' at a reasonable price.
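A constant-rate spin about the body down axis can be expressed as a reference quaternion for the attitude controller. This is a simplified sketch assuming level roll and pitch; the function name and the choice of scalar-first quaternion convention are illustrative assumptions, not the paper's controller:

```python
import numpy as np

def cyclic_heading_quat(t, yaw_rate):
    """Reference attitude quaternion [w, x, y, z] for a constant-rate
    rotation about the body down (z) axis at time t, with roll and
    pitch held at zero. Sketch only; the actual geometric controller
    in the study tracks this kind of reference on SO(3)."""
    psi = yaw_rate * t                       # accumulated heading angle
    return np.array([np.cos(psi / 2), 0.0, 0.0, np.sin(psi / 2)])

# After one full revolution (2*pi s at 1 rad/s) the attitude returns
# to the start, up to the usual quaternion sign ambiguity.
q = cyclic_heading_quat(2 * np.pi, 1.0)
```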
To verify this method, this study also includes the results of a flight simulation and indoor experiments. The flight simulation consists of five steps.

The first step is guidance. The guidance algorithm provides the target position along predefined flight paths.

The second step is the controller, which consists of a position controller, an attitude controller, and an inverse dynamic model. The position controller compares the target position from the guidance with the estimated position to compute the required total force. The attitude controller, developed for the cyclic heading flight, calculates the moment needed to control the attitude that determines the direction of the total force. The inverse dynamic model converts the required total force and moment into actuator inputs, so the vehicle is driven by these actuator forces.

The third step is the dynamic model, which computes the motion produced by the actuator forces. At this step, all forces and moments from the actuators and unintentional disturbances such as wind are summed into a total force and total moment, which are applied to the vehicle dynamic model through the Newton-Euler equations. The translational and rotational accelerations from these equations are used to propagate the 'actual motion' of the vehicle and to generate sensor outputs such as IMU and range sensor data.

The fourth step is the sensor data generator. The sensor outputs depend on the translational and rotational motion above. The IMU output is computed from the acceleration of the dynamic model and the specific force due to the earth's gravity; since the IMU is a strap-down MEMS type, the acceleration is converted to the body frame. The IMU error is generated considering common specifications such as random walk and bias instability, and the generated error is evaluated by Allan variance. The range sensor error is assumed to be white Gaussian.
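The IMU error generation described above can be sketched with the two standard noise terms. Here bias instability is approximated by a random-walk bias, a common stand-in in simulation; the function name and the numeric coefficients are illustrative assumptions:

```python
import numpy as np

def simulate_gyro_noise(n, dt, arw, bias_rw, seed=0):
    """Generate n gyro error samples at step dt from two terms:
    - angle random walk: white noise with density arw [rad/s/sqrt(Hz)]
    - a random-walk bias driven by bias_rw [rad/s^2/sqrt(Hz)], used
      here as a simple stand-in for bias instability.
    An Allan-variance plot of the output would show the familiar
    -1/2 slope (ARW) and +1/2 slope (rate random walk) regions."""
    rng = np.random.default_rng(seed)
    white = (arw / np.sqrt(dt)) * rng.standard_normal(n)
    bias = np.cumsum(bias_rw * np.sqrt(dt) * rng.standard_normal(n))
    return white + bias

noise = simulate_gyro_noise(10000, 0.01, arw=1e-3, bias_rw=1e-5)
```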
Since the range error variance depends on the measured distance, this dependence is applied in the error generation. Additionally, GNSS error is generated using actual measured GNSS data.
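A distance-dependent range error can be modeled by letting the noise standard deviation grow with the true distance. The linear growth model and the coefficients below are illustrative assumptions, not the paper's fitted error model:

```python
import numpy as np

def range_measurement(true_dist, sigma0=0.02, k=0.01, rng=None):
    """Simulated range measurement with white Gaussian noise whose
    standard deviation grows linearly with distance:
    sigma = sigma0 + k * d (assumed model, illustrative values)."""
    if rng is None:
        rng = np.random.default_rng(0)
    sigma = sigma0 + k * true_dist
    return true_dist + sigma * rng.standard_normal()

m = range_measurement(10.0)   # noisy reading near 10 m
```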
The last step of the simulation is the navigation step, where the proposed navigation algorithm is implemented. The coarse alignment of the INS proceeds for the first 10 seconds, after which the integration algorithm is executed. To evaluate the performance of the algorithm, the true motion from the dynamic model is compared with the navigation result.
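Coarse alignment over a static interval typically levels the platform from the averaged accelerometer vector. The sketch below shows that standard leveling step for a z-down (NED) body frame; the exact alignment procedure in the study may differ:

```python
import numpy as np

def coarse_align(accel_mean):
    """Coarse leveling from the time-averaged static accelerometer
    vector (specific force, body frame, z-axis down). Returns roll
    and pitch in radians; heading needs another sensor or gyrocompassing."""
    ax, ay, az = accel_mean
    roll = np.arctan2(-ay, -az)
    pitch = np.arctan2(ax, np.sqrt(ay**2 + az**2))
    return roll, pitch

# A level, static IMU with z down reads approximately (0, 0, -g):
roll, pitch = coarse_align((0.0, 0.0, -9.81))
```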
The indoor experiment was conducted with a small cart carrying the onboard system. To evaluate the performance of the proposed algorithm, a low-cost 6-axis MEMS IMU was used, and a small single-axis scanning LiDAR served as the multiple range sensors. This sensor can provide up to 1081 horizontal range measurements per rotation at 40 rotations per second. For the experimental results, however, only 7 or 28 range measurements were used, and the update rate was reduced to 10 Hz for practical applications. Even with these limitations, the test results demonstrated that low-cost range sensors with an onboard microprocessor can provide comparable performance for indoor UAV operation.
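Reducing a full 1081-point scan to 7 or 28 beams amounts to subsampling the scan. A minimal sketch of one plausible way to do this, assuming evenly spaced beam selection (the paper does not state how the beams were chosen):

```python
import numpy as np

def select_beams(scan, n_beams):
    """Pick n_beams evenly spaced samples from a full LiDAR scan
    (e.g. 1081 points), emulating a reduced 7- or 28-beam setup.
    Even spacing is an assumption for illustration."""
    idx = np.linspace(0, len(scan) - 1, n_beams).round().astype(int)
    return scan[idx]

scan = np.arange(1081, dtype=float)   # stand-in for one full scan
sub = select_beams(scan, 7)
```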


