Performance Evaluation of 5G Positioning Methods Based on RTT
Jialun Li, Bo Zhang, Chao Sun, Yanhong Kou, Ying Xu, Jiayi Li, Lu Bai, Xin Wen, Yingzhe He, Beihang University
As location services become increasingly important in modern society, the maturation and spread of 5G positioning technology have brought unprecedented opportunities to the positioning field. 5G not only inherits the advantages of traditional wireless positioning technology, but also markedly improves positioning accuracy by introducing new techniques such as optimized time measurement, angle measurement, and hybrid measurement. High-precision 5G positioning has driven innovation in location-based services and greatly expanded their application scope, providing more accurate and reliable location information to a wide range of industries and showing great potential in application scenarios such as smart cities, autonomous driving, and the Internet of Things.
A variety of innovative methods have emerged in 5G positioning research, aiming to improve positioning accuracy while also addressing system performance evaluation and computational efficiency. Giuseppe et al. derived the Cramér-Rao lower bound for position and attitude-angle estimation under a millimeter-wave signal model, and evaluated position and attitude estimation with millimeter-wave MIMO antennas in 5G systems. Caffery et al. studied wireless location in code-division multiple-access networks, proposing techniques for measuring location parameters and algorithms for computing position from them, and analyzed the factors that degrade location accuracy, including multipath propagation, non-line-of-sight (NLOS) propagation, and multi-user interference. Foy proposed a position-location solution based on Taylor-series estimation, using a Taylor-series expansion to linearize the nonlinear measurement equations and achieving high accuracy at low computational cost. L. Bai et al. improved the traditional single-epoch Taylor weighted least-squares method and extended it to a TOA-AOD hybrid positioning scheme; on this basis, they proposed a multi-epoch joint positioning and synchronization method that uses a time-dependent model to reduce the state variables related to the base-station clock offset. C. Sun et al. proposed a 5G-GNSS hybrid positioning scheme based on AOA-TOA measurements, combining AOA estimates from 5G base stations with TOA measurements from GNSS satellites to mitigate the shortage of visible satellites in urban GNSS positioning. Chen et al. proposed a beam-training-based AOD estimation algorithm that localizes the User Equipment (UE) more accurately than least-squares estimation for FD positioning. C. Sun et al. also proposed a multi-rate adaptive Kalman filter that fuses GNSS and 5G measurements arriving at different data rates, together with an active measurement-uncertainty prediction algorithm that adaptively adjusts the observation-noise covariance matrix.
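The Taylor-series least-squares idea referenced above linearizes the nonlinear range equations around a working point and iterates to a fix. The following is a minimal illustrative sketch (function name, 2-D geometry, and the stopping threshold are choices made here, not taken from any of the cited papers):

```python
import numpy as np

def taylor_wls_position(anchors, ranges, x0, iters=10):
    """Iterative Taylor-series (Gauss-Newton) least-squares position fix.

    anchors : (N, 2) known base-station coordinates (m)
    ranges  : (N,)   measured distances to each base station (m)
    x0      : (2,)   initial position guess (m)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - anchors                    # vectors from anchors to guess
        pred = np.linalg.norm(diff, axis=1)   # predicted ranges at the guess
        H = diff / pred[:, None]              # Jacobian of range w.r.t. position
        dz = ranges - pred                    # linearized measurement residuals
        dx, *_ = np.linalg.lstsq(H, dz, rcond=None)
        x = x + dx                            # update the linearization point
        if np.linalg.norm(dx) < 1e-6:
            break
    return x
```

A weighted variant simply scales `H` and `dz` by the inverse measurement standard deviations before the least-squares step.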
Although RTT-based 5G positioning has seen preliminary exploration, current research often assumes an overly idealized 5G RTT measurement error in simulation. In practice, factors such as timing errors and NLOS errors make the error distribution of 5G RTT observations complex, so existing research results deviate somewhat from practical applications.
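For context, an RTT observation is formed from one downlink/uplink signal exchange; because the round trip is differenced against the receiver's turnaround time, the clock offset between the two ends cancels. A minimal sketch of the range computation (timestamp naming is ours; the exact reported quantities are the gNB and UE Rx-Tx time differences defined in the 3GPP specifications):

```python
C = 299_792_458.0  # speed of light (m/s)

def rtt_range(t1, t2, t3, t4):
    """One-way range from a single PRS/SRS round trip.

    t1: gNB transmits DL-PRS     t2: UE receives it
    t3: UE transmits UL-SRS      t4: gNB receives it
    (t4 - t1) is the gNB Rx-Tx difference, (t3 - t2) the UE Rx-Tx
    difference; subtracting them cancels the gNB-UE clock offset,
    leaving twice the propagation time.
    """
    rtt = (t4 - t1) - (t3 - t2)
    return C * rtt / 2.0
```

Any residual error in the reported Rx-Tx differences (group delay, timing granularity) maps directly into the range at roughly 0.3 m per nanosecond, which is why the timing-error model matters.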
Therefore, this study analyzes and models the various positioning errors from a theoretical perspective, investigates the error-distribution characteristics of 5G RTT measurements in real-world environments, and adopts the baseline parameters specified for 5G in the 3GPP protocol. On top of an RTT positioning implementation, the error model is added to the positioning simulation to test and analyze how different errors affect the RTT positioning method. By systematically analyzing 5G positioning errors, this study aims to provide a more accurate model of the timing errors in 5G RTT positioning, helping to clarify the sources and characteristics of these errors and to evaluate how different environmental factors affect positioning accuracy. The novelty and technical contributions are as follows:
• Research on 5G positioning technology based on PRS and SRS, focusing on the reference signals, observables, physical-layer link architecture, and higher-layer positioning procedures, including the resources, design, and configuration of the reference signals.
• Research on the principles and technologies of wireless location in cellular networks, focusing on time-based measurement methods such as DL-TDOA and Multi-RTT, with simulation analysis of the reference signals used for multi-base-station location in 5G environments.
• Analysis and modeling of the errors in the 5G RTT positioning method. The factors that affect RTT positioning accuracy are analyzed theoretically and expressed mathematically, with emphasis on the timing errors introduced by the communication protocol and their impact.
• Simulation and validation of 5G positioning methods. Typical application scenarios are set up, timing errors are added to the channel, and user-position estimation performance is evaluated. The impact of the same error factor on position estimation is compared across positioning methods, and the impacts of different error factors are compared under the Multi-RTT positioning method.
Preliminary results show that, both with and without added timing errors of varying standard deviations, the Multi-RTT method achieves significantly better positioning accuracy than DL-TDOA. After correction with the timing error group, the RTT timing error approximately follows a Gaussian distribution N(0.45, 5.188). The signal-to-noise ratio (SNR) and the average base-station-to-user distance were varied to explore the factors affecting positioning accuracy. The results show that increasing the SNR effectively reduces the packet-loss rate within a certain range: without timing errors, the average positioning accuracy at an SNR of 35 dB is 46.5% better than at 15 dB. As the timing error grows, however, the benefit of a higher SNR becomes increasingly limited. Without added error, increasing the distance significantly raises the root-mean-square errors of both ranging and positioning; the average positioning accuracy at 150 m is about 11.1% worse than at 50 m. With timing error added, increasing the distance has no significant effect on positioning accuracy. These results indicate that the model proposed in this study accurately reflects the error-distribution characteristics of 5G RTT observations.
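The way such a Gaussian timing-error model feeds a positioning evaluation can be sketched with a small Monte-Carlo simulation. Everything below is illustrative: the function name, geometry, and initial guess are ours, and the N(0.45, 5.188) parameters are applied here as a range-domain error in metres with 5.188 taken as the variance, which the text does not specify.

```python
import numpy as np

def simulate_multi_rtt(anchors, true_pos, mu, sigma, trials=1000, seed=0):
    """Monte-Carlo positioning RMSE under a Gaussian ranging-error model.

    mu, sigma : mean / standard deviation of the range-domain error (m)
    Each trial perturbs the true ranges, then solves for position with a
    few Gauss-Newton least-squares iterations.
    """
    rng = np.random.default_rng(seed)
    true_r = np.linalg.norm(true_pos - anchors, axis=1)
    errs = []
    for _ in range(trials):
        r = true_r + rng.normal(mu, sigma, size=len(anchors))
        x = true_pos + 10.0            # crude initial guess near the truth
        for _ in range(15):            # Gauss-Newton iterations
            diff = x - anchors
            pred = np.linalg.norm(diff, axis=1)
            H = diff / pred[:, None]   # range Jacobian
            dx, *_ = np.linalg.lstsq(H, r - pred, rcond=None)
            x = x + dx
        errs.append(np.linalg.norm(x - true_pos))
    return float(np.sqrt(np.mean(np.square(errs))))
```

Sweeping `sigma` (or the anchor geometry) in such a loop reproduces the kind of SNR-versus-distance trade study described above.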