Smartphone-Based Pedestrian NLOS Positioning Based on Acoustics and IMU Parameter Estimation

Hucheng Wang, Can Xue, Zhi Wang*, Lei Zhang, Xiaonan Luo, Xinheng Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)


This paper proposes an integrated positioning algorithm for mobile devices that achieves long-term, high-accuracy indoor pedestrian tracking under severe non-line-of-sight (NLOS) conditions. Traditional fusion methods for hand-held devices lack zero-velocity correction and therefore cannot clear the accumulated error of Pedestrian Dead Reckoning (PDR); in addition, PDR algorithms require personal user data to reach high positioning accuracy. We therefore propose a customized model that couples acoustic ranging with PDR through self-updating parameters, using two novel fusion strategies: the Kalman Filter with Least Squares (KFLS) and the Kalman Filter with Bayesian Parameter Estimation (KFBPE), which rely on numerical feedback and Bayesian distributions, respectively. Experiments with a Huawei Mate 9 show that both methods effectively eliminate the outliers caused by severe signal loss, regardless of hand-holding gesture and without requiring any personal data. Extensive experimental results demonstrate that the proposed methods handle NLOS more efficiently and perform much better than baselines built on traditional fusion frameworks such as the standard Kalman Filter. KFBPE yields a smoother track, maintaining an average positioning accuracy of 25 cm even when nearly thirty percent of the acoustic signal is lost to NLOS.
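The fusion idea in the abstract can be illustrated with a minimal sketch: a Kalman filter that dead-reckons forward with PDR step displacements and corrects with acoustic position fixes whenever they are available, simply skipping the update during NLOS signal loss. This is an illustrative toy, not the paper's KFLS/KFBPE algorithm; the function name, noise parameters, and identity measurement model are assumptions made for the example.

```python
import numpy as np

def kf_fuse(pdr_steps, acoustic_fixes, q=0.05, r=0.10):
    """Fuse PDR displacement predictions with intermittent acoustic fixes.

    pdr_steps      : iterable of (dx, dy) displacements from step detection
    acoustic_fixes : iterable of (x, y) fixes, or None under NLOS
    q, r           : assumed process / measurement noise std-devs (metres)
    """
    x = np.zeros(2)             # state: 2-D position
    P = np.eye(2)               # state covariance
    Q = (q ** 2) * np.eye(2)    # PDR process noise
    R = (r ** 2) * np.eye(2)    # acoustic measurement noise
    track = []
    for step, fix in zip(pdr_steps, acoustic_fixes):
        # Predict: dead-reckon forward by the PDR displacement.
        x = x + np.asarray(step, dtype=float)
        P = P + Q
        # Update: only when an acoustic fix is available (LOS).
        if fix is not None:
            z = np.asarray(fix, dtype=float)
            K = P @ np.linalg.inv(P + R)     # Kalman gain (H = I)
            x = x + K @ (z - x)
            P = (np.eye(2) - K) @ P
        track.append(x.copy())
    return np.array(track)

# Toy walk: five 1 m steps east; the third acoustic fix is lost (NLOS).
steps = [(1.0, 0.0)] * 5
fixes = [(1.0, 0.0), (2.0, 0.0), None, (4.0, 0.0), (5.0, 0.0)]
track = kf_fuse(steps, fixes)
```

During the NLOS gap the covariance grows and the estimate coasts on PDR alone; the next acoustic fix then pulls the estimate back, which is the behaviour the paper's strategies refine with self-updating step parameters.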

Original language: English
Pages (from-to): 23095-23108
Number of pages: 14
Journal: IEEE Sensors Journal
Issue number: 23
Publication status: Published - 1 Dec 2022


  • Inertial measurement unit
  • non-line-of-sight
  • signal loss
  • step length

