Paper Title

DynamicSLAM: Leveraging Human Anchors for Ubiquitous Low-Overhead Indoor Localization

Paper Authors

Ahmed Shokry, Moustafa Elhamshary, Moustafa Youssef

Paper Abstract

We present DynamicSLAM: an indoor localization technique that eliminates the need for the daunting calibration step. DynamicSLAM is a novel Simultaneous Localization And Mapping (SLAM) framework that iteratively acquires the feature map of the environment while simultaneously localizing users relative to this map. Specifically, we employ the phone's inertial sensors to keep track of the user's path. To compensate for the error accumulation due to the low-cost inertial sensors, DynamicSLAM leverages unique points in the environment (anchors) as observations to reduce the estimated location error. DynamicSLAM introduces the novel concept of mobile human anchors, based on encounters with other users in the environment, which significantly increases the number and ubiquity of anchors and boosts localization accuracy. We present different encounter models and show how they are incorporated into a unified probabilistic framework to reduce the ambiguity in the user's location. Furthermore, we present a theoretical proof of system convergence and of the human anchors' ability to reset the accumulated error. Evaluation of DynamicSLAM using different Android phones shows that it provides a median localization accuracy of 1.1m. This accuracy outperforms state-of-the-art techniques by 55%, highlighting DynamicSLAM's promise for ubiquitous indoor localization.
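The error-reset idea in the abstract can be illustrated with a minimal one-dimensional sketch: dead-reckoned uncertainty grows with each inertial step, and a single anchor observation fused Kalman-style collapses it. All function names and noise values here are hypothetical illustrations, not the paper's actual SLAM formulation.

```python
def dead_reckon_step(x, var, step, step_noise_var=0.04):
    """Propagate position by an inertial step; uncertainty accumulates."""
    return x + step, var + step_noise_var

def anchor_update(x, var, anchor_pos, anchor_var=0.25):
    """Fuse an anchor observation (Kalman-style update); uncertainty shrinks."""
    k = var / (var + anchor_var)                 # Kalman gain
    return x + k * (anchor_pos - x), (1.0 - k) * var

# Walk 50 steps of ~1 m: variance grows linearly with the path length.
x, var = 0.0, 0.0
for _ in range(50):
    x, var = dead_reckon_step(x, var, 1.0)
var_before = var

# One encounter with an anchor at a known position resets most of the error.
x, var = anchor_update(x, var, anchor_pos=50.0)
assert var < var_before
```

In the paper's setting the anchors can themselves be other users (mobile human anchors), which is why encounters raise the number of correction opportunities well beyond fixed environmental landmarks.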
