
iTracker: Towards Sustained Self-Tracking in Dynamic Feature Environment with Smartphones


Abstract:

Real-time self-tracking at six degrees of freedom is essential in many emerging applications such as VR/AR/MR simulation and indoor navigation. With the development of built-in sensors in smartphones, many self-tracking solutions have appeared. Many researchers try to combine vision-based approaches with an Inertial Measurement Unit (IMU) to realize self-tracking on smartphones. After testing these approaches, however, we find that tracking is lost in four common scenarios: 1) when the IMU rotates fast or for a long period of time, orientation tracking suffers serious delays; 2) scenes where background features are not distinct enough; 3) when the smartphone moves fast, image features differ greatly between successive frames; 4) unstructured scenes where background features are not static. To address these issues, we propose iTracker, which uses a Real-time Step-Length Adaption Algorithm to handle scenario (1) and a Parallel-Multi-State Local Recovery method to handle scenarios (2)-(4). Extensive experiments show that iTracker achieves robust and accurate self-tracking in all four scenarios, with an error of 0.7% over the whole trajectory.
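
To make scenario (1) concrete, the sketch below shows a generic adaptive-step gyroscope-integration loop: when the angular rate is high, the integration interval is split into shorter sub-steps so that orientation updates keep up with fast rotation. All function names, thresholds, and sampling rates here are illustrative assumptions; the abstract does not describe the internals of iTracker's Real-time Step-Length Adaption Algorithm, and this is not that algorithm.

# Illustrative sketch only: a simple gyroscope-integration loop whose
# step length shrinks when the angular rate is high. This is an assumed,
# generic adaptive-step scheme for exposition, NOT the paper's
# Real-time Step-Length Adaption Algorithm. Thresholds and names are
# hypothetical.

import math

def quat_multiply(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

def integrate_gyro(q, omega, dt, max_step_rad=0.02):
    """Advance orientation quaternion q by angular velocity omega (rad/s)
    over dt seconds, splitting the interval into sub-steps so that no
    single sub-step rotates by more than max_step_rad radians."""
    rate = math.sqrt(sum(w * w for w in omega))
    # Adaptive step length: fast rotation -> more, shorter sub-steps.
    n_steps = max(1, int(math.ceil(rate * dt / max_step_rad)))
    sub_dt = dt / n_steps
    for _ in range(n_steps):
        angle = rate * sub_dt
        axis = tuple(w / rate for w in omega) if rate > 1e-9 else (1.0, 0.0, 0.0)
        half = 0.5 * angle
        dq = (math.cos(half),
              axis[0] * math.sin(half),
              axis[1] * math.sin(half),
              axis[2] * math.sin(half))
        q = quat_multiply(q, dq)
    # Renormalize to counter accumulated numerical drift.
    norm = math.sqrt(sum(c * c for c in q))
    return tuple(c / norm for c in q)

# Example: a fast 200 deg/s roll sampled at 50 Hz for one second.
q = (1.0, 0.0, 0.0, 0.0)
omega = (math.radians(200.0), 0.0, 0.0)
for _ in range(50):
    q = integrate_gyro(q, omega, dt=0.02)
print(q)  # quaternion for roughly 200 degrees of net roll

In this toy setup the sub-stepping mainly bounds per-step rotation; with a time-varying angular rate, shorter steps during fast rotation also reduce integration error, which is the intuition behind adapting step length to motion dynamics.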
Date of Conference: 10-13 June 2019
Date Added to IEEE Xplore: 05 September 2019
Conference Location: Boston, MA, USA
