
Real-Time Camera Tracking for Mobile Devices: The VisiTrack System


Abstract

This paper presents VisiTrack, a novel approach to video-based incremental tracking in real time. The major objective in the development of VisiTrack was to design or select algorithms that are well suited to embedded real-time computation. Special attention was paid to latency reduction and storage minimization, since the algorithms must run in real time on mobile devices such as PDAs equipped with the appropriate extension, mainly a camera. The image analysis, camera localization, and feature position approximation of VisiTrack are explained in detail. The CV-SDF model, an extension of Synchronous Dataflow graphs (SDF) that supports the principles of linear processing and fine-grained pipelining, was defined and applied in the design of all VisiTrack modules in order to meet real-time constraints and reduce system latency. Furthermore, the camera localization and position approximation include mechanisms for minimizing errors that may arise, for instance, from measurement inaccuracies. Current applications of VisiTrack in augmented reality and robotic self-localization demonstrate its good performance; VisiTrack is, however, not limited to these application domains.
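
To make the dataflow idea behind CV-SDF concrete, the sketch below shows a plain SDF-style pipeline in C++. It is not taken from the paper; the actor names, token rates, and three-stage chain (capture, edge extraction, pose update) are illustrative assumptions. It only demonstrates the underlying SDF principle: each actor consumes and produces a fixed number of tokens per firing, so a static schedule can be derived offline, and streaming image slices instead of whole frames approximates the fine-grained pipelining that reduces latency.

```cpp
// Minimal sketch of a synchronous-dataflow (SDF) style pipeline, assuming a
// hypothetical three-stage tracking chain (capture -> edge extraction ->
// pose update). Token rates are fixed per firing, so a static schedule can
// be computed offline; fine-grained pipelining is mimicked by streaming
// image slices (tokens) instead of whole frames. This is NOT the authors'
// CV-SDF implementation, only an illustration of the underlying SDF idea.
#include <cstdio>
#include <queue>
#include <vector>

using Token = std::vector<int>;              // one token = one image slice (toy data)

struct Channel { std::queue<Token> fifo; };  // bounded in a real embedded system

// Each actor consumes/produces a fixed number of tokens per firing (SDF rule).
struct CaptureActor {                        // produces 4 slices per firing (one frame)
    void fire(Channel& out) {
        for (int s = 0; s < 4; ++s) out.fifo.push(Token(8, s));
    }
};
struct EdgeActor {                           // consumes 1 slice, produces 1 slice
    void fire(Channel& in, Channel& out) {
        Token slice = in.fifo.front(); in.fifo.pop();
        for (int& px : slice) px = (px > 2) ? 1 : 0;   // toy "edge" threshold
        out.fifo.push(slice);
    }
};
struct TrackActor {                          // consumes 4 slices, emits one pose update
    void fire(Channel& in) {
        int features = 0;
        for (int s = 0; s < 4; ++s) {
            Token slice = in.fifo.front(); in.fifo.pop();
            for (int px : slice) features += px;
        }
        std::printf("pose update from %d edge pixels\n", features);
    }
};

int main() {
    CaptureActor cap; EdgeActor edge; TrackActor track;
    Channel c0, c1;
    // Static schedule derived from the fixed rates: fire capture once,
    // edge four times, track once per frame. Because the edge actor fires
    // per slice, downstream work starts before the whole frame is captured.
    for (int frame = 0; frame < 3; ++frame) {
        cap.fire(c0);
        for (int s = 0; s < 4; ++s) edge.fire(c0, c1);
        track.fire(c1);
    }
    return 0;
}
```

In CV-SDF itself, scheduling and buffering are handled by the model rather than by hand-written loops as above; the point of the sketch is only that fixed per-firing rates are what make static, low-latency schedules possible.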



Author information


Corresponding author

Correspondence to Dirk Stichling.


About this article

Cite this article

Stichling, D., Esau, N., Kleinjohann, B. et al. Real-Time Camera Tracking for Mobile Devices: The VisiTrack System. Real-Time Syst 32, 279–305 (2006). https://doi.org/10.1007/s11241-005-4684-3
