
The MATRIS project: real-time markerless camera tracking for Augmented Reality and broadcast applications

  • Special Issue
  • Published in: Journal of Real-Time Image Processing

Abstract

The term “Augmented Reality” covers a wide range of applications, from overlaying virtual graphics on a real scene via a head-mounted display, for use in areas such as industrial maintenance, to inserting real-time virtual graphics into TV programmes. Fundamental to all these applications is the ability to track the motion of the camera accurately, so that the graphics can be rendered to appear rigidly locked to the real world. To overcome the limitations of existing tracking systems, the MATRIS project has developed a real-time system for measuring the movement of a camera that combines image analysis, tracking naturally occurring features in the scene, with data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system, provides the context for the other articles in this issue, and presents some results.
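The core idea of combining a drifting inertial prediction with drift-free but noisier vision measurements can be illustrated with a minimal complementary-filter sketch. This is an illustration only, not the project's actual algorithm (the MATRIS papers describe a more sophisticated sensor-fusion filter); the function name, parameters, and the single-axis simplification are assumptions for demonstration.

```python
def fuse_orientation(gyro_rates, vision_angles, dt, alpha=0.98):
    """Blend a drifting inertial prediction with drift-free (but noisy,
    lower-rate) vision measurements for one rotation axis.

    gyro_rates    -- angular rate samples (rad/s) from the inertial sensor
    vision_angles -- absolute angle estimates (rad) from image analysis
    dt            -- sample period (s)
    alpha         -- weight on the inertial prediction (close to 1)
    """
    angle = vision_angles[0]  # initialise from the first vision fix
    estimates = []
    for rate, vis in zip(gyro_rates, vision_angles):
        predicted = angle + rate * dt                  # inertial prediction (drifts)
        angle = alpha * predicted + (1 - alpha) * vis  # vision correction
        estimates.append(angle)
    return estimates


# A stationary camera seen through a gyro with a 0.1 rad/s bias: pure
# integration would drift by 0.1 rad over one second, while the fused
# estimate stays bounded near the true angle of 0.5 rad.
fused = fuse_orientation([0.1] * 100, [0.5] * 100, dt=0.01)
```

The high-rate inertial data smooths the estimate between vision updates, while the vision term continually pulls the estimate back towards an absolute reference, bounding the gyro drift.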




Acknowledgments

The MATRIS project was supported by the Information Society Technologies area of the EU’s 6th Framework Programme (project no. IST-002013).


Corresponding author

Correspondence to Jigna Chandaria.


Cite this article

Chandaria, J., Thomas, G.A. & Stricker, D. The MATRIS project: real-time markerless camera tracking for Augmented Reality and broadcast applications. J Real-Time Image Proc 2, 69–79 (2007). https://doi.org/10.1007/s11554-007-0043-z
