Abstract
The term “Augmented Reality” covers a wide range of applications, from the overlay of virtual graphics on a real scene using a head-mounted display in areas such as industrial maintenance, through to the insertion of real-time virtual graphics in TV programmes. Fundamental to all these applications is the need to accurately track the motion of the camera, so that the graphics can be rendered to appear rigidly locked to the real world. To overcome the limitations of existing tracking systems, the MATRIS project has developed a real-time system for measuring the movement of a camera, which uses image analysis to track naturally occurring features in the scene, combined with data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system, provides the context for the other articles in this journal and presents some results.
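The combination of vision and inertial data described above can be illustrated with a minimal sketch. This is not the MATRIS algorithm (which the paper and its companion articles describe in full); it is a generic complementary-filter example, with all function names and parameter values chosen here for illustration, showing why fusing a fast-but-drifting gyroscope with slower, drift-free vision-based measurements is attractive.

```python
# Illustrative sketch only: a complementary filter blending gyroscope
# integration (fast, but accumulates drift) with absolute vision-based
# orientation fixes (drift-free, but not available every frame).
# Names and values are hypothetical, not taken from the MATRIS system.

def fuse_orientation(gyro_rates, vision_yaws, dt=0.01, alpha=0.98):
    """Estimate yaw by integrating gyro rates and correcting with vision.

    gyro_rates  : angular-velocity samples (rad/s), one per time step
    vision_yaws : absolute yaw measurements (rad), or None for steps
                  where no vision fix is available
    alpha       : weight given to the inertial prediction (0 < alpha < 1)
    """
    yaw = 0.0
    estimates = []
    for rate, vis in zip(gyro_rates, vision_yaws):
        yaw += rate * dt                           # inertial prediction (drifts)
        if vis is not None:                        # vision fix available?
            yaw = alpha * yaw + (1 - alpha) * vis  # pull estimate toward fix
        estimates.append(yaw)
    return estimates
```

With vision fixes absent, the estimate is pure gyro integration and drifts with any rate bias; whenever a fix arrives, the estimate is pulled back toward the absolute measurement, so long-term drift is bounded while short-term motion remains responsive.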
Acknowledgments
The MATRIS project was supported by the Information Society Technologies area of the EU’s 6th Framework programme (Project no. IST-002013).
Cite this article
Chandaria, J., Thomas, G.A. & Stricker, D. The MATRIS project: real-time markerless camera tracking for Augmented Reality and broadcast applications. J Real-Time Image Proc 2, 69–79 (2007). https://doi.org/10.1007/s11554-007-0043-z