VR/AR Input Devices and Tracking

Abstract

How do Virtual Reality (VR) and Augmented Reality (AR) systems recognize the actions of users? How does a VR or AR system know where the user is? How can a system track moving objects? Which input devices for VR and AR have proven effective at increasing immersion in virtual or augmented worlds? What are the technical possibilities and limitations? Building on fundamentals that explain terms such as degrees of freedom, accuracy, update rate, latency, and calibration, this chapter examines methods used for the continuous tracking and monitoring of objects. Frequently used input devices are presented and discussed. Finally, special methods such as finger tracking and eye tracking are illustrated with examples.
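Two of the fundamentals named in the abstract, degrees of freedom and the latency/jitter trade-off in tracking, can be sketched in a few lines of code. The `Pose` class and `smooth` function below are illustrative assumptions, not code from the chapter: a 6-DoF pose (three translational and three rotational degrees of freedom) and a simple exponential smoothing filter of the kind often applied to noisy tracker measurements.

```python
from dataclasses import dataclass, astuple

@dataclass
class Pose:
    """A 6-DoF pose: 3 translational + 3 rotational degrees of freedom."""
    x: float = 0.0      # position (e.g. metres)
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0    # orientation (radians); real systems often use
    pitch: float = 0.0  # quaternions to avoid angle-wrap artifacts
    roll: float = 0.0

def smooth(prev: Pose, measured: Pose, alpha: float = 0.5) -> Pose:
    """Exponentially smooth a noisy tracker measurement.

    A larger alpha follows the sensor more closely (lower added latency,
    more jitter); a smaller alpha suppresses jitter at the cost of lag.
    """
    return Pose(*((1 - alpha) * p + alpha * m
                  for p, m in zip(astuple(prev), astuple(measured))))

# With alpha = 0.5 the filtered pose moves halfway toward each measurement:
filtered = smooth(Pose(), Pose(x=2.0), alpha=0.5)  # filtered.x == 1.0
```

Note that naively blending Euler angles is only valid far from the ±π wrap-around; production trackers typically filter orientation as quaternions, and more sophisticated systems use Kalman or particle filters rather than fixed-gain smoothing.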

Dedicated website for additional material: vr-ar-book.org



Author information

Corresponding author

Correspondence to Paul Grimm.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Grimm, P., Broll, W., Herold, R., Hummel, J., Kruse, R. (2022). VR/AR Input Devices and Tracking. In: Doerner, R., Broll, W., Grimm, P., Jung, B. (eds) Virtual and Augmented Reality (VR/AR). Springer, Cham. https://doi.org/10.1007/978-3-030-79062-2_4

  • DOI: https://doi.org/10.1007/978-3-030-79062-2_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-79061-5

  • Online ISBN: 978-3-030-79062-2

  • eBook Packages: Computer Science (R0)
